Abstract
Multiple linear regression is a statistical technique used to analyse the relationship between a dependent variable and two or more independent (regressor) variables. The Ordinary Least Squares (OLS) method is commonly used to estimate the parameters. The most frequently occurring problem in multiple linear regression analysis is the presence of multicollinearity. Under multicollinearity, the least squares estimator has a large variance, so another method is needed to overcome it: ridge regression. In this method, a ridge constant κ is added to the diagonal of the X'X matrix. A previous study extended the method by using prior information on the parameter β and introduced the Restricted Ridge Regression method. Prior information on β is non-sample information arising from past experience and the opinions of experts in similar situations involving the same parameters β. This study explains the use of the Restricted Ridge Regression method to overcome multicollinearity in a regression model. In an application to a data set, the Restricted Ridge Regression estimate of β has a smaller mean square error (MSE) than the Ordinary Least Squares estimate.
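The ridge idea described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: it simulates a multicollinear design (the variable names, the choice of κ, and the simulated data are assumptions for demonstration only), then contrasts the OLS estimate (X'X)⁻¹X'y with the ridge estimate obtained by adding κ to the diagonal of X'X.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a multicollinear design: x2 is nearly a copy of x1,
# so X'X is close to singular and OLS estimates are unstable.
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)
X = np.column_stack([np.ones(n), x1, x2])
beta_true = np.array([1.0, 2.0, 3.0])
y = X @ beta_true + rng.normal(size=n)

# Ordinary Least Squares: solve (X'X) b = X'y.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge regression: add the ridge constant kappa to the diagonal
# of X'X before solving, i.e. b(kappa) = (X'X + kappa*I)^{-1} X'y.
kappa = 1.0  # illustrative value; in practice kappa is chosen from the data
p = X.shape[1]
beta_ridge = np.linalg.solve(X.T @ X + kappa * np.eye(p), X.T @ y)
```

The ridge solution always has an L2 norm no larger than the OLS solution, which is what stabilises the estimates under multicollinearity; the Restricted Ridge Regression method of the paper additionally incorporates the prior restriction on β, which is not shown here.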
Original language | English |
---|---|
Article number | 012021 |
Journal | Journal of Physics: Conference Series |
Volume | 1725 |
Issue number | 1 |
DOIs | |
Publication status | Published - 12 Jan 2021 |
Event | 2nd Basic and Applied Sciences Interdisciplinary Conference 2018 (BASIC 2018), Depok, Indonesia, 3 Aug 2018 → 4 Aug 2018 |
Keywords
- Multicollinearity
- Regressor
- Restricted ridge regression