The IUP Journal of Computational Mathematics
Solution of Multicollinearity by Generalized Inverse Regression
This paper characterizes Generalized Inverse Regression (GIR) solely as a remedy for multicollinearity. The Generalized Inverse (GI) estimator is a better alternative to the Ordinary Least Squares (OLS) estimator in the case of ill-conditioning. After discussing the Moore-Penrose and Rao generalized inverses, the paper shows how these could lead to a unified theory of least squares estimation when the design matrix is of less than full column rank. The GI estimator is biased, and there exists a trade-off between bias and variance, as in the case of the ridge estimator. The ridge estimator coincides with the OLS estimator when the biasing parameter is zero, and the GI estimator coincides with the OLS estimator when the rank of the design matrix equals the number of its columns.
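In standard notation (assumed here; the page itself gives no formulas), with X the n × p design matrix and y the response vector, the three estimators compared above are

    \hat{\beta}_{OLS} = (X'X)^{-1}X'y, \qquad
    \hat{\beta}_{ridge}(k) = (X'X + kI)^{-1}X'y, \qquad
    \hat{\beta}_{GI} = X^{+}y,

where X^{+} denotes the Moore-Penrose generalized inverse of X. Setting the biasing parameter k = 0 recovers OLS from the ridge estimator, and when rank(X) = p one has X^{+} = (X'X)^{-1}X', so the GI estimator likewise reduces to OLS.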

Multicollinearity is a possible problem in any study with several explanatory variables. It arises when the variables are highly correlated or tend to be highly correlated. Greene (2002) indicates that coefficients with wrong signs or implausible magnitudes signal a possible multicollinearity problem. Freund and Littell (2000) point out that a set of eigenvalues of relatively equal magnitude indicates that there is little multicollinearity. Whatever the cause, the Ordinary Least Squares (OLS) estimate cannot be obtained, because the design matrix is too ill-conditioned to invert.
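As an illustration of the eigenvalue diagnostic, a minimal Python/NumPy sketch on synthetic data (the variables and figures here are hypothetical, not from the paper):

    import numpy as np

    # Hypothetical design matrix: x3 is almost an exact linear combination
    # of x1 and x2, so the columns of X are nearly dependent.
    rng = np.random.default_rng(0)
    x1 = rng.normal(size=50)
    x2 = rng.normal(size=50)
    x3 = 2.0 * x1 - x2 + rng.normal(scale=1e-6, size=50)
    X = np.column_stack([x1, x2, x3])

    # Eigenvalues of X'X: values of roughly equal magnitude would indicate
    # little multicollinearity; a near-zero eigenvalue flags a severe case.
    eigvals = np.linalg.eigvalsh(X.T @ X)
    print("eigenvalues of X'X:", eigvals)
    print("largest/smallest eigenvalue:", eigvals.max() / eigvals.min())

A near-zero smallest eigenvalue, and hence a very large ratio of the largest to the smallest eigenvalue, is the numerical signature of the ill-conditioning described above.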

The traditional solution is to collect more data or to drop one or more variables. Collecting more data is often expensive or impracticable. Dropping one or more variables from the model to alleviate multicollinearity may introduce specification bias (Theil, 1971), so in certain situations the cure may be worse than the disease. The interest, then, is in squeezing the maximum information out of whatever data are at hand. This interest has motivated researchers to develop some very ingenious statistical methods, such as ridge regression, principal component regression, and Generalized Inverse Regression (GIR).

This paper discusses GIR solely as a solution to the problem of multicollinearity. Moore (1920) and Penrose (1955) proposed the Generalized Inverse (GI) of a matrix; because they gave essentially similar definitions, it is referred to as the Moore-Penrose GI. Rao (1962) introduced a GI that does not give a unique solution. The GI estimator is a better alternative to the OLS estimator in the case of ill-conditioning.
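A minimal sketch of the GI estimator on hypothetical rank-deficient data (Python/NumPy assumed; np.linalg.pinv computes the Moore-Penrose generalized inverse from the singular value decomposition):

    import numpy as np

    # Hypothetical rank-deficient design: the third column duplicates the
    # first, so X'X is singular and (X'X)^{-1} does not exist.
    rng = np.random.default_rng(1)
    Z = rng.normal(size=(30, 2))
    X = np.column_stack([Z, Z[:, 0]])    # rank(X) = 2 < 3 columns
    y = X @ np.array([1.0, 2.0, 0.0]) + rng.normal(scale=0.1, size=30)

    # GI estimator: beta = X^+ y, with X^+ the Moore-Penrose generalized inverse.
    beta_gi = np.linalg.pinv(X) @ y
    print("GI estimate:", beta_gi)

    # np.linalg.lstsq returns the same minimum-norm least squares solution.
    beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("lstsq estimate:", beta_ls)

On a rank-deficient X the GI estimate is the minimum-norm least squares solution; when X has full column rank, X^+ = (X'X)^{-1}X' and the GI estimate coincides with OLS.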
