
Comparisons among some estimators in misspecified linear models with multicollinearity

Published: Annals of the Institute of Statistical Mathematics

Abstract

In this paper we compare several estimators available in situations of multicollinearity (the r-k class estimator proposed by Baye and Parker, the ordinary ridge regression (ORR) estimator, the principal components regression (PCR) estimator, and the ordinary least squares (OLS) estimator) for a misspecified linear model, where the misspecification arises from the omission of some relevant explanatory variables. The comparisons are made in terms of the mean square error (mse) of the estimators of the regression coefficients as well as of the predictor of the conditional mean of the dependent variable. It is found that, under the same conditions as in the true model, the superiority of the r-k class estimator over the ORR, PCR and OLS estimators, and that of the ORR and PCR estimators over the OLS estimator, carry over to the misspecified model. Only in the comparison between the ORR and PCR estimators can no definite conclusion be drawn regarding the mse dominance of one over the other in the misspecified model.
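The four estimators compared in the abstract can all be written in one family: project onto the leading principal components of the design and apply ridge shrinkage to the retained components. The following is a minimal numerical sketch (not taken from the paper; the simulation design, parameter values, and function name are illustrative) in which the fitted model omits a relevant variable `z`, mimicking the misspecification studied here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative design with strong multicollinearity: the second column
# is nearly a copy of the first.
n = 100
x1 = rng.normal(size=n)
X = np.column_stack([x1, x1 + 0.01 * rng.normal(size=n), rng.normal(size=n)])
z = rng.normal(size=n)                 # relevant variable omitted from the fitted model
beta = np.array([1.0, 1.0, 0.5])
y = X @ beta + 0.7 * z + rng.normal(size=n)   # true model includes z

# SVD of the (misspecified) design that omits z
U, s, Vt = np.linalg.svd(X, full_matrices=False)

def rk_class(y, k=0.0, r=None):
    """r-k class estimator in the sense of Baye and Parker: keep the r
    leading principal components and apply ridge shrinkage k to them.
    Special cases: k=0, r=p -> OLS; k>0, r=p -> ORR; k=0, r<p -> PCR."""
    if r is None:
        r = len(s)
    d = s[:r] / (s[:r] ** 2 + k)       # shrunken inverse singular values
    return Vt[:r].T @ (d * (U[:, :r].T @ y))

b_ols = rk_class(y)                    # ordinary least squares
b_orr = rk_class(y, k=0.1)             # ordinary ridge regression
b_pcr = rk_class(y, r=2)               # principal components regression
b_rk  = rk_class(y, k=0.1, r=2)        # r-k class

for name, b in [("OLS", b_ols), ("ORR", b_orr), ("PCR", b_pcr), ("r-k", b_rk)]:
    print(name, np.round(b, 3),
          "squared error:", round(float(np.sum((b - beta) ** 2)), 3))
```

Under this design the squared errors of the shrinkage estimators are typically smaller than that of OLS, in line with the mse dominance results of the paper; which of ORR and PCR does better depends on the draw, consistent with the paper's finding that neither dominates the other.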


References

  • Baye, M. R. and Parker, D. F. (1984). Combining ridge and principal component regression: A money demand illustration. Comm. Statist. A—Theory Methods, 13, 197–205.

  • Farebrother, R. W. (1972). Principal component estimators and minimum mean square error criteria in regression analysis. Rev. Econom. Statist., 54, 322–336.

  • Fomby, T. B., Hill, R. C. and Johnson, S. R. (1978). An optimality property of principal components regression. J. Amer. Statist. Assoc., 73, 191–193.

  • Hoerl, A. E. and Kennard, R. W. (1970). Ridge regression: Biased estimation for nonorthogonal problems. Technometrics, 12, 55–67.

  • Judge, G. G., Griffiths, W. E., Hill, R. C. and Lee, T. C. (1980). The Theory and Practice of Econometrics. Wiley, New York.

  • Nomura, M. and Ohkubo, T. (1985). A note on combining ridge and principal component regression. Comm. Statist. A—Theory Methods, 14, 2489–2493.

  • Rao, C. R. (1974). Linear Statistical Inference and Its Applications. Wiley Eastern Private Limited, New Delhi.

  • Vinod, H. D. (1978). A survey of ridge regression and related techniques for improvements over ordinary least squares. Rev. Econom. Statist., 60, 121–131.

  • Vinod, H. D. and Ullah, A. (1981). Recent Advances in Regression Methods. Marcel Dekker, New York.


Cite this article

Sarkar, N. Comparisons among some estimators in misspecified linear models with multicollinearity. Ann Inst Stat Math 41, 717–724 (1989). https://doi.org/10.1007/BF00057737

