Abstract
In this paper we introduce strategies for shrinking the ordinary least squares estimator of the coefficient vector in multiple regression. One strategy shrinks all components of the estimator, while the other shrinks only a subset of them. We show that the proposed shrunken estimators are admissible, and we provide theoretical results comparing their risks.
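The two strategies can be illustrated with a minimal numerical sketch. The shrinkage factor `c` and the shrunk subset below are illustrative assumptions, not the data-dependent quantities derived in the paper; the data are simulated.

```python
import numpy as np

# Simulated example data (not from the paper)
rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, 0.0, -1.0])
y = X @ beta_true + rng.normal(size=n)

# Ordinary least squares estimate of the coefficient vector
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Total shrinkage: scale every component by a factor c in (0, 1).
# The paper derives suitable factors; 0.9 is purely illustrative.
c = 0.9
beta_total = c * beta_ols

# Partial shrinkage: shrink only a chosen subset of components
# (here the last two, a hypothetical choice).
shrink_idx = [1, 2]
beta_partial = beta_ols.copy()
beta_partial[shrink_idx] *= c
```

Total shrinkage trades bias for variance in every coordinate, while partial shrinkage leaves the untouched components unbiased.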
Wencheko, E. Softly shrunk and partially shrunk rank-reduced estimation of the regression coefficients. Statistical Papers 46, 267–279 (2005). https://doi.org/10.1007/BF02762971