Abstract
We consider the linear regression model in which only a particular linear function of the dependent variables is observed. Stahlecker and Schmidt (1987) proposed a “naive” least squares (LS) estimator of the regression coefficients for this case. In this note we represent their estimator as a general ridge estimator. This observation leads to a view different from that of the previous work and provides an easy way of obtaining many important properties of the naive LS estimator. Our approach also gives some insight into the relationship between the naive LS estimator and the generalized least squares estimator.
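The setting described above can be illustrated with a minimal numerical sketch. Assume the underlying model y = Xβ + ε, of which only the aggregated observations z = Ay are available (the matrix A, the dimensions, and the data below are illustrative assumptions, not the paper's example). One natural reading of the naive LS estimator is ordinary least squares applied to the aggregated data (AX, z), while the generalized least squares (GLS) estimator accounts for the error covariance AA′ induced by the aggregation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Underlying model y = X @ beta + eps, where only z = A @ y is observed;
# A aggregates the n responses into m < n linear functions.
# (Dimensions and data are illustrative, not taken from the paper.)
n, p, m = 20, 3, 8
X = rng.normal(size=(n, p))
beta = np.array([1.0, -2.0, 0.5])
y = X @ beta + 0.1 * rng.normal(size=n)
A = rng.normal(size=(m, n))
z = A @ y

# "Naive" LS: treat (A @ X, z) as ordinary regression data and apply OLS.
AX = A @ X
beta_naive = np.linalg.solve(AX.T @ AX, AX.T @ z)

# GLS on the aggregated model z = (A @ X) beta + A @ eps, whose error
# covariance is sigma^2 * A @ A.T, so the weight matrix is (A A')^{-1}.
W = np.linalg.inv(A @ A.T)
beta_gls = np.linalg.solve(AX.T @ W @ AX, AX.T @ W @ z)

print(beta_naive)
print(beta_gls)
```

The two estimators generally differ; when the rows of A are orthonormal (so AA′ = I), the weight matrix is the identity and they coincide, which is one concrete instance of the naive-LS/GLS relationship the note studies.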
References
Alalouf IS, Styan GPH (1979) Characterizations of estimability in the general linear model. Annals of Statistics 7: 194–200
Baksalary JK, Kala R (1983) Partial ordering between matrices one of which is of rank one. Bulletin of the Polish Academy of Sciences, Mathematics: 5–7
Baksalary JK, Pordzik PR, Trenkler G (1990) A note on generalized ridge estimators. Communications in Statistics A 19: 2871–2877
Bose RC (1944) The fundamental theorem of linear estimation (abstract). Proceedings of the Thirty-first Indian Science Congress 4, Part III: 2–3
Belsley DA, Kuh E, Welsch RE (1980) Regression diagnostics. Wiley, New York
Chawla JS (1990) A note on ridge regression. Statistics and Probability Letters 9: 343–345
Draper NR, Smith H (1981) Applied regression analysis. 2nd Edition. Wiley, New York
Farebrother RW (1978) Partitioned ridge regression. Technometrics 20: 121–122
Hoerl AE, Kennard RW (1970) Ridge regression: Applications to nonorthogonal problems. Technometrics 12: 69–82
Horn RA, Johnson CR (1985) Matrix analysis. Cambridge University Press, Cambridge
Khatri CG (1968) Some results for the singular normal multivariate regression models. Sankhya 30: 267–280
Kotz S, Johnson NL (eds) (1988) Zyskind-Martin models. In: Encyclopedia of Statistical Sciences. Wiley, New York, p 683
Liski EP (1979) On reduced risk estimation in linear models. Acta Univ Tamperensis Ser A 105 Tampere
Liski EP (1982) A test of the mean square error criterion for shrinkage estimators. Communications in Statistics-Simulation Computation 11(5): 543–562
Liski EP (1988) A test of the mean square error criterion for linear admissible estimators. Communications in Statistics-Theory and Methods 17(11): 3743–3756
Marsaglia G, Styan GPH (1974) Equalities and inequalities for ranks of matrices. Linear and Multilinear Algebra 2: 269–292
Nordström K (1984) On a decomposition of the singular Gauss-Markov model. In: Calinski T, Klonecki W (eds) Linear Statistical Inference (Proceedings, Poznan 1984), pp 231–245. Lecture Notes in Statistics 35, Springer-Verlag, Berlin
Rao CR (1973) Linear statistical inference and its applications. 2nd Edition. Wiley, New York
Rao CR (1976) Estimation of parameters in a linear model (The 1975 Wald Memorial Lectures). Annals of Statistics 4: 1023–1037; Corrigendum (1979) 7: 696
Scheffé H (1959) The analysis of variance. Wiley, New York
Stahlecker P, Schmidt K (1987) On least squares estimation with a particular linear function of the dependent variable. Economics Letters 23: 59–64
Trenkler G (1985) Mean square error matrix comparisons of estimators in linear regression. Communications in Statistics 14: 2495–2509
Zyskind G, Martin FB (1969) On best linear estimation and a general Gauss-Markoff theorem in a linear model with arbitrary non-negative structure. SIAM Journal on Applied Mathematics 17: 1190–1202
Cite this article
Liski, E.P., Wang, S.G. Another look at the naive estimator in a regression model. Metrika 41, 55–64 (1994). https://doi.org/10.1007/BF01895304