
Another look at the naive estimator in a regression model

Abstract

We consider the linear regression model in which only a particular linear function of the dependent variables is observed. For this setting, Stahlecker and Schmidt (1987) proposed a “naive” least squares (LS) estimator of the regression coefficients. In this note we represent their estimator as a general ridge estimator. This observation leads to a view different from that of the previous work and provides an easy way of obtaining many important properties of the naive LS estimator. Our approach also gives some insight into the relationship between the naive LS estimator and the generalized least squares estimator.
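
For context, the following is a minimal sketch of the standard objects the abstract refers to, written in common (assumed) notation rather than the authors' own; the particular nonnegative definite matrix K that represents the naive LS estimator is derived in the paper itself.

```latex
% Linear model (assumed standard notation, not necessarily the authors'):
%   y is the n-vector of dependent variables, X the n x p design matrix.
\[
  y = X\beta + \varepsilon, \qquad \mathrm{E}(\varepsilon) = 0, \qquad
  \mathrm{Cov}(\varepsilon) = \sigma^{2} V .
\]

% General ridge estimator of beta, for a given nonnegative definite matrix K
% (ordinary Hoerl--Kennard ridge regression corresponds to K = kI, k > 0):
\[
  \hat{\beta}_{K} = (X'X + K)^{-1} X' y .
\]

% Generalized least squares (GLS) estimator, for nonsingular V:
\[
  \hat{\beta}_{\mathrm{GLS}} = (X' V^{-1} X)^{-1} X' V^{-1} y .
\]
```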


References

  • Alalouf IS, Styan GPH (1979) Characterizations of estimability in the general linear model. Annals of Statistics 7: 194–200

  • Baksalary JK, Kala R (1983) Partial ordering between matrices one of which is of rank one. Bulletin of the Polish Academy of Sciences, Mathematics: 5–7

  • Baksalary JK, Pordzik PR, Trenkler G (1990) A note on generalized ridge estimators. Communications in Statistics A 19: 2871–2877

  • Bose RC (1944) The fundamental theorem of linear estimation (abstract). Proceedings of the Thirty-first Indian Science Congress 4, Part III: 2–3

  • Belsley DA, Kuh E, Welsch RE (1980) Regression diagnostics. Wiley, New York

  • Chawla JS (1990) A note on ridge regression. Statistics and Probability Letters 9: 343–345

  • Draper NR, Smith H (1981) Applied regression analysis, 2nd edition. Wiley, New York

  • Farebrother RW (1978) Partitioned ridge regression. Technometrics 20: 121–122

  • Hoerl AE, Kennard RW (1970) Ridge regression: Applications to nonorthogonal problems. Technometrics 12: 69–82

  • Horn RA, Johnson CR (1985) Matrix analysis. Cambridge University Press, Cambridge

  • Khatri CG (1968) Some results for the singular normal multivariate regression models. Sankhya 30: 267–280

  • Kotz S, Johnson NL (eds) (1988) Zyskind-Martin models. In: Encyclopedia of statistical sciences, p 683. Wiley, New York

  • Liski EP (1979) On reduced risk estimation in linear models. Acta Univ Tamperensis Ser A 105, Tampere

  • Liski EP (1982) A test of the mean square error criterion for shrinkage estimators. Communications in Statistics - Simulation and Computation 11(5): 543–562

  • Liski EP (1988) A test of the mean square error criterion for linear admissible estimators. Communications in Statistics - Theory and Methods 17(11): 3743–3756

  • Marsaglia G, Styan GPH (1974) Equalities and inequalities for ranks of matrices. Linear and Multilinear Algebra 2: 269–292

  • Nordström K (1984) On a decomposition of the singular Gauss-Markov model. In: Calinski T, Klonecki W (eds) Linear statistical inference (Proceedings, Poznan 1984), pp 231–245. Lecture Notes in Statistics 35, Springer-Verlag, Berlin

  • Rao CR (1973) Linear statistical inference and its applications, 2nd edition. Wiley, New York

  • Rao CR (1976) Estimation of parameters in a linear model (The 1975 Wald Memorial Lectures). Annals of Statistics 4: 1023–1037; Corrigendum (1979) 7: 696

  • Scheffé H (1959) The analysis of variance. Wiley, New York

  • Stahlecker P, Schmidt K (1987) On least squares estimation with a particular linear function of the dependent variable. Economics Letters 23: 59–64

  • Trenkler G (1985) Mean square error matrix comparisons of estimators in linear regression. Communications in Statistics 14: 2495–2509

  • Zyskind G, Martin FB (1969) On best linear estimation and a general Gauss-Markoff theorem in a linear model with arbitrary nonnegative covariance structure. SIAM Journal on Applied Mathematics 17: 1190–1202

Cite this article

Liski, E.P., Wang, SG. Another look at the naive estimator in a regression model. Metrika 41, 55–64 (1994). https://doi.org/10.1007/BF01895304
