
Part of the book series: Springer Series in Statistics ((SSS))


Abstract

Suppose we have the model \(\mathbf{y} = \boldsymbol{\theta} + \boldsymbol{\varepsilon}\), where \(\mathrm{E}[\boldsymbol{\varepsilon}] = \mathbf{0}\), \(\mathrm{Var}[\boldsymbol{\varepsilon}] = \sigma^{2}\mathbf{I}_{n}\), and \(\boldsymbol{\theta} \in \varOmega\), a p-dimensional vector space. One reasonable estimate of \(\boldsymbol{\theta}\) is the value \(\hat{\boldsymbol{\theta}}\), called the least squares estimate, that minimizes the total “error” sum of squares

$$SS = \sum_{i=1}^{n}\varepsilon_{i}^{2} = \|\mathbf{y} - \boldsymbol{\theta}\|^{2}$$

subject to \(\boldsymbol{\theta} \in \varOmega\). A clue as to how we might calculate \(\hat{\boldsymbol{\theta}}\) comes from the simple case in which \(\mathbf{y}\) is a point P in three dimensions and \(\varOmega\) is a plane through the origin O. We have to find the point Q \((= \hat{\boldsymbol{\theta}})\) in the plane so that \(PQ^{2}\) is a minimum; this is obviously the case when OQ is the orthogonal projection of OP onto the plane. This idea can now be generalized in the following theorem.
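The projection idea can be illustrated numerically. In the sketch below (a hypothetical example, not from the chapter), the subspace \(\varOmega\) is taken to be the column space of a matrix `X` with full column rank; the orthogonal projection matrix is then \(\mathbf{P} = \mathbf{X}(\mathbf{X}^{\top}\mathbf{X})^{-1}\mathbf{X}^{\top}\), and \(\hat{\boldsymbol{\theta}} = \mathbf{P}\mathbf{y}\) minimizes \(\|\mathbf{y} - \boldsymbol{\theta}\|^{2}\) over \(\boldsymbol{\theta} \in \varOmega\).

```python
import numpy as np

# Hypothetical example: Omega is the column space of X (full column rank).
rng = np.random.default_rng(0)
n, p = 10, 3
X = rng.standard_normal((n, p))   # columns span Omega
y = rng.standard_normal(n)

# Orthogonal projection matrix onto the column space of X
P = X @ np.linalg.inv(X.T @ X) @ X.T

# Least squares estimate: the orthogonal projection of y onto Omega
theta_hat = P @ y

# The residual y - theta_hat is orthogonal to Omega (to every column of X),
# which characterizes theta_hat as the minimizer of ||y - theta||^2.
print(np.allclose(X.T @ (y - theta_hat), 0))  # True
print(np.allclose(P @ P, P), np.allclose(P, P.T))  # True True (P is a projection)
```

The checks at the end confirm the two defining properties of an orthogonal projection: idempotence (\(\mathbf{P}^{2} = \mathbf{P}\)) and symmetry.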




Copyright information

© 2015 Springer International Publishing Switzerland


Seber, G.A.F. (2015). Estimation. In: The Linear Model and Hypothesis. Springer Series in Statistics. Springer, Cham. https://doi.org/10.1007/978-3-319-21930-1_3
