Estimation

  • Ronald Christensen
Part of the Springer Texts in Statistics book series (STS)

Abstract

In this chapter, properties of least squares estimates are examined for the model
$$Y = X\beta + e, \qquad E(e) = 0, \qquad \operatorname{Cov}(e) = \sigma^2 I$$
The chapter begins with a discussion of the concept of estimability in linear models. Section 2 characterizes least squares estimates. Sections 3, 4, and 5 establish that least squares estimates are best linear unbiased estimates, maximum likelihood estimates, and minimum variance unbiased estimates; the last two of these properties require the additional assumption e ~ N(0, σ²I). Section 6 also assumes that the errors are normally distributed and presents the distributions of various estimates; from these distributions, various tests and confidence intervals are easily obtained. Section 7 examines the model
$$Y = X\beta + e, \qquad E(e) = 0, \qquad \operatorname{Cov}(e) = \sigma^2 V$$
where V is a known positive definite matrix. Section 7 introduces weighted least squares estimates and presents properties of those estimates. Section 8 presents the normal equations and establishes their relationship to least squares and weighted least squares estimation. Section 9 discusses Bayesian estimation.
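The least squares and weighted least squares estimates summarized above can be sketched numerically. The following is a minimal illustration (not from the chapter itself): the design matrix, true coefficients, and diagonal V below are made-up assumptions used only to show that a least squares estimate solves the normal equations X'Xb = X'Y, and that for Cov(e) = σ²V the weighted least squares estimate solves X'V⁻¹Xb = X'V⁻¹Y.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data for the model Y = X beta + e (illustrative values only).
n, p = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([1.0, 2.0, -0.5])
Y = X @ beta_true + rng.normal(size=n)

# Ordinary least squares: lstsq returns a solution of the
# normal equations X'X b = X'Y.
beta_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Weighted least squares for Cov(e) = sigma^2 V, V known and positive
# definite: solve X'V^{-1}X b = X'V^{-1}Y directly. A diagonal V is
# used here purely for simplicity.
V = np.diag(rng.uniform(0.5, 2.0, size=n))
V_inv = np.linalg.inv(V)
beta_wls = np.linalg.solve(X.T @ V_inv @ X, X.T @ V_inv @ Y)

# The defining orthogonality conditions: residuals are orthogonal to the
# columns of X (in the appropriate inner product).
print(np.allclose(X.T @ (Y - X @ beta_ols), 0, atol=1e-8))
print(np.allclose(X.T @ V_inv @ (Y - X @ beta_wls), 0, atol=1e-8))
```

Note that weighted least squares is equivalent to transforming the model by V^{-1/2} and applying ordinary least squares, which is the route Section 7 takes to carry the properties of Sections 3 through 5 over to the general covariance case.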

Keywords

Mean Square Error · Bayesian Analysis · Unbiased Estimate · Normal Equation · Prediction Interval

These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

Copyright information

© Springer Science+Business Media New York 1996

Authors and Affiliations

  • Ronald Christensen
  1. Department of Mathematics and Statistics, University of New Mexico, Albuquerque, USA