Accuracy and Inference
There is a classic result in statistics called the Gauss-Markov theorem. It describes situations under which the least-squares estimator of the slope and intercept of a regression line is optimal. A special case of this theorem describes conditions under which the sample mean is optimal among the class of weighted means. In order to justify any competitor of least squares, we must understand the Gauss-Markov theorem and why it does not rule out competing methods such as the median and other estimators to be described. So one goal in this chapter is to give a relatively nontechnical description of this theorem. Another goal is to introduce the notion of a confidence interval, a fundamental tool used to make inferences about a population of individuals or things. We will see that a so-called homoscedastic error term plays a central role in both the Gauss-Markov theorem and the conventional confidence interval used in regression. Homoscedasticity turns out to be of crucial importance in applied work because it is typically assumed and because recent results indicate that violating the homoscedasticity assumption can be disastrous. Fortunately, effective methods for dealing with this problem have been devised.
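The consequence of violating homoscedasticity can be illustrated with a small Monte Carlo sketch (not taken from the chapter; the design, sample size, and error model below are illustrative assumptions). It estimates the actual probability coverage of the conventional 95% confidence interval for the regression slope, first with a constant error variance and then with an error standard deviation that grows with x.

```python
import numpy as np

def slope_coverage(n=30, beta1=1.0, hetero=False, nrep=2000, seed=0):
    """Monte Carlo estimate of the actual probability coverage of the
    conventional 95% confidence interval for the regression slope.

    hetero=False: homoscedastic normal errors (constant variance).
    hetero=True : error standard deviation proportional to x, which
                  violates the homoscedasticity assumption.
    """
    rng = np.random.default_rng(seed)
    tcrit = 2.048                          # t quantile, .975, n - 2 = 28 df
    x = np.linspace(1.0, 5.0, n)           # fixed design points (assumed)
    sxx = np.sum((x - x.mean()) ** 2)
    hits = 0
    for _ in range(nrep):
        sd = x if hetero else np.ones(n)   # per-observation error sd
        y = 2.0 + beta1 * x + rng.normal(scale=sd)
        b1 = np.sum((x - x.mean()) * (y - y.mean())) / sxx   # LS slope
        b0 = y.mean() - b1 * x.mean()                        # LS intercept
        resid = y - (b0 + b1 * x)
        s2 = np.sum(resid ** 2) / (n - 2)  # usual common-variance estimate
        se = np.sqrt(s2 / sxx)             # conventional slope standard error
        if abs(b1 - beta1) <= tcrit * se:
            hits += 1
    return hits / nrep
```

Under homoscedastic errors, `slope_coverage(hetero=False)` comes out close to the nominal 0.95; with `hetero=True`, the conventional interval's actual coverage typically drifts away from 0.95, which is the practical danger the chapter refers to.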
Keywords: Central Limit Theorem · Probability Coverage · Normal Curve · Sample Standard Deviation · Probability Curve