# Accuracy and Inference

Rand R. Wilcox

## Abstract

There is a classic result in statistics called the Gauss-Markov theorem. It describes conditions under which the least squares estimator of the slope and intercept of a regression line is optimal. A special case of this theorem describes conditions under which the sample mean is optimal among the class of weighted means. In order to justify any competitor of least squares, we must understand the Gauss-Markov theorem and why it does not rule out competing methods such as the median, and other estimators to be described later. Therefore, one goal in this chapter is to give a relatively nontechnical description of this theorem. Another goal is to introduce the notion of a confidence interval, a fundamental tool used to make inferences about a population of individuals or things. We will see that a so-called homoscedastic error term plays a central role in both the Gauss-Markov theorem and the conventional confidence interval used in regression. Homoscedasticity turns out to be of crucial importance in applied work, both because it is typically assumed and because recent results indicate that violating the homoscedasticity assumption can be disastrous. Fortunately, effective methods for dealing with this problem have been devised.

## Keywords

Central Limit Theorem, Probability Coverage, Normal Curve, Probability Curve, Applied Work
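
The abstract's warning about heteroscedasticity can be illustrated with a small Monte Carlo sketch (not from the chapter; the design and all parameter values are my own assumptions). It generates regression data whose error variance grows with the predictor, then checks how often the conventional confidence interval for the least squares slope, derived under the homoscedasticity assumption, actually contains the true slope.

```python
# Hypothetical simulation: probability coverage of the conventional OLS slope
# confidence interval when the homoscedasticity assumption is violated.
import numpy as np

rng = np.random.default_rng(0)
n, reps, true_slope = 50, 2000, 1.0
cover = 0
for _ in range(reps):
    x = rng.uniform(0, 1, n)
    # Error standard deviation grows with x: a heteroscedastic error term.
    y = true_slope * x + rng.normal(0, 1 + 4 * x, n)
    xm = x - x.mean()
    b1 = (xm @ y) / (xm @ xm)                          # least squares slope
    resid = y - y.mean() - b1 * xm                     # residuals from the fit
    se = np.sqrt(resid @ resid / (n - 2) / (xm @ xm))  # homoscedastic standard error
    # Approximate 95% interval using the normal critical value 1.96.
    if abs(b1 - true_slope) <= 1.96 * se:
        cover += 1
coverage = cover / reps
print(coverage)  # typically below the nominal 0.95
```

Under homoscedastic errors the same interval would cover the true slope close to 95% of the time; the gap between the simulated coverage and 0.95 is one way to see why the assumption matters in applied work.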