
Statistics and Sampling Distributions

Chapter in Modern Mathematical Statistics with Applications

Part of the book series: Springer Texts in Statistics (STS)

Abstract

This chapter helps make the transition between probability and inferential statistics. Given a sample of \( n \) observations from a population, we will be calculating estimates of the population mean, median, standard deviation, and various other population characteristics (parameters).
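As a minimal illustration of the kinds of estimates described here (the data and variable names are invented for this sketch, not taken from the text), the sample mean, median, and standard deviation can be computed with Python's standard library:

```python
# Hypothetical sample of n = 8 observations from some population;
# the values below are made up purely for illustration.
import statistics

sample = [12.1, 9.8, 11.4, 10.6, 13.0, 10.2, 11.9, 9.5]

mean_hat = statistics.mean(sample)      # estimate of the population mean
median_hat = statistics.median(sample)  # estimate of the population median
sd_hat = statistics.stdev(sample)       # sample std. deviation (n - 1 denominator)

print(mean_hat, median_hat, sd_hat)
```

Note that `statistics.stdev` uses the n − 1 denominator (the usual sample standard deviation); `statistics.pstdev` would divide by n instead.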

The original version of this chapter was revised. An erratum to this chapter can be found at https://doi.org/10.1007/978-1-4614-0391-3_15



Author information

Correspondence to Jay L. Devore.

Appendix: Proof of the Central Limit Theorem

First, here is a restatement of the theorem. Let \( X_1, X_2, \ldots, X_n \) be a random sample from a distribution with mean \( \mu \) and variance \( \sigma^2 \). Then, if Z is a standard normal random variable,

$$ \lim_{n \to \infty} P\left( \frac{\overline{X} - \mu}{\sigma/\sqrt{n}} < z \right) = P(Z < z) $$

The theorem says that the distribution of the standardized \( \overline{X} \) approaches the standard normal distribution. Our proof covers only the special case in which the moment generating function exists, which also implies that all of its derivatives exist and are continuous. We will show that the moment generating function of the standardized \( \overline{X} \) approaches the moment generating function of the standard normal distribution. By itself, however, this does not imply the desired convergence of the distribution; that step requires a theorem, which we will not prove, stating that convergence of moment generating functions implies convergence of the corresponding distributions.
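The convergence asserted by the theorem can also be seen by simulation. The following sketch (an illustration only, not part of the proof; the exponential population, the sample sizes, and the choice z = 1 are arbitrary) compares an empirical estimate of \( P(Y < 1) \) for the standardized mean with \( \Phi(1) \):

```python
# Simulate the standardized sample mean Y = (Xbar - mu)/(sigma/sqrt(n))
# for an exponential(1) population (mu = 1, sigma = 1) and compare
# the empirical P(Y < 1) with the standard normal CDF at 1.
import math
import random

random.seed(1)

def standardized_mean(n, mu=1.0, sigma=1.0):
    xbar = sum(random.expovariate(1.0) for _ in range(n)) / n
    return (xbar - mu) / (sigma / math.sqrt(n))

def phi(z):
    # standard normal CDF via the error function
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

reps = 5000
for n in (5, 50, 500):
    ys = [standardized_mean(n) for _ in range(reps)]
    est = sum(y < 1.0 for y in ys) / reps   # empirical P(Y < 1)
    print(n, round(est, 3), round(phi(1.0), 3))
```

As n grows, the empirical probability settles near \( \Phi(1) \approx 0.841 \), as the theorem predicts.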

The standardized \( \overline{X} \) can be written as

$$ Y = \frac{\overline{X} - \mu}{\sigma/\sqrt{n}} = \frac{(1/n)\left[ (X_1 - \mu)/\sigma + (X_2 - \mu)/\sigma + \cdots + (X_n - \mu)/\sigma \right] - 0}{1/\sqrt{n}} $$

The mean and standard deviation for the first ratio come from the first proposition of Section 6.2, and the second ratio is algebraically equivalent to the first. It says that if we define \( W_i = (X_i - \mu)/\sigma \), \( i = 1, 2, \ldots, n \), to be the standardized versions of the \( X_i \), then the standardized \( \overline{X} \) can be written as the standardized \( \overline{W} \),

$$ Y = \frac{\overline{X} - \mu}{\sigma/\sqrt{n}} = \frac{\overline{W} - 0}{1/\sqrt{n}}. $$

This allows a simplification of the proof because we can work with the simpler variable W, which has mean 0 and variance 1. We need to obtain the moment generating function of

$$ Y = \frac{\overline{W} - 0}{1/\sqrt{n}} = \sqrt{n}\,\overline{W} = \frac{W_1 + W_2 + \cdots + W_n}{\sqrt{n}} $$

from the moment generating function M(t) of W. With the help of the Section 6.3 proposition on moment generating functions of linear combinations of independent random variables, we get \( M_Y(t) = \left[ M\!\left( t/\sqrt{n} \right) \right]^n \). We want to show that this converges to the moment generating function of a standard normal random variable, \( M_Z(t) = e^{t^2/2} \). It is easier to take the logarithm of both sides and show instead that \( \ln[M_Y(t)] = n \ln\!\left[ M\!\left( t/\sqrt{n} \right) \right] \to t^2/2 \). This is equivalent because the logarithm and its inverse are continuous functions.
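To see this convergence concretely, take the illustrative case \( W = X - 1 \) with X exponential(1), so that E(W) = 0, V(W) = 1, and \( M(t) = e^{-t}/(1 - t) \) for t < 1 (this particular W is our own example, not one from the text). Then \( n \ln[M(t/\sqrt{n})] \) should approach \( t^2/2 \):

```python
# For W = X - 1 with X ~ exponential(1): ln M(t) = -t - ln(1 - t), t < 1.
# Check numerically that n * ln M(t/sqrt(n)) -> t^2 / 2 as n grows.
import math

def log_M(t):
    # ln of the mgf of W = X - 1, X ~ Exp(1); valid for t < 1
    return -t - math.log(1 - t)

t = 0.7
for n in (10, 100, 10000):
    print(n, n * log_M(t / math.sqrt(n)))
print("limit:", t**2 / 2)
```

For n = 10000 the value is already within about 0.001 of \( t^2/2 = 0.245 \).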

The limit can be obtained from two applications of L’Hôpital’s rule if we set \( x = 1/\sqrt{n} \), so that \( \ln[M_Y(t)] = n \ln\!\left[ M\!\left( t/\sqrt{n} \right) \right] = \ln[M(tx)]/x^2 \). Both the numerator and the denominator approach 0 as n gets large and x gets small (recall that M(0) = 1 and M(t) is continuous), so L’Hôpital’s rule is applicable. Thus, differentiating the numerator and denominator with respect to x,

$$ \lim_{x \to 0} \frac{\ln[M(tx)]}{x^2} = \lim_{x \to 0} \frac{t\,M'(tx)/M(tx)}{2x} = \lim_{x \to 0} \frac{t\,M'(tx)}{2x\,M(tx)} $$

Recall that M(0) = 1, M′(0) = E(W) = 0, and that M(t) and its derivative M′(t) are continuous, so both the numerator and denominator of the limit on the right approach 0. Thus we can apply L’Hôpital’s rule again.

$$ \lim_{x \to 0} \frac{t\,M'(tx)}{2x\,M(tx)} = \lim_{x \to 0} \frac{t^2 M''(tx)}{2M(tx) + 2tx\,M'(tx)} = \frac{(1)t^2}{2(1) + 2t(0)(0)} = \frac{t^2}{2} $$

In evaluating the limit we have used the continuity of M(t) and its derivatives, together with M(0) = 1, M′(0) = E(W) = 0, and M′′(0) = E(W²) = 1. We conclude that the mgf of the standardized \( \overline{X} \) converges to the mgf of a standard normal random variable.
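As a sanity check on the values used in this evaluation, central differences applied to the same illustrative mgf \( M(t) = e^{-t}/(1 - t) \) (for W = X − 1 with X exponential(1), our own example rather than one from the text) reproduce M(0) = 1, M′(0) = 0, and M′′(0) = 1 numerically:

```python
# Verify M(0) = 1, M'(0) = 0, M''(0) = 1 by central differences
# for the mgf of W = X - 1 with X ~ exponential(1).
import math

def M(t):
    return math.exp(-t) / (1 - t)

h = 1e-4
M0 = M(0.0)
M1 = (M(h) - M(-h)) / (2 * h)           # central difference for M'(0)
M2 = (M(h) - 2 * M0 + M(-h)) / h**2     # central difference for M''(0)
print(M0, M1, M2)
```

The step size h = 1e-4 is small enough for the differences to be accurate without floating-point cancellation dominating.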


Copyright information

© 2012 Springer Science+Business Media, LLC

Cite this chapter

Devore, J.L., Berk, K.N. (2012). Statistics and Sampling Distributions. In: Modern Mathematical Statistics with Applications. Springer Texts in Statistics. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-0391-3_6