Statistics and Sampling Distributions

Modern Mathematical Statistics with Applications

Part of the book series: Springer Texts in Statistics ((STS))

Abstract

This chapter helps make the transition between probability and inferential statistics. Given a sample of n observations from a population, we will be calculating estimates of the population mean, median, standard deviation, and various other population characteristics (parameters). Prior to obtaining data, there is uncertainty as to which of all possible samples will occur.


Author information

Correspondence to Jay L. Devore.

Appendix: Proof of the Central Limit Theorem

First, here is a restatement of the theorem. Let X1, X2, …, Xn be a random sample from a distribution with mean μ and standard deviation σ. Then, if Z is a standard normal random variable,

$$ \mathop {\lim }\limits_{n \to \infty } P\left( {\frac{{\overline{X} - \mu }}{\sigma /\sqrt n } \le z} \right) = P(Z \le z) =\Phi (z) $$

The theorem says that the distribution of the standardized \( \overline{X} \) approaches the standard normal distribution. Our proof is for the special case in which the moment generating function exists, which implies also that all its derivatives exist and that they are continuous. We will show that the mgf of the standardized \( \overline{X} \) approaches the mgf of the standard normal distribution. Convergence of the mgf implies convergence of the distribution, though we will not prove that here (the mathematics is beyond the scope of this book).
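The convergence asserted by the theorem can be seen numerically. The following sketch (not from the text; the Exponential(1) population, sample size, and replication count are illustrative choices) simulates many samples, standardizes each sample mean, and compares the empirical value of P(Y ≤ z) with Φ(z):

```python
import math
import random

def phi(z):
    # Standard normal cdf, computed from the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2)))

def standardized_mean_cdf(z, n=100, reps=10000, seed=42):
    """Empirical P(Y <= z), where Y = (Xbar - mu)/(sigma/sqrt(n))
    for samples of size n from an Exponential(1) population
    (so mu = sigma = 1). All choices here are illustrative."""
    rng = random.Random(seed)
    mu, sigma = 1.0, 1.0
    count = 0
    for _ in range(reps):
        xbar = sum(rng.expovariate(1.0) for _ in range(n)) / n
        y = (xbar - mu) / (sigma / math.sqrt(n))
        if y <= z:
            count += 1
    return count / reps

for z in (-1.0, 0.0, 1.0, 1.96):
    print(z, standardized_mean_cdf(z), phi(z))
```

Even though the exponential population is strongly skewed, with n = 100 the empirical probabilities track Φ(z) closely, which is exactly what the theorem promises.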

To simplify the proof slightly, define new rvs by Wi = (Xi − μ)/σ for i = 1, 2, …, n, the standardized versions of the Xi. Then Xi = μ + σWi, from which \( \overline{X} = \mu + \sigma \overline{W} \) and we may write the standardized \( \overline{X} \) expression as

$$ Y = \frac{{\overline{X} - \mu }}{\sigma /\sqrt n } = \frac{{(\mu + \sigma \overline{W} ) - \mu }}{\sigma /\sqrt n } = \sqrt n \cdot \overline{W} = \frac{1}{\sqrt n }\sum\limits_{i = 1}^{n} {W_{i} } $$

Let MW(t) denote the common mgf of the Wi’s (since the Xi’s are iid, so are the Wi’s). We will obtain the mgf of Y in terms of MW(t); we then want to show that the mgf of Y converges to the mgf of a standard normal random variable, \( M_{Z} (t) = e^{{t^{2} /2}} \).

From the mgf properties in Section 5.3, we have the following:

$$ M_{Y} (t) = M_{{W_{1} + \cdots + W_{n} }} (t/\sqrt n ) = [M_{W} (t/\sqrt n )]^{n} $$

For the limit, we will use the fact that MW(0) = 1, a basic property of all mgfs. And, critically, because the Wi’s are standardized rvs, E(Wi) = 0 and V(Wi) = 1, from which we also have \( M_{W}^{{\prime }} (0) = E(W) = 0 \) and \( M_{W}^{{\prime \prime }} (0) = E(W^{2} ) = V(W) + [E(W)]^{2} = 1 \).
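These three facts can be checked numerically for a concrete choice of W. As a sketch (not from the text), take X ~ Exponential(1), so that W = X − 1 has the closed-form mgf MW(t) = e^{−t}/(1 − t) for t < 1, and approximate the derivatives at 0 by central differences:

```python
import math

def mgf_W(t):
    # mgf of W = X - 1 with X ~ Exponential(1): e^{-t}/(1 - t), valid for t < 1
    return math.exp(-t) / (1.0 - t)

h = 1e-5
# Central-difference approximations of M_W'(0) and M_W''(0)
d1 = (mgf_W(h) - mgf_W(-h)) / (2 * h)
d2 = (mgf_W(h) - 2 * mgf_W(0.0) + mgf_W(-h)) / (h * h)
print(mgf_W(0.0), d1, d2)  # expect values near 1, 0, and 1
```

The printed values are (to within discretization error) 1, 0, and 1, matching MW(0) = 1, E(W) = 0, and E(W²) = 1.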

To determine the limit as n → \( \infty \), we take a natural logarithm, make the substitution \( x = 1/\sqrt n \), then apply L’Hôpital’s Rule twice:

$$ \begin{aligned} \mathop {\lim }\limits_{n \to \infty } \ln [M_{Y} (t)] & = \mathop {\lim }\limits_{n \to \infty } n\ln [M_{W} (t/\sqrt n )] \\ & = \mathop {\lim }\limits_{x \to 0} \frac{\ln [M_{W} (tx)]}{x^{2}} \quad \text{substitute}\;x = 1/\sqrt n \\ & = \mathop {\lim }\limits_{x \to 0} \frac{M_{W}^{\prime } (tx) \cdot t/M_{W} (tx)}{2x} \quad \text{L'H\^opital's Rule} \\ & = \mathop {\lim }\limits_{x \to 0} \frac{tM_{W}^{\prime } (tx)}{2xM_{W} (tx)} \\ & = \mathop {\lim }\limits_{x \to 0} \frac{t^{2} M_{W}^{\prime \prime } (tx)}{2M_{W} (tx) + 2xtM_{W}^{\prime } (tx)} \quad \text{L'H\^opital's Rule} \\ & = \frac{t^{2} M_{W}^{\prime \prime } (0)}{2M_{W} (0) + 2(0)tM_{W}^{\prime } (0)} = \frac{t^{2} (1)}{2(1) + 0} = \frac{t^{2}}{2} \\ \end{aligned} $$
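The limit just derived says that [MW(t/√n)]^n → e^{t²/2}. This can also be watched happen numerically. As an illustrative sketch (not from the text), again take W = X − 1 with X ~ Exponential(1), whose mgf e^{−t}/(1 − t) is known in closed form, and evaluate MY(t) for increasing n:

```python
import math

def mgf_W(t):
    # mgf of W = X - 1 with X ~ Exponential(1): e^{-t}/(1 - t), valid for t < 1
    return math.exp(-t) / (1.0 - t)

def mgf_Y(t, n):
    # mgf of Y = (1/sqrt(n)) * (W_1 + ... + W_n): [M_W(t/sqrt(n))]^n
    return mgf_W(t / math.sqrt(n)) ** n

t = 1.0
target = math.exp(t * t / 2)  # mgf of a standard normal rv at t
for n in (10, 100, 1000, 10000):
    print(n, mgf_Y(t, n), target)
```

The successive values of MY(1) shrink steadily toward e^{1/2} ≈ 1.6487, in line with the t²/2 limit for the logarithm.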

You can verify for yourself that at each use of L’Hôpital’s Rule, the preceding fraction had the indeterminate 0/0 form. Finally, since the logarithm function and its inverse are continuous, we may conclude that \( M_{Y} (t) \to e^{{t^{2} /2}} \), which completes the proof. ■


Copyright information

© 2021 The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter

Cite this chapter

Devore, J.L., Berk, K.N., Carlton, M.A. (2021). Statistics and Sampling Distributions. In: Modern Mathematical Statistics with Applications. Springer Texts in Statistics. Springer, Cham. https://doi.org/10.1007/978-3-030-55156-8_6
