Abstract
This chapter makes the transition from probability to inferential statistics. Given a sample of n observations from a population, we calculate estimates of the population mean, median, standard deviation, and various other population characteristics (parameters). Prior to obtaining data, there is uncertainty as to which of all possible samples will occur.
Appendix: Proof of the Central Limit Theorem
First, here is a restatement of the theorem. Let \( X_1, X_2, \ldots, X_n \) be a random sample from a distribution with mean μ and standard deviation σ. Then, if Z is a standard normal random variable,
\[ \lim_{n \to \infty} P\!\left( \frac{\overline{X} - \mu}{\sigma/\sqrt{n}} \le z \right) = P(Z \le z) \quad \text{for every } z. \]
The theorem says that the distribution of the standardized \( \overline{X} \) approaches the standard normal distribution. Our proof is for the special case in which the moment generating function exists, which implies also that all its derivatives exist and that they are continuous. We will show that the mgf of the standardized \( \overline{X} \) approaches the mgf of the standard normal distribution. Convergence of the mgf implies convergence of the distribution, though we will not prove that here (the mathematics is beyond the scope of this book).
To simplify the proof slightly, define new rvs by \( W_i = (X_i - \mu)/\sigma \) for i = 1, 2, …, n, the standardized versions of the \( X_i \). Then \( X_i = \mu + \sigma W_i \), from which \( \overline{X} = \mu + \sigma \overline{W} \), and we may write the standardized \( \overline{X} \) expression as
\[ Y = \frac{\overline{X} - \mu}{\sigma/\sqrt{n}} = \frac{\sigma \overline{W}}{\sigma/\sqrt{n}} = \sqrt{n}\,\overline{W} = \frac{W_1 + \cdots + W_n}{\sqrt{n}}. \]
Let \( M_W(t) \) denote the common mgf of the \( W_i \)’s (since the \( X_i \)’s are iid, so are the \( W_i \)’s). We will obtain the mgf of Y in terms of \( M_W(t) \); we then want to show that the mgf of Y converges to the mgf of a standard normal random variable, \( M_Z(t) = e^{t^2/2} \).
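As a numerical illustration (not part of the text's proof), take each \( W_i \) to be +1 or −1 with probability 1/2 each, a standardized rv whose mgf is \( M_W(t) = \cosh t \). Since the mgf of a sum of independent rvs is the product of their mgfs, the mgf of \( Y = (W_1 + \cdots + W_n)/\sqrt{n} \) is \( [M_W(t/\sqrt{n})]^n \), which should approach \( e^{t^2/2} \) as n grows. A short Python check of this specific case:

```python
import math

def mgf_Y(t, n):
    """mgf of Y = (W1 + ... + Wn)/sqrt(n) when each Wi is +1 or -1
    with probability 1/2, so M_W(t) = cosh(t).  By independence,
    M_Y(t) = [M_W(t/sqrt(n))]^n."""
    return math.cosh(t / math.sqrt(n)) ** n

t = 1.0
target = math.exp(t ** 2 / 2)  # mgf of a standard normal rv at t
for n in (10, 1_000, 1_000_000):
    print(f"n = {n:>9}:  M_Y({t}) = {mgf_Y(t, n):.7f}   (target {target:.7f})")
```

The printed values approach \( e^{1/2} \approx 1.6487 \), and the error shrinks as n increases, consistent with the convergence the proof establishes.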
From the mgf properties in Section 5.3, we have the following:
\[ M_Y(t) = M_{(W_1 + \cdots + W_n)/\sqrt{n}}(t) = M_{W_1 + \cdots + W_n}\!\left(\frac{t}{\sqrt{n}}\right) = \left[ M_W\!\left(\frac{t}{\sqrt{n}}\right) \right]^n. \]
For the limit, we will use the fact that \( M_W(0) = 1 \), a basic property of all mgfs. Critically, because the \( W_i \)’s are standardized rvs, \( E(W_i) = 0 \) and \( V(W_i) = 1 \), from which we also have \( M_W'(0) = E(W) = 0 \) and \( M_W''(0) = E(W^2) = V(W) + [E(W)]^2 = 1 \).
To determine the limit as \( n \to \infty \), we take a natural logarithm, make the substitution \( x = 1/\sqrt{n} \) (so \( x \to 0^+ \) as \( n \to \infty \)), then apply L’Hôpital’s Rule twice:
\[ \lim_{n \to \infty} \ln M_Y(t) = \lim_{n \to \infty} n \ln M_W\!\left(\frac{t}{\sqrt{n}}\right) = \lim_{x \to 0^+} \frac{\ln M_W(tx)}{x^2} = \lim_{x \to 0^+} \frac{t\,M_W'(tx)/M_W(tx)}{2x} = \lim_{x \to 0^+} \frac{t\,M_W'(tx)}{2x} = \lim_{x \to 0^+} \frac{t^2 M_W''(tx)}{2} = \frac{t^2}{2}, \]
where the middle equality uses the continuity of \( M_W \) and \( M_W(0) = 1 \).
You can verify for yourself that at each use of L’Hôpital’s Rule, the preceding fraction had the indeterminate 0/0 form. Finally, since the logarithm function and its inverse are continuous, we may conclude that \( M_Y(t) \to e^{t^2/2} \), which completes the proof. ∎
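The theorem can also be checked empirically. Below is a minimal Monte Carlo sketch (not from the text): it draws repeated samples from an exponential population, a distribution that is far from normal, and confirms that the standardized sample mean behaves approximately like a standard normal rv. The sample size n = 50 and the number of replications are arbitrary illustrative choices.

```python
import math
import random

random.seed(42)

def standardized_mean(n, mu=1.0, sigma=1.0):
    """Draw n observations from an Exponential(1) population (mu = sigma = 1)
    and return the standardized sample mean (xbar - mu) / (sigma / sqrt(n))."""
    xbar = sum(random.expovariate(1.0) for _ in range(n)) / n
    return (xbar - mu) / (sigma / math.sqrt(n))

reps, n = 20_000, 50
ys = [standardized_mean(n) for _ in range(reps)]

# Compare with the standard normal: mean ~ 0, sd ~ 1, P(Y <= 1.96) ~ .975
mean = sum(ys) / reps
sd = math.sqrt(sum((y - mean) ** 2 for y in ys) / (reps - 1))
prop = sum(y <= 1.96 for y in ys) / reps
print(f"mean = {mean:.3f},  sd = {sd:.3f},  P(Y <= 1.96) = {prop:.3f}")
```

With n = 50 the agreement is already close; because the exponential distribution is right-skewed, the tail probability differs slightly from .975, and the approximation improves further as n grows.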
Copyright information
© 2021 The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this chapter
Devore, J.L., Berk, K.N., Carlton, M.A. (2021). Statistics and Sampling Distributions. In: Modern Mathematical Statistics with Applications. Springer Texts in Statistics. Springer, Cham. https://doi.org/10.1007/978-3-030-55156-8_6
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-55155-1
Online ISBN: 978-3-030-55156-8