Abstract
This chapter makes the transition from probability to inferential statistics. Given a sample of \( n \) observations from a population, we will be calculating estimates of the population mean, median, standard deviation, and various other population characteristics (parameters).
The original version of this chapter was revised. An erratum to this chapter can be found at https://doi.org/10.1007/978-1-4614-0391-3_15
Bibliography
Larsen, Richard, and Morris Marx, An Introduction to Mathematical Statistics and Its Applications (4th ed.), Prentice Hall, Englewood Cliffs, NJ, 2005. More limited coverage than in the book by Olkin et al., but well written and readable.
Olkin, Ingram, Cyrus Derman, and Leon Gleser, Probability Models and Applications (2nd ed.), Macmillan, New York, 1994. Contains a careful and comprehensive exposition of limit theorems.
Appendix: Proof of the Central Limit Theorem
First, here is a restatement of the theorem. Let \( X_1, X_2, \ldots, X_n \) be a random sample from a distribution with mean \( \mu \) and variance \( \sigma^2 \). Then, if \( Z \) is a standard normal random variable,
\[ \lim_{n \to \infty} P\!\left( \frac{\overline{X} - \mu}{\sigma/\sqrt{n}} \le z \right) = P(Z \le z) = \Phi(z). \]
The theorem says that the distribution of the standardized \( \overline{X} \) approaches the standard normal distribution. Our proof is only for the special case in which the moment generating function exists, which also implies that all of its derivatives exist and are continuous. We will show that the moment generating function of the standardized \( \overline{X} \) approaches the moment generating function of the standard normal distribution. By itself this does not give the desired convergence of the distributions; that step requires a theorem, which we will not prove, stating that convergence of moment generating functions implies convergence of the corresponding distributions.
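Before working through the mgf argument, a Monte Carlo sketch can make the theorem concrete. The distribution, sample sizes, and replication count below are illustrative choices, not from the text: the population is Exponential(1), which is strongly skewed, so any normality in the standardized \( \overline{X} \) must come from the limit itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Population: Exponential(1), so mu = 1 and sigma = 1; the population is
# skewed, so normality of X-bar is not built in.
mu, sigma = 1.0, 1.0

def standardized_means(n, reps=100_000):
    """Standardize the sample mean of `reps` samples, each of size n."""
    xbar = rng.exponential(1.0, size=(reps, n)).mean(axis=1)
    return (xbar - mu) / (sigma / np.sqrt(n))

# For a standard normal, P(Z <= 1) is about 0.8413; the empirical
# proportion P(standardized X-bar <= 1) should approach it as n grows.
for n in (1, 5, 30, 200):
    z = standardized_means(n)
    print(n, round(float(np.mean(z <= 1.0)), 4))
```

For small \( n \) the proportion reflects the skewed exponential shape; by \( n = 200 \) it is close to the standard normal tail area.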
The standardized \( \overline{X} \) can be written as
\[ \frac{\overline{X} - \mu}{\sigma/\sqrt{n}} = \frac{\sum_{i=1}^{n} X_i - n\mu}{\sqrt{n}\,\sigma}. \]
The mean and standard deviation for the first ratio come from the first proposition of Section 6.2, and the second ratio is algebraically equivalent to the first. If we define \( W_i \) to be the standardized \( X_i \), so \( W_i = (X_i - \mu)/\sigma \), \( i = 1, 2, \ldots, n \), then the standardized \( \overline{X} \) can be written as the standardized \( \overline{W} \):
\[ \frac{\overline{X} - \mu}{\sigma/\sqrt{n}} = \frac{1}{\sqrt{n}} \sum_{i=1}^{n} \frac{X_i - \mu}{\sigma} = \frac{1}{\sqrt{n}} \sum_{i=1}^{n} W_i = \frac{\overline{W} - 0}{1/\sqrt{n}}. \]
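The identity between the standardized \( \overline{X} \) and the standardized \( \overline{W} \) is purely algebraic, so it can be checked on a single simulated sample. The values of \( \mu \), \( \sigma \), and \( n \) below are illustrative, not from the text:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n = 10.0, 2.0, 50                  # illustrative values

x = rng.normal(mu, sigma, size=n)             # any distribution would do here

lhs = (x.mean() - mu) / (sigma / np.sqrt(n))  # standardized X-bar
w = (x - mu) / sigma                          # W_i = (X_i - mu) / sigma
rhs = np.sqrt(n) * w.mean()                   # standardized W-bar

print(bool(np.isclose(lhs, rhs)))             # prints True: equal to floating point
```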
This allows a simplification of the proof, because we can work with the simpler variable \( W \), which has mean 0 and variance 1. We need to obtain the moment generating function of
\[ Y = \frac{\overline{W} - 0}{1/\sqrt{n}} = \frac{1}{\sqrt{n}} \sum_{i=1}^{n} W_i \]
from the moment generating function \( M(t) \) of \( W \). With the help of the Section 6.3 proposition on moment generating functions of linear combinations of independent random variables, we get \( M_Y(t) = \left[ M\!\left( t/\sqrt{n} \right) \right]^n \). We want to show that this converges to the moment generating function of a standard normal random variable, \( M_Z(t) = e^{t^2/2} \). It is easier to take the logarithm of both sides and show instead that \( \ln[M_Y(t)] = n \ln\!\left[ M\!\left( t/\sqrt{n} \right) \right] \to t^2/2 \). This is equivalent because the logarithm and its inverse are continuous functions.
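The claimed convergence \( n \ln[M(t/\sqrt{n})] \to t^2/2 \) can be observed numerically for a concrete choice of \( W \). The choice below is illustrative, not from the text: \( W = X - 1 \) with \( X \sim \) Exponential(1) has mean 0, variance 1, and mgf \( M(t) = e^{-t}/(1 - t) \) for \( t < 1 \).

```python
import numpy as np

# mgf of W = X - 1, X ~ Exponential(1): M(t) = e^{-t} / (1 - t), valid for t < 1
def M(t):
    return np.exp(-t) / (1.0 - t)

t = 1.5   # any fixed t works once n is large enough that t/sqrt(n) < 1
for n in (10, 100, 10_000, 1_000_000):
    print(n, n * np.log(M(t / np.sqrt(n))))
print("t^2/2 =", t**2 / 2)
```

The printed values approach \( t^2/2 = 1.125 \); the error shrinks like \( 1/\sqrt{n} \), which is the next term in the expansion of \( \ln M \).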
The limit can be obtained from two applications of L’Hôpital’s rule if we set \( x = 1/\sqrt{n} \), so that \( \ln[M_Y(t)] = n \ln\!\left[ M\!\left( t/\sqrt{n} \right) \right] = \ln[M(tx)]/x^2 \). Both the numerator and the denominator approach 0 as \( n \) gets large and \( x \) gets small (recall that \( M(0) = 1 \) and \( M(t) \) is continuous), so L’Hôpital’s rule is applicable. Thus, differentiating the numerator and denominator with respect to \( x \),
\[ \lim_{x \to 0} \frac{\ln[M(tx)]}{x^2} = \lim_{x \to 0} \frac{t\,M'(tx)/M(tx)}{2x} = \lim_{x \to 0} \frac{t\,M'(tx)}{2x\,M(tx)}. \]
Recall that \( M(0) = 1 \), \( M'(0) = E(W) = 0 \), and that \( M(t) \) and its derivative \( M'(t) \) are continuous, so both the numerator and denominator of the limit on the right approach 0. Thus we can use L’Hôpital’s rule again:
\[ \lim_{x \to 0} \frac{t\,M'(tx)}{2x\,M(tx)} = \lim_{x \to 0} \frac{t^2 M''(tx)}{2M(tx) + 2tx\,M'(tx)} = \frac{t^2 M''(0)}{2M(0)} = \frac{t^2}{2}. \]
In evaluating the limit we have used the continuity of \( M(t) \) and its derivatives, together with \( M(0) = 1 \), \( M'(0) = E(W) = 0 \), and \( M''(0) = E(W^2) = 1 \). We conclude that the mgf converges to the mgf of a standard normal random variable.
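As a cross-check on the L’Hôpital computation, the limit \( \ln[M(tx)]/x^2 \to t^2/2 \) as \( x \to 0 \) can be evaluated symbolically for a concrete mgf. The choice is illustrative, not from the text: \( W = X - 1 \) with \( X \sim \) Exponential(1), so \( M(t) = e^{-t}/(1-t) \).

```python
import sympy as sp

t = sp.symbols('t', positive=True)
x = sp.symbols('x', positive=True)

# M(t*x) for the illustrative mgf M(s) = e^{-s}/(1 - s),
# with x playing the role of 1/sqrt(n)
M_tx = sp.exp(-t * x) / (1 - t * x)

# The limit from the two applications of L'Hopital's rule
lim = sp.limit(sp.log(M_tx) / x**2, x, 0, '+')
print(lim)
```

SymPy returns \( t^2/2 \), matching the hand computation above.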
Copyright information
© 2012 Springer Science+Business Media, LLC
Cite this chapter
Devore, J.L., Berk, K.N. (2012). Statistics and Sampling Distributions. In: Modern Mathematical Statistics with Applications. Springer Texts in Statistics. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-0391-3_6
DOI: https://doi.org/10.1007/978-1-4614-0391-3_6
Publisher Name: Springer, New York, NY
Print ISBN: 978-1-4614-0390-6
Online ISBN: 978-1-4614-0391-3
eBook Packages: Mathematics and Statistics (R0)