Approximate Central Limit Theorems

Journal of Theoretical Probability

Abstract

We refine the classical Lindeberg–Feller central limit theorem by obtaining asymptotic bounds on the Kolmogorov distance, the Wasserstein distance, and the parameterized Prokhorov distances in terms of a Lindeberg index. We thus obtain more general approximate central limit theorems, which roughly state that the row-wise sums of a triangular array are approximately asymptotically normal if the array approximately satisfies Lindeberg’s condition. This allows us to continue to provide information in nonstandard settings in which the classical central limit theorem fails to hold. Stein’s method plays a key role in the development of this theory.
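For a concrete illustration of such a statement, consider a triangular array whose first entry retains a fixed share \(\theta \) of the total variance: Lindeberg's condition then fails by exactly \(\theta \) at any level \(\epsilon < \sqrt{\theta }\), yet for small \(\theta \) the row sums remain close in Kolmogorov distance to a standard normal. The following simulation sketch is ours, not the paper's; the array and all constants are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

# Illustrative sketch (ours, not the paper's): a triangular array whose first
# entry keeps a fixed variance share theta violates Lindeberg's condition by
# theta, yet for small theta the row sum stays close to N(0,1).
rng = np.random.default_rng(0)
n, theta, reps = 500, 0.05, 20000

# xi_{n,1}: Rademacher with variance theta; xi_{n,k}, k >= 2: i.i.d. uniform
# variables scaled so that the row variances sum to 1 (Var U[-1,1] = 1/3).
atom = np.sqrt(theta) * rng.choice([-1.0, 1.0], size=reps)
rest = rng.uniform(-1.0, 1.0, size=(reps, n - 1)) * np.sqrt(3 * (1 - theta) / (n - 1))
S = atom + rest.sum(axis=1)

# Lindeberg sum at level eps: sum_k E[xi_{n,k}^2 ; |xi_{n,k}| >= eps].
# Only the atom exceeds eps here (|xi_{n,k}| <= sqrt(3(1-theta)/(n-1)) < eps
# for k >= 2), so the sum equals theta exactly.
eps = 0.1
print("Lindeberg sum at eps = 0.1:", theta)

# Two-sided Kolmogorov distance between the empirical law of S and N(0,1);
# for theta = 0.05 this should come out small, of order 1e-2.
S.sort()
Phi = norm.cdf(S)
up = np.arange(1, reps + 1) / reps
lo = np.arange(0, reps) / reps
print("estimated Kolmogorov distance:",
      max(np.abs(up - Phi).max(), np.abs(lo - Phi).max()))
```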


References

  1. Barbour, A.D., Chen, L.H.Y.: An Introduction to Stein’s Method. Singapore University Press, World Scientific Publishing Co. Pte. Ltd, Singapore/Hackensack, NJ (2005)

  2. Barbour, A.D., Hall, P.: Stein’s method and the Berry–Esseen theorem. Aust. J. Stat. 26(1), 8–15 (1984)

  3. Berckmoes, B., Lowen, R., Van Casteren, J.: Distances on probability measures and random variables. J. Math. Anal. Appl. 374(2), 412–428 (2011)

  4. Berckmoes, B., Lowen, R., Van Casteren, J.: An isometric study of the Lindeberg–Feller central limit theorem via Stein’s method. J. Math. Anal. Appl. 405(2), 484–498 (2013)

  5. Berckmoes, B., Lowen, R., Van Casteren, J.: Stein’s method and a quantitative Lindeberg CLT for the Fourier transforms of random vectors. J. Math. Anal. Appl. 433(2), 1441–1458 (2016)

  6. Billingsley, P.: Convergence of Probability Measures, 2nd edn. Wiley Series in Probability and Statistics. Wiley, New York (1999)

  7. Bolley, F.: Separability and completeness for the Wasserstein distance. In: Séminaire de Probabilités XLI. Lecture Notes in Mathematics, vol. 1934, pp. 371–377. Springer, Berlin (2008)

  8. Chen, L.H.Y., Goldstein, L., Shao, Q.-M.: Normal Approximation by Stein’s Method. Probability and Its Applications. Springer, Heidelberg (2011)

  9. Chen, L.H.Y., Shao, Q.-M.: A non-uniform Berry–Esseen bound via Stein’s method. Probab. Theory Relat. Fields 120(2), 236–254 (2001)

  10. Osipov, L.V.: A refinement of Lindeberg’s theorem. Teor. Verojatnost. i Primenen. 11, 339–342 (1966) (in Russian)

  11. Feller, W.: On the Berry–Esseen theorem. Z. Wahrscheinlichkeitstheorie und Verw. Gebiete 10, 261–268 (1968)

  12. Loh, W.Y.: On the normal approximation for sums of mixing random variables. Master’s Thesis, Department of Mathematics, University of Singapore (1975)

  13. Lowen, R.: Index Analysis: Approach Theory at Work. Springer Monographs in Mathematics. Springer, London (2015)

  14. Meckes, E.: On Stein’s method for multivariate normal approximation. In: High Dimensional Probability V: The Luminy Volume. Institute of Mathematical Statistics Collections, vol. 5, pp. 153–178. Institute of Mathematical Statistics, Beachwood, OH (2009)

  15. Nourdin, I., Peccati, G., Réveillac, A.: Multivariate normal approximation using Stein’s method and Malliavin calculus. Ann. Inst. Henri Poincaré Probab. Stat. 46(1), 45–58 (2010)

  16. Rachev, S.T.: Probability Metrics and the Stability of Stochastic Models. Wiley Series in Probability and Mathematical Statistics. Wiley, Chichester (1991)

  17. Villani, C.: Topics in Optimal Transportation. Graduate Studies in Mathematics, vol. 58. American Mathematical Society, Providence, RI (2003)

  18. Zolotarev, V.M.: Probability metrics. Teor. Veroyatnost. i Primenen. 28(2), 264–287 (1983) (in Russian)


Author information


Correspondence to Ben Berckmoes.

Additional information

Ben Berckmoes is a postdoctoral fellow at the Fund for Scientific Research of Flanders (FWO).

Geert Molenberghs gratefully acknowledges financial support from the IAP research network #P7/06 of the Belgian Government (Belgian Science Policy).

Appendix: Proof of Theorem 3.1

We follow [4], Sect. 2. Throughout, we fix a continuously differentiable \(h : \mathbb {R} \rightarrow [0,1]\) with bounded derivative, and we let \(f_h\) be its Stein transform, defined by (6). Also, we put

$$\begin{aligned} \sigma _{n,k}^2 = \mathbb {E}[\xi _{n,k}^2]. \end{aligned}$$

The following lemma is easily verified. It can be found in, e.g., [1] (pp. 10–11).

Lemma 1

\(f_h\) is twice continuously differentiable, has bounded first and second derivatives, and

$$\begin{aligned} \mathbb {E}\left[ h(\xi )\right] - h(x)= x f_h(x) - f_h^\prime (x). \end{aligned}$$
(31)
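A quick numerical check of (31) is possible; the sketch below is ours and assumes the standard integral representation of the Stein transform (definition (6) is not reproduced in this excerpt), namely \(f_h(x) = e^{x^2/2}\int _{-\infty }^{x} e^{-t^2/2}\left( h(t) - \mathbb {E}\left[ h(\xi )\right] \right) \mathrm {d}t\) with \(\xi \) standard normal. It approximates \(f_h^\prime \) by a central difference so that the check does not presuppose (31).

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Numerical sanity check of (31).  We assume the standard integral form of
# the Stein transform (definition (6) is not reproduced in this excerpt):
#   f_h(x) = exp(x^2/2) * int_{-inf}^{x} exp(-t^2/2) (h(t) - E[h(xi)]) dt,
# with xi standard normal, and approximate f_h' by a central difference so
# the check does not presuppose the Stein equation.
h = lambda t: 1.0 / (1.0 + np.exp(1.0 - t))      # a smooth h with range in (0, 1)
Eh = quad(lambda t: h(t) * norm.pdf(t), -np.inf, np.inf)[0]

def f_h(x):
    return quad(lambda t: (h(t) - Eh) * norm.pdf(t), -np.inf, x)[0] / norm.pdf(x)

def f_h_prime(x, dx=1e-4):
    return (f_h(x + dx) - f_h(x - dx)) / (2.0 * dx)

for x in (-2.0, 0.0, 1.5):
    lhs = Eh - h(x)                              # left-hand side of (31)
    rhs = x * f_h(x) - f_h_prime(x)              # right-hand side of (31)
    print(f"x = {x:5.2f}   lhs = {lhs:+.6f}   rhs = {rhs:+.6f}")
```

The two printed columns should agree up to quadrature and finite-difference error.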

The following lemma can be found in [4] (Lemma 2.4). We give the proof for completeness.

Lemma 2

Put

$$\begin{aligned} \delta _{n,k} = f_h\left( \sum _{i \ne k} \xi _{n,i} + \xi _{n,k}\right) - f_{h}\left( \sum _{i \ne k} \xi _{n,i}\right) - \xi _{n,k} f^\prime _h\left( \sum _{i \ne k} \xi _{n,i}\right) \end{aligned}$$

and

$$\begin{aligned} \epsilon _{n,k} = f^\prime _h\left( \sum _{i \ne k} \xi _{n,i} + \xi _{n,k}\right) - f^\prime _{h}\left( \sum _{i \ne k} \xi _{n,i}\right) - \xi _{n,k} f^{\prime \prime }_h\left( \sum _{i \ne k}\xi _{n,i}\right) . \end{aligned}$$

Then

$$\begin{aligned}&\mathbb {E}\left[ \left( \sum _{k=1}^n \xi _{n,k}\right) f_h\left( \sum _{k=1}^n \xi _{n,k}\right) - f_h^\prime \left( \sum _{k=1}^n \xi _{n,k}\right) \right] \nonumber \\&\quad = \sum _{k=1}^n \mathbb {E}\left[ \xi _{n,k}\delta _{n,k}\right] - \sum _{k=1}^n \sigma _{n,k}^2 \mathbb {E}\left[ \epsilon _{n,k}\right] . \end{aligned}$$
(32)

Proof

Recalling that \(\xi _{n,k}\) and \(\sum _{i \ne k} \xi _{n,i}\) are independent, \(\mathbb {E}\left[ \xi _{n,k}\right] = 0\), and \(\sum _{k=1}^n \sigma _{n,k}^2 = 1\), we get

$$\begin{aligned}&\sum _{k=1}^n \mathbb {E}\left[ \xi _{n,k} \delta _{n,k}\right] - \sum _{k=1}^n \sigma _{n,k}^2 \mathbb {E}\left[ \epsilon _{n,k}\right] \\&\quad = \sum _{k=1}^n \left( \mathbb {E}\left[ \xi _{n,k} f_{h}\left( \sum _{i = 1}^n \xi _{n,i}\right) \right] - \mathbb {E}\left[ \xi _{n,k} f_{h}\left( \sum _{i \ne k} \xi _{n,i}\right) \right] \right) \\&\qquad - \sum _{k=1}^n \mathbb {E}\left[ \xi _{n,k}^2 f^\prime _h\left( \sum _{i \ne k} \xi _{n,i}\right) \right] - \sum _{k=1}^n \sigma _{n,k}^2 \mathbb {E}\left[ f_{h}^{\prime }\left( \sum _{i=1}^n \xi _{n,i}\right) \right] \\&\qquad + \sum _{k=1}^n \mathbb {E}\left[ \xi _{n,k}^2\right] \mathbb {E}\left[ f^\prime _{h}\left( \sum _{i \ne k} \xi _{n,i}\right) \right] + \sum _{k=1}^n \sigma _{n,k}^2 \mathbb {E}\left[ \xi _{n,k} f^{\prime \prime }_h\left( \sum _{i \ne k}\xi _{n,i}\right) \right] . \end{aligned}$$

The last expression further reduces to

$$\begin{aligned}&\mathbb {E}\left[ \left( \sum _{k=1}^n\xi _{n,k}\right) f_{h}\left( \sum _{i = 1}^n \xi _{n,i}\right) \right] - \sum _{k=1}^n \mathbb {E}\left[ \xi _{n,k}\right] \mathbb {E}\left[ f_{h}\left( \sum _{i \ne k} \xi _{n,i}\right) \right] \\&\qquad - \sum _{k=1}^n \mathbb {E}\left[ \xi _{n,k}^2 f^\prime _h\left( \sum _{i \ne k} \xi _{n,i}\right) \right] - \mathbb {E}\left[ f_{h}^{\prime }\left( \sum _{i=1}^n \xi _{n,i}\right) \right] \\&\qquad + \sum _{k=1}^n \mathbb {E}\left[ \xi _{n,k}^2 f^\prime _{h}\left( \sum _{i \ne k} \xi _{n,i}\right) \right] + \sum _{k=1}^n \sigma _{n,k}^2 \mathbb {E}\left[ \xi _{n,k}\right] \mathbb {E}\left[ f^{\prime \prime }_h\left( \sum _{i \ne k}\xi _{n,i}\right) \right] , \end{aligned}$$

which, since the second and sixth terms vanish because \(\mathbb {E}\left[ \xi _{n,k}\right] = 0\) and the third and fifth terms cancel, equals

$$\begin{aligned} \mathbb {E}\left[ \left( \sum _{k=1}^n \xi _{n,k}\right) f_h\left( \sum _{k=1}^n \xi _{n,k}\right) - f_h^\prime \left( \sum _{k=1}^n \xi _{n,k}\right) \right] . \end{aligned}$$

This finishes the proof. \(\square \)
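Because the proof is pure algebra resting only on independence, \(\mathbb {E}\left[ \xi _{n,k}\right] = 0\), and \(\sum _{k=1}^n \sigma _{n,k}^2 = 1\), identity (32) can also be confirmed by exact enumeration on a toy row. The sketch below is ours; it takes \(n = 2\) with \(\xi _{n,k} = s_k/\sqrt{2}\) for independent symmetric signs \(s_k\), and again assumes the standard integral form of the Stein transform.

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm
from itertools import product

# Exact-enumeration check of (32) on a toy row: n = 2, xi_{n,k} = s_k/sqrt(2)
# for independent symmetric signs s_k, so E[xi_{n,k}] = 0 and the sigma_{n,k}^2
# sum to 1.  The Stein transform is taken in its standard integral form (an
# assumption; (6) is not reproduced here); f' and f'' come from the Stein
# equation f'(x) = x f(x) + h(x) - E[h(xi)] and its derivative.
h = lambda t: 1.0 / (1.0 + np.exp(1.0 - t))
hp = lambda t: h(t) * (1.0 - h(t))               # h' for this logistic choice
Eh = quad(lambda t: h(t) * norm.pdf(t), -np.inf, np.inf)[0]

def f(x):
    return quad(lambda t: (h(t) - Eh) * norm.pdf(t), -np.inf, x)[0] / norm.pdf(x)

def fp(x):
    return x * f(x) + h(x) - Eh                  # Stein equation

def fpp(x):
    return f(x) + x * fp(x) + hp(x)              # its derivative

xs = np.array([1.0, -1.0]) / np.sqrt(2.0)
lhs = rhs = 0.0
for idx in product([0, 1], repeat=2):            # four equally likely rows
    xi = np.array([xs[i] for i in idx])
    S = xi.sum()
    lhs += (S * f(S) - fp(S)) / 4.0              # left-hand side of (32)
    for k in range(2):
        rest = S - xi[k]
        delta = f(S) - f(rest) - xi[k] * fp(rest)
        eps_k = fp(S) - fp(rest) - xi[k] * fpp(rest)
        rhs += (xi[k] * delta - 0.5 * eps_k) / 4.0

# (32) is a purely algebraic identity in the function values, so the two
# printed numbers coincide up to floating-point error.
print(lhs, rhs)
```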

The following lemma is an application of Taylor’s theorem.

Lemma 3

For any \(a, x \in \mathbb {R}\),

$$\begin{aligned}&\left| f_h(a + x) - f_h(a) - f_h^\prime (a) x \right| \nonumber \\&\quad \le \min \left\{ \left( \sup _{x_1,x_2 \in \mathbb {R}}\left| f_h^\prime (x_1) - f_h^{\prime }(x_2)\right| \right) \left| x\right| ,\frac{1}{2} \left\| f_h^{\prime \prime }\right\| _\infty x^2\right\} . \end{aligned}$$
(33)
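For completeness, both bounds in (33) come from the integral forms of the Taylor remainder,

$$\begin{aligned} f_h(a + x) - f_h(a) - f_h^\prime (a)\, x = x \int _0^1 \left( f_h^\prime (a + tx) - f_h^\prime (a)\right) \mathrm {d}t = x^2 \int _0^1 (1 - t)\, f_h^{\prime \prime }(a + tx)\, \mathrm {d}t, \end{aligned}$$

and taking absolute values in the first and second representation yields, respectively, the first and second term in the minimum.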

We are now in a position to present a proof of Theorem 3.1.

Proof of Theorem 3.1

For each \(n\) and each \(\epsilon > 0\), we have, by (31), (32), and (33),

$$\begin{aligned}&\left| \mathbb {E}\left[ h\left( \xi \right) - h\left( \sum _{k=1}^{n}\xi _{n,k}\right) \right] \right| = \left| \mathbb {E}\left[ \left( \sum _{k=1}^n \xi _{n,k}\right) f_h\left( \sum _{k=1}^n \xi _{n,k}\right) - f_h^\prime \left( \sum _{k=1}^n \xi _{n,k}\right) \right] \right| \\&\quad \le \sum _{k=1}^n \mathbb {E}\left[ \left| \xi _{n,k}\delta _{n,k}\right| \right] + \sum _{k=1}^n \sigma _{n,k}^2 \mathbb {E}\left[ \left| \epsilon _{n,k}\right| \right] \\&\quad \le \frac{1}{2}\left\| f_h^{\prime \prime }\right\| _\infty \sum _{k=1}^n \mathbb {E}\left[ \left| \xi _{n,k}\right| ^3 ; \left| \xi _{n,k}\right| <\epsilon \right] \\&\qquad + \left( \sup _{x_1,x_2 \in \mathbb {R}} \left| f_h^\prime (x_1) - f_{h}^\prime (x_2)\right| \right) \sum _{k=1}^n \mathbb {E}\left[ \left| \xi _{n,k}\right| ^2;\left| \xi _{n,k}\right| \ge \epsilon \right] \\&\qquad + \left( \sup _{x_1, x_2 \in \mathbb {R}} \left| f_h^{\prime \prime }(x_1) - f_h^{\prime \prime } (x_2)\right| \right) \sum _{k=1}^n\sigma _{n,k}^2 \mathbb {E}\left[ \left| \xi _{n,k}\right| \right] , \end{aligned}$$

which proves the desired result since \(\sum _{k=1}^n \sigma _{n,k}^2 = 1\). \(\square \)
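To see the resulting bound in action, the following sketch (ours; all constants are illustrative, the Stein transform is again taken in its assumed integral form, and the suprema of \(f_h^{\prime \prime }\) are approximated on a compact grid, so the printed bound is itself a numerical proxy) evaluates the three terms for the i.i.d. row \(\xi _{n,k} = s_k/\sqrt{n}\) with symmetric signs, where the second term vanishes once \(\epsilon > 1/\sqrt{n}\), and compares their sum with the exactly computable error \(\left| \mathbb {E}\left[ h(\xi )\right] - \mathbb {E}\left[ h\left( \sum _k \xi _{n,k}\right) \right] \right| \).

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm, binom

# Illustration (ours; assumed constants) of the three terms in the final
# bound for the i.i.d. row xi_{n,k} = s_k/sqrt(n) with symmetric signs s_k.
h = lambda t: 1.0 / (1.0 + np.exp(1.0 - t))
hp = lambda t: h(t) * (1.0 - h(t))
Eh = quad(lambda t: h(t) * norm.pdf(t), -np.inf, np.inf)[0]

def f(x):
    return quad(lambda t: (h(t) - Eh) * norm.pdf(t), -np.inf, x)[0] / norm.pdf(x)

grid = np.linspace(-4.0, 4.0, 161)
fv = np.array([f(x) for x in grid])
fpv = grid * fv + h(grid) - Eh            # Stein equation: f' = x f + h - E[h(xi)]
fppv = fv + grid * fpv + hp(grid)         # its derivative: f'' = f + x f' + h'
sup_fpp = np.abs(fppv).max()              # grid proxy for ||f''||_inf
osc_fpp = fppv.max() - fppv.min()         # grid proxy for sup |f''(x1) - f''(x2)|

n, eps = 100, 0.5                          # |xi_{n,k}| = 1/sqrt(n) < eps always
term1 = 0.5 * sup_fpp * n * n ** -1.5      # (1/2)||f''|| sum_k E[|xi|^3; |xi| < eps]
term2 = 0.0                                # truncated second moments all vanish
term3 = osc_fpp * n ** -0.5                # oscillation of f'' times sum_k sigma^2 E|xi|
bound = term1 + term2 + term3

# The row sum is a rescaled Binomial(n, 1/2), so E[h(S_n)] is exact.
b = np.arange(n + 1)
EhS = np.sum(binom.pmf(b, n, 0.5) * h((2 * b - n) / np.sqrt(n)))
print("actual error:", abs(Eh - EhS), "   bound:", bound)
```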

Cite this article

Berckmoes, B., Molenberghs, G. Approximate Central Limit Theorems. J Theor Probab 31, 1590–1605 (2018). https://doi.org/10.1007/s10959-017-0744-6
