
Fisher information and the central limit theorem

  • Sergey G. Bobkov
  • Gennadiy P. Chistyakov
  • Friedrich Götze

Abstract

An Edgeworth-type expansion is established for the relative Fisher information distance to the class of normal distributions for sums of i.i.d. random variables satisfying moment conditions. The validity of the central limit theorem is studied via properties of the Fisher information along convolutions.
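For orientation, the following is a minimal sketch of the quantities involved, under the usual standardization assumptions (summands with mean 0 and variance 1); the notation Z_n and the reduction of the distance to the class of normal distributions to the distance to a single standard normal Z are spelled out here for illustration only.

\[
I(X \,\|\, Z) \;=\; \int_{-\infty}^{\infty} \Bigl( \frac{p'(x)}{p(x)} - \frac{\varphi'(x)}{\varphi(x)} \Bigr)^{2} p(x)\, dx
\;=\; \int_{-\infty}^{\infty} \Bigl( \frac{p'(x)}{p(x)} + x \Bigr)^{2} p(x)\, dx,
\qquad
\varphi(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^{2}/2},
\]

where p denotes the density of the standardized random variable X and Z is a standard normal random variable with density \varphi. For i.i.d. summands X_1, X_2, \ldots with mean 0 and variance 1, the normalized sums are

\[
Z_n = \frac{X_1 + \cdots + X_n}{\sqrt{n}},
\]

and the central limit theorem in this metric amounts to I(Z_n \,\|\, Z) \to 0 as n \to \infty; the Edgeworth-type expansion mentioned in the abstract refines this by describing the rate of decay of I(Z_n \,\|\, Z) under moment conditions.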

Keywords

Entropy · Entropic distance · Central limit theorem · Edgeworth-type expansions

Mathematics Subject Classification (1991)

Primary 60E

Acknowledgments

The authors would like to thank the referee for careful reading of the manuscript and useful comments.


Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Sergey G. Bobkov (1)
  • Gennadiy P. Chistyakov (2)
  • Friedrich Götze (2)
  1. School of Mathematics, University of Minnesota, Minneapolis, USA
  2. Fakultät für Mathematik, Universität Bielefeld, Bielefeld, Germany
