Probability Theory and Related Fields, Volume 129, Issue 3, pp 381–390

On the rate of convergence in the entropic central limit theorem

  • Shiri Artstein
  • Keith M. Ball
  • Franck Barthe
  • Assaf Naor

Abstract.

We study the rate at which entropy is produced by linear combinations of independent random variables which satisfy a spectral gap condition.
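The one-sentence abstract can be unpacked slightly. The sketch below is my summary of the standard entropic-CLT framework, not text quoted from the paper: it defines differential entropy, the normalized sums, and the spectral gap (Poincaré) condition; the O(1/n) rate at the end is the benchmark this line of work targets and should be read as an assumption of this sketch rather than a statement of the paper's theorem.

```latex
% Differential entropy of a random variable X with density f
\[
  \mathrm{Ent}(X) \;=\; -\int_{\mathbb{R}} f(x)\,\log f(x)\,dx .
\]
% For i.i.d. copies X_1, X_2, \dots of X with mean 0 and variance 1,
% consider the normalized sums
\[
  S_n \;=\; \frac{X_1 + \cdots + X_n}{\sqrt{n}} .
\]
% The entropic central limit theorem (Barron) states that
% \mathrm{Ent}(S_n) increases to the entropy of a standard Gaussian G.
% A spectral gap (Poincar\'e) condition with constant c > 0 requires,
% for all smooth functions g,
\[
  c \,\mathrm{Var}\bigl(g(X)\bigr) \;\le\; \mathbb{E}\bigl[g'(X)^2\bigr].
\]
% Under such a condition one studies how fast the entropy deficit
\[
  D(S_n) \;=\; \mathrm{Ent}(G) - \mathrm{Ent}(S_n)
\]
% decreases in n; a rate of order O(1/n) is the target rate here
% (an assumption of this sketch, not a quotation from the paper).
```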

Keywords

Entropy · Linear Combination · Limit Theorem · Central Limit · Central Limit Theorem


Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Shiri Artstein (1)
  • Keith M. Ball (2)
  • Franck Barthe (3)
  • Assaf Naor (4)

  1. School of Mathematical Sciences, Tel Aviv University, Tel Aviv, Israel
  2. Department of Mathematics, University College London, London, United Kingdom
  3. Institut de Mathématiques, Laboratoire de Statistiques et Probabilités (CNRS UMR), Toulouse cedex, France
  4. Theory Group, Microsoft Research, USA
