Journal of Theoretical Probability, Volume 22, Issue 1, pp. 186–202

Convergence of Markov Chains in Information Divergence

  • Peter Harremoës
  • Klaus Kähler Holst

Abstract

Information-theoretic methods are used to prove convergence in information divergence of reversible Markov chains. Some ergodic theorems for information divergence are also proved.
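
For orientation, the central quantity is the information divergence (relative entropy); the definition below is the standard one and is not quoted from this preview. For distributions P and Q on a common countable space,

  \[ D(P \,\|\, Q) \;=\; \sum_{x} P(x) \log \frac{P(x)}{Q(x)} . \]

If \pi is a stationary distribution of the chain and P_n denotes the distribution at time n, the data-processing inequality already implies that D(P_n \,\|\, \pi) is non-increasing in n; convergence results of the kind announced in the abstract concern when, under reversibility,

  \[ D(P_n \,\|\, \pi) \;\longrightarrow\; 0 \qquad (n \to \infty), \]

typically under the hypothesis that the initial divergence D(P_0 \,\|\, \pi) is finite. That finiteness condition is stated here as a standard assumption for concreteness; the paper's exact hypotheses are not visible in this preview.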

Keywords

Information divergence · Increasing information · Decreasing information · Markov chain · Reversible Markov chain · Ergodic theorems

Mathematics Subject Classification (2000)

60J10 · 94A15 · 60B11 · 60F15

Copyright information

© Springer Science+Business Media, LLC 2007

Authors and Affiliations

  1. Quantum Computing and Advanced Systems Research, Centre for Mathematics and Computer Science (CWI), Amsterdam, The Netherlands
  2. Department of Biostatistics, University of Copenhagen, Copenhagen, Denmark
