
Entropy of Hidden Markov Processes via Cycle Expansion

Published in: Journal of Statistical Physics

Abstract

Hidden Markov processes (HMPs) are among the basic tools of modern probabilistic modeling. The characterization of their entropy, however, remains an open problem. Here the entropy of an HMP is calculated via the cycle expansion of the zeta function, a method adopted from the theory of dynamical systems. For a class of HMPs this method produces exact results for both the entropy and the moment-generating function. The latter allows one to estimate, via the Chernoff bound, the probabilities of large deviations for the HMP. More generally, the method offers a representation of the moment-generating function and of the entropy via convergent series.
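The entropy rate the abstract refers to is h = -lim (1/n) E[ln P(x_1…x_n)], where the word probabilities P(x_1…x_n) are given by products of transfer matrices. The following is a minimal numerical sketch of that quantity — estimated by exact enumeration of finite blocks, not by the paper's cycle expansion — for a small binary HMP whose transition and emission matrices are illustrative assumptions, not taken from the article:

```python
import itertools
import numpy as np

# Illustrative binary HMP (matrices are assumptions, not from the paper):
# hidden chain with transition matrix A, emissions P(x | state) in B.
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
B = np.array([[0.95, 0.05],
              [0.10, 0.90]])

# Stationary distribution of the hidden chain: left Perron eigenvector of A.
w, v = np.linalg.eig(A.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

def word_prob(word):
    """P(x_1..x_n) via the forward recursion v <- (v @ A) * B[:, x]."""
    v = pi * B[:, word[0]]
    for x in word[1:]:
        v = (v @ A) * B[:, x]
    return v.sum()

def block_entropy(n):
    """H_n = -sum_x P(x) ln P(x), summed over all 2^n binary words."""
    return -sum(p * np.log(p)
                for word in itertools.product([0, 1], repeat=n)
                for p in [word_prob(word)])

# The entropy rate is h = lim_n (H_n - H_{n-1}); for this rapidly mixing
# chain the successive differences converge quickly.
h_est = block_entropy(10) - block_entropy(9)
print(f"estimated entropy rate: {h_est:.6f} nats/symbol")
```

This brute-force estimate costs O(2^n) and only illustrates the target quantity; the point of the cycle-expansion method is precisely to replace such enumeration with a rapidly convergent series over short periodic sequences.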



Author information

Correspondence to Armen E. Allahverdyan.


About this article

Cite this article

Allahverdyan, A.E. Entropy of Hidden Markov Processes via Cycle Expansion. J Stat Phys 133, 535–564 (2008). https://doi.org/10.1007/s10955-008-9613-0

