Journal of Statistical Physics, Volume 121, Issue 3–4, pp 343–360

The Entropy of a Binary Hidden Markov Process

Abstract

The entropy of a binary symmetric Hidden Markov Process is calculated as an expansion in the noise parameter ε. We map the problem onto a one-dimensional Ising model in a large field of random signs and calculate the expansion coefficients up to second order in ε. Using a conjecture, we extend the calculation to 11th order and discuss the convergence of the resulting series.
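For readers who want a numerical point of comparison, the entropy rate of such a process can also be estimated by simulation. The sketch below is illustrative only and is not the analytic expansion developed in the paper: it assumes a hidden chain that flips with probability p at each step and an observation channel that flips each symbol with probability ε; the function name entropy_rate_mc and the Monte Carlo estimator are our own choices.

```python
import numpy as np

def entropy_rate_mc(p, eps, n=200_000, seed=0):
    """Monte Carlo estimate (in bits per symbol) of the entropy rate of a
    binary symmetric hidden Markov process.

    Hidden chain:  X_t flips with probability p at each step.
    Observations:  Y_t equals X_t with probability 1 - eps, else flipped.

    The estimate is -(1/n) * log2 P(y_1..y_n) for one long simulated
    sequence, with P(y_1..y_n) computed by the forward recursion; by the
    Shannon-McMillan-Breiman theorem this converges to the entropy rate.
    """
    rng = np.random.default_rng(seed)

    # Simulate the hidden chain and its noisy observations.
    x = np.empty(n, dtype=int)
    x[0] = rng.integers(2)
    flips = rng.random(n) < p
    for t in range(1, n):
        x[t] = x[t - 1] ^ flips[t]
    y = x ^ (rng.random(n) < eps)

    # Forward recursion with per-step normalization; the normalizers
    # multiply up to P(y_1..y_n), so their logs accumulate to log P.
    T = np.array([[1 - p, p], [p, 1 - p]])          # transition matrix
    E = np.array([[1 - eps, eps], [eps, 1 - eps]])  # emission matrix
    alpha = np.array([0.5, 0.5]) * E[:, y[0]]
    logp = np.log2(alpha.sum())
    alpha /= alpha.sum()
    for t in range(1, n):
        alpha = (T.T @ alpha) * E[:, y[t]]
        logp += np.log2(alpha.sum())
        alpha /= alpha.sum()
    return -logp / n
```

As a sanity check, at ε = 0 the estimate should approach the entropy rate of the underlying Markov chain itself, H(p) = −p log2 p − (1−p) log2(1−p), and for small ε it should track the low-noise expansion discussed in the paper.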

Keywords

Hidden Markov Process, entropy, random-field Ising model



Copyright information

© Springer Science+Business Media, Inc. 2005

Authors and Affiliations

  1. Department of Physics of Complex Systems, Weizmann Institute of Science, Rehovot, Israel
  2. Department of Physics, Bar Ilan University, Ramat Gan, Israel
