Thermodynamic Gibbs Formalism and Information Theory

Conference paper
Part of the Mathematics for Industry book series (MFI, volume 1)

Abstract

Links between Information Theory and Thermodynamics are well known. The concept of entropy, introduced by C. Shannon in 1948 in the groundbreaking work that gave birth to Information Theory, originates in Statistical Mechanics. Over the past 60 years, many further links have been found, and they have led to new results. Information Theory has been remarkably successful in applying probabilistic methods to problems such as data compression, prediction, classification, and the coding of information sources. Most approaches and algorithms can be classified as causal, or uni-directional: the data are processed sequentially in one direction, and the distribution of the present given the past is used for prediction or classification. Recently, however, novel approaches have been proposed in Information Theory. It turns out that non-causal (bi-directional) approaches, i.e., those that take into account the influence of the future as well as of the past, lead to very interesting and often superior solutions to problems such as denoising and classification. The theory of Gibbs states in Statistical Mechanics (the so-called Gibbs formalism) provides the right framework for treating stochastic processes in a non-causal way. We discuss specific information-theoretic algorithms based on the Gibbs formalism.
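To fix ideas, the causal/non-causal distinction can be written out in standard notation (a schematic sketch in the usual notation of the Gibbs formalism, not a formula taken from this paper). A causal model specifies the law of a process through one-sided conditional probabilities, a so-called g-function,

\[ g(x_0 \mid x_{-\infty}^{-1}) = \mathbb{P}\big(X_0 = x_0 \mid X_{-1} = x_{-1},\, X_{-2} = x_{-2}, \dots\big), \]

whereas a Gibbs specification conditions on the past and the future simultaneously,

\[ \mathbb{P}\big(X_0 = x_0 \mid X_n = x_n,\ n \neq 0\big) \;\propto\; \exp\Big(-\sum_{A \ni 0} \Phi_A(x_A)\Big), \]

where \(\Phi = (\Phi_A)\) is an interaction potential indexed by the finite sets of sites \(A\) containing the origin.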
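To make the bi-directional idea concrete in a denoising setting, the following is a minimal Python sketch of a two-sided context denoiser in the spirit of DUDE-style universal discrete denoisers: each symbol is replaced by the majority symbol observed in the same left/right context. The majority-vote rule is a deliberate simplification (a full denoiser would also account for the noise channel), and the function name and parameters are illustrative, not taken from this paper.

from collections import Counter, defaultdict

def bidirectional_denoise(z, k=2):
    # z: a sequence of symbols (string or list); k: one-sided context length.
    # First pass: for every two-sided context (k symbols to the left,
    # k symbols to the right), count which center symbols occur in it.
    n = len(z)
    counts = defaultdict(Counter)
    for i in range(k, n - k):
        ctx = (tuple(z[i - k:i]), tuple(z[i + 1:i + k + 1]))
        counts[ctx][z[i]] += 1
    # Second pass: replace each symbol by the majority symbol of its
    # context; the k boundary symbols on each side are left unchanged.
    out = list(z)
    for i in range(k, n - k):
        ctx = (tuple(z[i - k:i]), tuple(z[i + 1:i + k + 1]))
        out[i] = counts[ctx].most_common(1)[0][0]
    return "".join(out) if isinstance(z, str) else out

# An isolated flip inside a regular pattern gets corrected:
print(bidirectional_denoise("abababababcbababababab"))  # -> "ababababababababababab"

Here the lone 'c' is restored to 'a' because every other occurrence of the two-sided context ('ab', 'ba') has an 'a' in the middle; a causal (left-to-right) rule with the same context length could not exploit the right-hand context at all.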

Keywords

Information theory, Gibbs states, Thermodynamic formalism


Copyright information

© Springer Japan 2014

Authors and Affiliations

  1. Johann Bernoulli Institute for Mathematics and Computer Science, University of Groningen, Groningen, The Netherlands
  2. Nedap N.V., Groenlo, The Netherlands
  3. Mathematical Institute, Leiden University, Leiden, The Netherlands
