
Journal of Statistical Physics, Volume 115, Issue 3–4, pp. 949–976

Multi-Information in the Thermodynamic Limit

  • Ionas Erb
  • Nihat Ay

Abstract

Multi-information, a multivariate generalization of mutual information, is defined in the thermodynamic limit. The definition accounts for phase coexistence by taking the infimum over the translation-invariant Gibbs measures of an interaction potential, and it is shown that this infimum is attained in a pure state. An explicit formula is obtained for the Ising square lattice, where the quantity is proved to be maximized at the phase-transition point. In this way, phase coexistence is rigorously linked to high model complexity.
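
As a brief orientation (in our own notation, not quoted from the paper): the finite-system multi-information of random variables X_1, …, X_n is

\[
  I(X_1,\dots,X_n) \;=\; \sum_{i=1}^{n} H(X_i) \;-\; H(X_1,\dots,X_n) \;\ge\; 0,
\]

which reduces to the ordinary mutual information for n = 2 and vanishes exactly when the variables are independent. For a translation-invariant measure \mu on \mathbb{Z}^d, a natural per-site analogue, which the thermodynamic-limit definition plausibly refers to, is

\[
  I(\mu) \;=\; \lim_{\Lambda \nearrow \mathbb{Z}^d} \frac{1}{|\Lambda|}
  \Big( \sum_{i \in \Lambda} H(\mu_i) \;-\; H(\mu_\Lambda) \Big)
  \;=\; H(\mu_0) \;-\; h(\mu),
\]

where \mu_\Lambda is the marginal on the finite volume \Lambda, \mu_0 the single-site marginal, H the Shannon entropy, and h(\mu) the entropy density. As stated in the abstract, the multi-information of an interaction potential is then the infimum of I(\mu) over its translation-invariant Gibbs measures.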

Keywords: mutual information · Ising model · phase transitions · excess entropy · complexity



Copyright information

© Plenum Publishing Corporation 2004

Authors and Affiliations

  • Ionas Erb (1, 2)
  • Nihat Ay (2, 3, 4)
  1. Bioinformatik, Institut für Informatik, University of Leipzig, Leipzig, Germany
  2. Max-Planck Institute for Mathematics, Leipzig, Germany
  3. Santa Fe Institute, Santa Fe, New Mexico
  4. Mathematical Institute, Friedrich-Alexander University Erlangen-Nuremberg, Erlangen, Germany
