
Multi-Information in the Thermodynamic Limit

I. Erb and N. Ay · Journal of Statistical Physics 115, 949–976 (2004)

Abstract

A multivariate generalization of mutual information, the multi-information, is defined in the thermodynamic limit. The definition accounts for phase coexistence by taking the infimum over the translation-invariant Gibbs measures of an interaction potential, and this infimum is shown to be attained in a pure state. For the Ising square lattice an explicit formula is derived, and the quantity is proved to be maximized at the phase-transition point. In this way, phase coexistence is rigorously linked to high model complexity.
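For orientation, here is a minimal sketch of the quantity involved; the notation is ours, not necessarily the paper's. For random variables X_1, ..., X_n with joint distribution p and marginals p_i, the multi-information is

    I(X_1, \ldots, X_n) = \sum_{i=1}^{n} H(X_i) - H(X_1, \ldots, X_n) = D(p \,\|\, p_1 \otimes \cdots \otimes p_n),

the Kullback-Leibler divergence of the joint distribution from the product of its marginals, which reduces to the ordinary mutual information for n = 2. Normalizing per site, a translation-invariant measure \mu on the lattice plausibly yields

    I(\mu) = H(\mu_0) - s(\mu),

where H(\mu_0) is the entropy of the single-site marginal and s(\mu) is the entropy per site; the definition described in the abstract then corresponds to taking \inf_\mu I(\mu) over the translation-invariant Gibbs measures of the interaction potential.

As a toy illustration only (not the paper's two-dimensional computation), the finite-size multi-information per site of a zero-field Ising ring can be evaluated by exact enumeration; the function and parameter names below are ours:

    import numpy as np
    from itertools import product

    def multi_information_per_site(N=10, beta=0.5):
        """Finite-size multi-information per site, I/N = H(site) - H(joint)/N,
        for a zero-field Ising ring of N spins (exact enumeration)."""
        configs = np.array(list(product([-1, 1], repeat=N)))
        # nearest-neighbour ring energy: E(s) = -sum_i s_i s_{i+1}
        energy = -np.sum(configs * np.roll(configs, -1, axis=1), axis=1)
        weights = np.exp(-beta * energy)
        p = weights / weights.sum()          # joint Gibbs distribution
        H_joint = -np.sum(p * np.log(p))     # joint Shannon entropy (nats)
        p_up = p[configs[:, 0] == 1].sum()   # single-site marginal of spin 0
        H_site = -(p_up * np.log(p_up) + (1.0 - p_up) * np.log(1.0 - p_up))
        # by translation invariance on the ring, sum_i H(X_i) = N * H(X_0)
        return H_site - H_joint / N

    for beta in (0.1, 0.5, 1.0, 2.0):
        print(beta, multi_information_per_site(beta=beta))

For this finite ring the quantity increases monotonically with beta, as there is no phase transition in one dimension; the paper's point is that on the square lattice the analogous per-site quantity instead attains its maximum exactly at the critical coupling.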





Cite this article

Erb, I., Ay, N. Multi-Information in the Thermodynamic Limit. Journal of Statistical Physics 115, 949–976 (2004). https://doi.org/10.1023/B:JOSS.0000022375.49904.ea
