Abstract
A multivariate generalization of mutual information, multi-information, is defined in the thermodynamic limit. The definition takes phase coexistence into account by taking the infimum over the translation-invariant Gibbs measures of an interaction potential. It is shown that this infimum is attained in a pure state. An explicit formula is obtained for the Ising square lattice, where the quantity is proved to be maximized at the phase-transition point. In this way, phase coexistence is linked rigorously to high model complexity.
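For a finite collection of random variables, multi-information is the sum of the marginal entropies minus the joint entropy; it vanishes exactly when the variables are independent. A minimal sketch of this finite-dimensional quantity (the paper's thermodynamic-limit definition takes the analogous per-site limit over growing boxes, which this toy computation does not attempt):

```python
import math

def entropy(p):
    """Shannon entropy (in nats) of a probability dict {outcome: prob}."""
    return -sum(q * math.log(q) for q in p.values() if q > 0)

def multi_information(joint):
    """I(X_1,...,X_n) = sum_i H(X_i) - H(X_1,...,X_n), where the joint
    distribution is given as {(x_1,...,x_n): prob}."""
    n = len(next(iter(joint)))
    marginals = []
    for i in range(n):
        m = {}
        for outcome, q in joint.items():
            m[outcome[i]] = m.get(outcome[i], 0.0) + q
        marginals.append(m)
    return sum(entropy(m) for m in marginals) - entropy(joint)

# Two perfectly correlated spins: I = 2*log(2) - log(2) = log(2)
correlated = {(+1, +1): 0.5, (-1, -1): 0.5}
print(multi_information(correlated))  # ≈ 0.6931 (= log 2)

# Two independent fair spins: I = 0
independent = {(s, t): 0.25 for s in (+1, -1) for t in (+1, -1)}
print(multi_information(independent))  # ≈ 0.0
```

The correlated pair attains the maximal value log 2 for two binary variables, illustrating the paper's theme that strong dependence (as at phase coexistence) corresponds to large multi-information.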
Erb, I., Ay, N. Multi-Information in the Thermodynamic Limit. Journal of Statistical Physics 115, 949–976 (2004). https://doi.org/10.1023/B:JOSS.0000022375.49904.ea