Journal of Statistical Physics, Volume 60, Issue 5, pp 823–837

Mutual information functions versus correlation functions


  • Wentian Li
    • Center for Complex Systems Research, Physics Department, Beckman Institute, University of Illinois
    • Department of Physics, Columbia University

DOI: 10.1007/BF01025996

Cite this article as:
Li, W. J Stat Phys (1990) 60: 823. doi:10.1007/BF01025996


This paper studies one application of mutual information to symbolic sequences: the mutual information function M(d). This function is compared with the more frequently used correlation function Γ(d). An exact relation between M(d) and Γ(d) is derived for binary sequences. For sequences with more than two symbols, no such general relation exists; in particular, Γ(d)=0 may or may not imply M(d)=0. This situation, in which symbols separated by a distance d are linearly independent but not generally independent, is studied for ternary sequences. Also included is an estimate of the finite-size effect on the calculation of mutual information. Finally, the concept of “symbolic noise” is discussed.

Key words

Mutual information function, correlation functions, linear and general dependence, symbolic noise

Copyright information

© Plenum Publishing Corporation 1990