Journal of Statistical Physics, Volume 60, Issue 5, pp 823–837

Mutual information functions versus correlation functions

  • Wentian Li

DOI: 10.1007/BF01025996

Cite this article as:
Li, W. J Stat Phys (1990) 60: 823. doi:10.1007/BF01025996

Abstract

This paper studies one application of mutual information to symbolic sequences: the mutual information function M(d). This function is compared with the more frequently used correlation function Γ(d). An exact relation between M(d) and Γ(d) is derived for binary sequences. For sequences with more than two symbols, no such general relation exists; in particular, Γ(d)=0 may or may not imply M(d)=0. This case of linear independence without general independence between symbols separated by a given distance is studied for ternary sequences. Also included is an estimation of the finite-size effect on calculating mutual information. Finally, the concept of "symbolic noise" is discussed.
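The two quantities contrasted in the abstract can be computed directly from a symbolic sequence: M(d) is the mutual information between the symbol distributions at positions i and i+d, while Γ(d) is the covariance of the (numerically coded) symbols at the same separation. The sketch below is an illustrative implementation, not the paper's own code; the function names and the 0/1 coding of the binary alphabet are assumptions for the example.

```python
import math
from collections import Counter

def mutual_information(seq, d):
    """M(d): mutual information (in bits) between symbols at distance d.

    Estimated from the empirical joint distribution of pairs
    (seq[i], seq[i+d]); subject to the finite-size bias the paper
    discusses, so long sequences are needed for small M(d).
    """
    pairs = list(zip(seq, seq[d:]))
    n = len(pairs)
    joint = Counter(pairs)                 # empirical p_ab(d), unnormalized
    left = Counter(a for a, _ in pairs)    # marginal of the first symbol
    right = Counter(b for _, b in pairs)   # marginal of the second symbol
    m = 0.0
    for (a, b), count in joint.items():
        # p_ab * log2( p_ab / (p_a * p_b) ), with the n's combined
        m += (count / n) * math.log2(count * n / (left[a] * right[b]))
    return m

def correlation(seq, d):
    """Gamma(d): covariance of numeric symbol values at distance d."""
    x = seq[:len(seq) - d]
    y = seq[d:]
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    return sum(xi * yi for xi, yi in zip(x, y)) / n - mx * my

# A strictly alternating binary sequence: the symbol at distance 1 is
# fully determined, so M(1) is close to 1 bit, and Gamma(1) is negative.
seq = [0, 1] * 100
print(mutual_information(seq, 1))  # ≈ 1 bit
print(correlation(seq, 1))         # negative (anticorrelated)
```

For binary sequences the two measures rise and fall together, consistent with the exact relation the paper derives; the interesting divergence appears only for alphabets of three or more symbols, where Γ(d) can vanish while M(d) stays positive.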

Key words

Mutual information function · correlation functions · linear and general dependence · symbolic noise

Copyright information

© Plenum Publishing Corporation 1990

Authors and Affiliations

  • Wentian Li (1, 2, 3)
  1. Center for Complex Systems Research, Physics Department, Beckman Institute, University of Illinois, Urbana
  2. Department of Physics, Columbia University, New York
  3. Santa Fe Institute, Santa Fe