Journal of Statistical Physics, Volume 60, Issue 5, pp. 823-837

Mutual information functions versus correlation functions

  • Wentian Li, Center for Complex Systems Research, Physics Department, Beckman Institute, University of Illinois; Department of Physics, Columbia University



This paper studies one application of mutual information to symbolic sequences: the mutual information function M(d). This function is compared with the more frequently used correlation function Γ(d). An exact relation between M(d) and Γ(d) is derived for binary sequences. For sequences with more than two symbols, no such general relation exists; in particular, Γ(d)=0 may or may not lead to M(d)=0. This linear, but not general, independence between symbols separated by a distance is studied for ternary sequences. Also included is the estimation of the finite-size effect on calculating mutual information. Finally, the concept of “symbolic noise” is discussed.
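The two quantities compared in the abstract can be estimated directly from a symbol sequence. The sketch below is an illustration, not the paper's code: it uses plug-in (empirical-frequency) estimates, so for short sequences it is subject to exactly the finite-size bias the paper discusses. Function names and the choice of base-2 logarithms are assumptions.

```python
import math

def mutual_information(seq, d):
    """Plug-in estimate of M(d): mutual information between symbols
    at positions i and i+d, from empirical pair frequencies."""
    n = len(seq) - d
    joint, left, right = {}, {}, {}
    for i in range(n):
        a, b = seq[i], seq[i + d]
        joint[(a, b)] = joint.get((a, b), 0) + 1
        left[a] = left.get(a, 0) + 1
        right[b] = right.get(b, 0) + 1
    m = 0.0
    for (a, b), c in joint.items():
        p_ab = c / n
        # M(d) = sum_{a,b} P_ab(d) * log2( P_ab(d) / (P_a * P_b) )
        m += p_ab * math.log2(p_ab / ((left[a] / n) * (right[b] / n)))
    return m

def correlation(seq, d):
    """Plug-in estimate of Gamma(d) = <s_i s_{i+d}> - <s_i><s_{i+d}>
    for a sequence of numeric symbol values."""
    n = len(seq) - d
    x, y = seq[:n], seq[d:d + n]
    mx, my = sum(x) / n, sum(y) / n
    return sum(xi * yi for xi, yi in zip(x, y)) / n - mx * my
```

For a period-2 binary sequence such as 0101…, symbols a distance 1 apart determine each other, so M(1) is close to 1 bit while Γ(1) is negative; the plug-in M(d) is always non-negative, which is one face of the finite-size effect on random sequences.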

Key words

Mutual information function, correlation functions, linear and general dependence, symbolic noise