Mutual Information Functions Versus Correlation Functions in Binary Sequences
Mutual information is a well-known concept in information theory. Recently it has been suggested that it can be used in the study of chaotic dynamical systems and for the characterization of complex spatial patterns. Although mutual information has been shown to be a better quantity than the correlation function for determining the time delay of the delayed signal when reconstructing the phase space of a chaotic trajectory, there has been no attempt to systematically compare it with the more frequently used correlation functions. Binary sequences provide an excellent example for this comparative study. Some results are included here; for more details see Ref.
- 2. Rob Shaw, "The dripping faucet as a model chaotic system" (Aerial Press, 1984); and unpublished ideas.
- 3. Gregory J. Chaitin, "Toward a mathematical definition of 'life'", in The Maximum Entropy Formalism, ed. Levine and Tribus (MIT Press, 1979).
- 5. Wentian Li, "Mutual information versus correlation functions" (CCSR Tech Report, CCSR, Univ. of Illinois, 1989).
- 6. Wentian Li, Problems in Complex Systems (Ph.D. thesis, Columbia University, 1989); "Context-free languages can give 1/f spectra" (CCSR Tech Report No. 10, CCSR, Univ. of Illinois, 1988).
- 7. Theory and Application of Cellular Automata, ed. Stephen Wolfram (World Scientific, 1986).
- 8. Chris Langton, Norman Packard, Wentian Li, "Bifurcation-like phenomena in cellular automata rule space" (paper in preparation, 1989).
- 9. Wentian Li, "Correlation analysis of JFK's inaugural speech" (work in progress).