Optimal Information Measures for Weakly Chaotic Dynamical Systems
The study of weakly chaotic dynamical systems suggests that an important indicator for their classification is the quantity of information needed to describe their orbits. This information can be measured using suitable compression algorithms. An algorithm is "optimal" for this purpose if it compresses zero-entropy strings very efficiently. We discuss a definition of optimality in this sense and show that the set of optimal algorithms is not empty by exhibiting a concrete example. We prove that algorithms which are optimal according to this definition are suitable for measuring the information needed to describe the orbits of the Manneville maps: in these examples, the information content measured by such algorithms has the same asymptotic behavior as the algorithmic information content.
Keywords: Information Content, Compression Ratio, Generalized Entropy, Compression Algorithm, Asymptotic Optimality
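The abstract states that the information content of an orbit can be measured with a compression algorithm. As an illustrative sketch only (zlib is a general-purpose LZ77-based compressor, not necessarily "optimal" in the paper's sense for zero-entropy strings), the following Python snippet generates a symbolic orbit of a Manneville-type map x → (x + x^z) mod 1 and uses the compressed length as a rough information estimate; the partition threshold 0.5 and the parameters `x0` and `z` are assumptions chosen for illustration.

```python
import zlib

def manneville_orbit(x0, z, n):
    """Iterate the Manneville-type map x -> (x + x**z) mod 1 and return
    a binary symbolic orbit: '0' if x < 0.5, else '1'.
    The threshold 0.5 is an illustrative partition choice."""
    x, symbols = x0, []
    for _ in range(n):
        symbols.append('0' if x < 0.5 else '1')
        x = (x + x ** z) % 1.0
    return ''.join(symbols)

def compressed_length_bits(s):
    """Estimate the information content of a symbol string as the length
    in bits of its zlib-compressed form; this is only an upper-bound
    proxy for the algorithmic information content."""
    return 8 * len(zlib.compress(s.encode('ascii'), 9))

if __name__ == '__main__':
    # z > 2 gives the intermittent (weakly chaotic) regime with long
    # laminar phases near the fixed point at 0.
    orbit = manneville_orbit(x0=0.1, z=2.5, n=10_000)
    print(compressed_length_bits(orbit), 'bits for', len(orbit), 'symbols')
```

A nearly constant string (long laminar phase) compresses far below its raw length, which is the qualitative behavior that distinguishes zero-entropy orbits.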