
Optimal Information Measures for Weakly Chaotic Dynamical Systems

  • V. Benci
  • S. Galatolo
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4123)

Abstract

The study of weakly chaotic dynamical systems suggests that an important indicator for their classification is the quantity of information needed to describe their orbits. This information can be measured using suitable compression algorithms. An algorithm is “optimal” for this purpose if it compresses zero-entropy strings very efficiently. We discuss a definition of optimality in this sense and show that the set of optimal algorithms is not empty by exhibiting a concrete example. We then prove that algorithms which are optimal according to this definition are suitable for measuring the information needed to describe the orbits of the Manneville maps: in these examples, the information content measured by such algorithms has the same asymptotic behavior as the algorithmic information content.
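To make the measurement procedure concrete, the following is a minimal sketch (not the authors' construction) of how a compression algorithm can be used to estimate the information content of a symbolic orbit. It assumes the Manneville-Pomeau map T(x) = x + x^z (mod 1), a binary coding with the partition at x = 1/2, and zlib (an LZ77-based compressor) as a stand-in compressor; LZ-type algorithms are not necessarily “optimal” in the sense defined in the paper, so this only illustrates the general recipe of coding an orbit and reporting its compressed length.

```python
# Minimal sketch (illustration only, not the authors' construction):
# estimate the information content of a symbolic orbit of the
# Manneville-Pomeau map T(x) = x + x**z (mod 1) by compressing it.
# Assumptions: binary coding with partition at x = 1/2, and zlib
# (LZ77-based) as the compressor; LZ-type compressors are not
# necessarily "optimal" in the paper's sense for zero-entropy strings.
import zlib


def manneville_orbit_symbols(x0: float, z: float, n: int) -> bytes:
    """Return the first n symbols of the binary coding of the orbit of x0."""
    symbols = bytearray()
    x = x0
    for _ in range(n):
        symbols.append(ord('0') if x < 0.5 else ord('1'))
        x = (x + x ** z) % 1.0
    return bytes(symbols)


def compression_information(s: bytes) -> int:
    """Empirical information content: length in bits of the compressed string."""
    return 8 * len(zlib.compress(s, 9))


if __name__ == "__main__":
    for n in (10_000, 100_000, 1_000_000):
        s = manneville_orbit_symbols(x0=0.3, z=2.5, n=n)
        info = compression_information(s)
        print(f"n = {n:>9}: compressed size = {info} bits, ratio = {info / n:.4f}")
```

In the intermittent regime the orbit spends long stretches in the laminar region near the fixed point, producing long runs of a single symbol, so the compression ratio should decay with n; how fast it decays, and whether a given compressor tracks the algorithmic information content asymptotically, is exactly the question of optimality discussed in the paper.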

Keywords

Information Content · Compression Ratio · Generalized Entropy · Compression Algorithm · Asymptotic Optimality



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

