Metric Permutation Entropy

  • José María Amigó
Part of the Springer Series in Synergetics book series (SSSYN)


The word “entropy” was coined by the German physicist R. Clausius (1822–1888), who introduced it into thermodynamics in 1865 to measure the amount of energy in a system that cannot be converted into work. The fact that the entropy of an isolated system never decreases constitutes the second law of thermodynamics and clearly shows the central role of entropy in many-particle physics. The direction of time is then explained as a consequence of the increase of entropy in all irreversible processes. Later on, the concept of entropy was given a microscopic interpretation in the foundational works of L. Boltzmann (1844–1906) on gas kinetics and statistical mechanics [184]. Boltzmann’s celebrated equation reads, in the usual physical notation, S = k_B log W, where k_B is Boltzmann’s constant and W is the number of microstates compatible with the macroscopic state of the system.
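As a minimal numeric illustration of Boltzmann’s formula S = k_B log W (a sketch, not taken from the chapter; the function name and the example system are my own), one can check that a system of N independent two-state particles, which has W = 2^N equally likely microstates, has entropy growing linearly in N:

```python
import math

# Boltzmann's constant in J/K (exact value in the 2019 SI definition)
K_B = 1.380649e-23

def boltzmann_entropy(microstates: int) -> float:
    """Entropy S = k_B * ln(W) for W equally likely microstates."""
    return K_B * math.log(microstates)

# N independent two-state particles have W = 2**N microstates,
# hence S = N * k_B * ln(2): entropy is extensive in system size.
n_particles = 100
entropy = boltzmann_entropy(2 ** n_particles)
```

A single microstate (W = 1) gives S = 0, consistent with entropy vanishing for a perfectly ordered system.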





References

  6. J.M. Amigó, J. Szczepanski, E. Wajnryb, and M.V. Sanchez-Vives, Estimating the entropy of spike trains via Lempel-Ziv complexity, Neural Computation 16 (2004) 717–736.
  28. C. Bandt and B. Pompe, Permutation entropy: A natural complexity measure for time series, Physical Review Letters 88 (2002) 174102.
  29. C. Bandt, G. Keller, and B. Pompe, Entropy of interval maps via permutations, Nonlinearity 15 (2002) 1595–1602.
  40. M. Buhl and M.B. Kennel, Statistically relaxing to generating partitions for observed time-series data, Physical Review E 71 (2005) 046213.
  45. Y. Cao, W. Tung, J.B. Gao, V.A. Protopopescu, and L.M. Hively, Detecting dynamical changes in time series using the permutation entropy, Physical Review E 70 (2004) 046217.
  52. G.H. Choe, Computational Ergodic Theory. Springer-Verlag, Berlin, 2005.
  59. T.M. Cover and J.A. Thomas, Elements of Information Theory, 2nd edition. John Wiley & Sons, New York, 2006.
  64. K. Denbigh, How subjective is entropy? In: H.S. Leff and A.F. Rex (Eds.), Maxwell’s Demon: Entropy, Information, Computing, pp. 109–115. Princeton University Press, Princeton, 1990.
  89. R.M. Gray, Entropy and Information Theory. Springer-Verlag, New York, 1990.
  114. A. Katok and B. Hasselblatt, Introduction to the Modern Theory of Dynamical Systems. Cambridge University Press, Cambridge, 1998.
  121. M.B. Kennel and A.I. Mees, Context-tree modeling of observed symbolic dynamics, Physical Review E 66 (2002) 056209.
  122. M.B. Kennel, J. Shlens, H.D.I. Abarbanel, and E.J. Chichilnisky, Estimating entropy rates with Bayesian confidence intervals, Neural Computation 17 (2005) 1531–1576.
  126. A.N. Kolmogorov, Entropy per unit time as a metric invariant of automorphisms, Doklady of Russian Academy of Sciences 124 (1959) 754–755.
  127. I. Kontoyiannis, P.H. Algoet, Y.M. Suhov, and A.J. Wyner, Nonparametric entropy estimation for stationary processes and random fields, with applications to English text, IEEE Transactions on Information Theory 44 (1998) 1319–1327.
  137. A. Lempel and J. Ziv, On the complexity of finite sequences, IEEE Transactions on Information Theory IT-22 (1976) 75–81.
  167. L. Paninski, Estimation of entropy and mutual information, Neural Computation 15 (2003) 1191–1253.
  184. R. Sexl and J. Blackmore (Eds.), Ludwig Boltzmann - Ausgewählte Abhandlungen (Ludwig Boltzmann Gesamtausgabe, Band 8). Vieweg, Braunschweig, 1982.
  186. C.E. Shannon, A mathematical theory of communication, Bell System Technical Journal 27 (1948) 379–423, 623–656.
  189. Y.G. Sinai, On the notion of entropy of a dynamical system, Doklady of Russian Academy of Sciences 124 (1959) 768–771.
  195. S.P. Strong, R. Koberle, R.R. de Ruyter van Steveninck, and W. Bialek, Entropy and information in neural spike trains, Physical Review Letters 80 (1998) 197–200.
  211. J. Ziv and A. Lempel, Compression of individual sequences via variable-rate coding, IEEE Transactions on Information Theory IT-24 (1978) 530–536.

Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  1. Centro de Investigación Operativa, Universidad Miguel Hernández, Elche, Spain
