Dynamical complexity of short and noisy time series

Compression-Complexity vs. Shannon entropy
Regular Article

DOI: 10.1140/epjst/e2016-60397-x

Cite this article as:
Nagaraj, N. & Balasubramanian, K. Eur. Phys. J. Spec. Top. (2017). doi:10.1140/epjst/e2016-60397-x
Part of the following topical collections:
  1. Aspects of Statistical Mechanics and Dynamical Complexity

Abstract

Shannon entropy has been extensively used for characterizing the complexity of time series arising from chaotic dynamical systems and stochastic processes such as Markov chains. However, for short and noisy time series, Shannon entropy performs poorly. Complexity measures based on lossless compression algorithms are a good substitute in such scenarios. We evaluate the performance of two such Compression-Complexity Measures, namely Lempel-Ziv complexity (LZ) and Effort-To-Compress (ETC), on short time series from chaotic dynamical systems in the presence of noise. Both LZ and ETC outperform Shannon entropy (H) in accurately characterizing the dynamical complexity of such systems. For very short binary sequences (which arise in neuroscience applications), ETC yields a higher number of distinct complexity values than LZ and H, thus enabling a finer resolution. For two-state ergodic Markov chains, we empirically show that ETC converges to a steady-state value faster than LZ. Compression-Complexity measures are promising for applications which involve short and noisy time series.
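The three measures compared in the abstract can be illustrated on a short symbolic sequence. The sketch below is not the authors' reference implementation: it uses the standard LZ76 parsing rule for Lempel-Ziv complexity, the Shannon entropy of the empirical symbol distribution for H, and renders ETC as the number of Non-Sequential Recursive Pair Substitution (NSRPS) iterations needed to reduce the sequence to a constant or single-symbol string, which is a simplified reading of the Effort-To-Compress idea.

```python
import math
from collections import Counter

def lz_complexity(s):
    """LZ76 complexity: number of phrases in an exhaustive-history parse
    of the string s (each new phrase is the shortest prefix of the
    remaining data not reproducible from what came before)."""
    n, phrases, i = len(s), 0, 0
    while i < n:
        k = 1
        # Extend the candidate phrase while it still occurs earlier
        # (overlap with the phrase itself, except its last symbol, allowed).
        while i + k <= n and s[i:i + k] in s[:i + k - 1]:
            k += 1
        phrases += 1
        i += k
    return phrases

def shannon_entropy(seq):
    """Shannon entropy (bits/symbol) of the empirical symbol distribution."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def etc_nsrps(seq):
    """ETC sketch via NSRPS: repeatedly replace the most frequent adjacent
    pair with a fresh symbol; return the number of substitution steps
    until the sequence is constant or of length 1. Input: list of ints."""
    seq = list(seq)
    steps = 0
    while len(seq) > 1 and len(set(seq)) > 1:
        pair_counts = Counter(zip(seq, seq[1:]))
        best = max(pair_counts, key=pair_counts.get)
        fresh = max(seq) + 1          # new symbol not yet in the sequence
        out, i = [], 0
        while i < len(seq):           # non-overlapping left-to-right pass
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == best:
                out.append(fresh)
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq = out
        steps += 1
    return steps
```

For example, `lz_complexity("0001")` parses the sequence as `0|001` and returns 2, while `etc_nsrps([0, 1, 0, 1])` collapses `(0,1)` pairs in a single step. On very short sequences like these, H takes few distinct values, which is the resolution limitation the abstract refers to.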

Copyright information

© EDP Sciences and Springer 2017

Authors and Affiliations

  1. Consciousness Studies Programme, National Institute of Advanced Studies, Indian Institute of Science Campus, Bengaluru, India
  2. Department of Electronics and Communication Engineering, Amrita School of Engineering, Amrita Vishwa Vidyapeetham, Amrita University, Coimbatore, India