
Journal of Systems Science and Complexity, Volume 25, Issue 1, pp 30–45

Predictive models and generative complexity

  • Wolfgang Löhr

Abstract

The causal states of computational mechanics define the minimal sufficient memory for a given discrete stationary stochastic process. Their entropy is an important complexity measure called statistical complexity (or true measure complexity). The causal states induce the ɛ-machine, a hidden Markov model (HMM) that generates the process. However, the ɛ-machine is in general not the minimal generating HMM, even though generative HMMs also admit a natural predictive interpretation. This paper gives a mathematical proof of the claim that the ɛ-machine is the minimal HMM satisfying an additional (partial) determinism condition. In analogy to statistical complexity, the minimal internal-state entropy of a generative HMM is called generative complexity. The paper further shows that generative complexity is a well-behaved function of the process: it is lower semi-continuous (with respect to the weak-* topology), concave, and behaves well under ergodic decomposition of the process.
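
For orientation, the key quantities can be sketched in standard computational-mechanics notation (the symbols below follow common conventions in the field and are illustrative, not necessarily the paper's own):

    % Causal-state equivalence: two pasts are identified iff they predict the same future
    \overleftarrow{x} \sim_\epsilon \overleftarrow{x}'
        \iff  P(\overrightarrow{X} \mid \overleftarrow{X}=\overleftarrow{x}) = P(\overrightarrow{X} \mid \overleftarrow{X}=\overleftarrow{x}')

    % Statistical complexity: entropy of the causal-state distribution
    C_\mu = H[\epsilon(\overleftarrow{X})]

    % Generative complexity: minimal internal-state entropy over all HMMs generating the process
    C_{\mathrm{gen}} = \inf\{\, H[S_0] : (S_t, X_t)_t \text{ is a stationary HMM generating the process} \,\}

Since the ɛ-machine is itself a generative HMM with internal-state entropy C_\mu, one always has C_{\mathrm{gen}} \le C_\mu; the minimality result of the paper concerns the restricted class of (partially) deterministic generative HMMs, among which the ɛ-machine is optimal.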

Key words

Causal states, complexity, ɛ-machine, generative complexity, HMM, partially deterministic HMM, predictive model, statistical complexity



Copyright information

© Institute of Systems Science, Academy of Mathematics and Systems Science, CAS and Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Wolfgang Löhr, Max Planck Institute for Mathematics in the Sciences, Leipzig, Germany
