Part of the book series: Springer Theses

Abstract

Information storage is considered an important aspect of the dynamics of many natural and man-made processes, for example in human brain networks [1] and artificial neural networks [2], in synchronisation between coupled systems [3], in coordinated motion in modular robots [4], and in the dynamics of inter-event distribution times [5]. However, the term is still often used loosely or in a qualitative sense, and we do not yet have a good understanding of how information storage interacts with information transfer and modification to give rise to distributed computation.


Notes

  1.

    The methods and results in this chapter were first reported in [6, 7].

  2.

    Lindgren and Nordahl [13] also measured excess entropy (referred to as effective measure complexity) for some ECAs. However, they measured spatial excess entropies, and we note that only temporal excess entropies are interpretable as information storage from our perspective of distributed computation.

  3.

    This follows Shalizi’s original formulation of the local excess entropy in [14]; however, our presentation is for a single time series rather than the light-cone formulation used there.

  4.

    Certainly the formulation of entropy-rate overestimates in Eq. (2.18) could also be used to form alternative localisations \(e_X^{\prime }\) directly, and in the limit \(k \rightarrow \infty \) their averages \(E_X\) will be the same. However, only the local formulation from the predictive information captures the total information stored at a particular point in time, which is our quantity of interest here.

  5.

    These equations are correct not only in the limit \(k \rightarrow \infty \) but for estimates with any value of \(k \ge 1\).

  6.

    The distinction between causal effect and the perspective of computation is explored in Chap. 4.

  7.

    Note that our sum and the plot start from \(k=0\), unlike the expressions and plots in [15] which start from \(L=1\). The difference is that we have adopted \(k=L-1\) (as per footnote 4 on p. xxx) to keep a focus on the number of steps \(k\) in the past history, which is important for our computational view.

  8.

    Sub-figures (e)–(i) of these figures will be discussed in later chapters.

  9.

    The exceptions are where long temporal chains of 0’s occur, disturbing the memory of the phase due to finite-\(k\) effects.

  10.

    Recall, though, from Sect. 3.2.2 that \(h_\mu (i,n,k)\) itself is never negative.

  11.

    As described in footnote 4 on p. xxx, while there are alternative formulations of the local excess entropy which can be computed from past observations alone, they cannot be interpreted as the total information storage at the given time point. A similar concept would be the partial localisation (see Appendix A) \(I(x^{(k)}_n ;X^{(k^+)})\), which quantifies how much information from the past is likely to be used in the future.
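The local excess entropy discussed in the notes above can be illustrated numerically. The following is a minimal plug-in sketch in Python (an illustration only, not the estimator used in the thesis; the function name and the period-2 example are our own), computing the local predictive information \(e(n)\) with finite past and future windows of length \(k\):

```python
from collections import Counter
from math import log2

def local_predictive_info(series, k):
    """Plug-in estimate of the local excess entropy (local predictive
    information), e(n) = log2[ p(past, future) / (p(past) p(future)) ],
    using length-k past and future windows over a symbolic time series."""
    pasts, futures, joints = Counter(), Counter(), Counter()
    positions = range(k, len(series) - k + 1)
    for t in positions:
        past = tuple(series[t - k:t])    # the k values before t
        future = tuple(series[t:t + k])  # the k values from t onward
        pasts[past] += 1
        futures[future] += 1
        joints[(past, future)] += 1
    n_obs = len(positions)
    local_vals = []
    for t in positions:
        past = tuple(series[t - k:t])
        future = tuple(series[t:t + k])
        local_vals.append(log2(joints[(past, future)] * n_obs
                               / (pasts[past] * futures[future])))
    return local_vals

# Period-2 process: the past determines the future exactly, so each
# local value approaches the 1 bit of phase information stored.
vals = local_predictive_info([0, 1] * 50, k=2)
```

For the period-2 process the stored information is the single bit of phase, so the local values cluster near 1 bit; in general, local values can be negative at misinformative observations, unlike the averaged excess entropy.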
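The partial localisation \(I(x^{(k)}_n ;X^{(k^+)})\) mentioned in the final note can be sketched in the same plug-in style. In this illustration (our own simplification: the future window is truncated to length \(k\), whereas \(X^{(k^+)}\) is properly semi-infinite), the quantity is the KL divergence between the future distribution conditioned on the specific observed past and the unconditioned future distribution:

```python
from collections import Counter
from math import log2

def partial_localisation(series, k, n):
    """Plug-in sketch of I(x_n^(k); X^(k+)): the mutual information
    between the specific past state observed at time n and the ensemble
    of futures, computed as KL( p(future | that past) || p(future) ).
    The future window is truncated to length k for simplicity."""
    joints, futures = Counter(), Counter()
    positions = range(k, len(series) - k + 1)
    for t in positions:
        past = tuple(series[t - k:t])
        future = tuple(series[t:t + k])
        joints[(past, future)] += 1
        futures[future] += 1
    n_obs = len(positions)
    observed_past = tuple(series[n - k:n])
    n_past = sum(c for (p, _), c in joints.items() if p == observed_past)
    # KL divergence over the futures seen to follow the observed past
    return sum((c / n_past) * log2((c / n_past) / (futures[f] / n_obs))
               for (p, f), c in joints.items() if p == observed_past)
```

Because it is a KL divergence, this quantity is never negative, in contrast to the local excess entropy, which can be negative at misinformative points.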

References

  1. M.G. Kitzbichler, M.L. Smith, S.R. Christensen, E. Bullmore, Broadband criticality of human brain network synchronization. PLoS Comput. Biol. 5(3), e1000314 (2009)

  2. J. Boedecker, O. Obst, N.M. Mayer, M. Asada, Initialization and self-organized optimization of recurrent neural network connectivity. HFSP J. 3(5), 340–349 (2009)

  3. R. Morgado, M. Cieśla, L. Longa, F.A. Oliveira, Synchronization in the presence of memory. Europhys. Lett. 79(1), 10002 (2007)

  4. M. Prokopenko, V. Gerasimov, I. Tanev, Evolving spatiotemporal coordination in a modular robotic system, in Proceedings of the Ninth International Conference on the Simulation of Adaptive Behavior (SAB’06), Rome, ed. by S. Nolfi, G. Baldassarre, R. Calabretta, J. Hallam, D. Marocco, J.-A. Meyer, D. Parisi. Lecture Notes in Artificial Intelligence, vol. 4095 (Springer, Berlin, 2006), pp. 548–559

  5. K.I. Goh, A.L. Barabási, Burstiness and memory in complex systems. Europhys. Lett. 81(4), 48002 (2008)

  6. J.T. Lizier, M. Prokopenko, A.Y. Zomaya, Detecting non-trivial computation in complex dynamics, in Proceedings of the 9th European Conference on Artificial Life (ECAL 2007), Lisbon, Portugal, ed. by F. Almeida e Costa, L.M. Rocha, E. Costa, I. Harvey, A. Coutinho. Lecture Notes in Artificial Intelligence, vol. 4648 (Springer, Berlin, 2007), pp. 895–904

  7. J.T. Lizier, M. Prokopenko, A.Y. Zomaya, Local measures of information storage in complex distributed computation. Inf. Sci. 208, 39–54 (2012)

  8. C.G. Langton, Computation at the edge of chaos: phase transitions and emergent computation. Physica D 42(1–3), 12–37 (1990)

  9. A.S. Klyubin, D. Polani, C.L. Nehaniv, Tracking information flow through the environment: simple cases of stigmergy, in Proceedings of the Ninth International Conference on the Simulation and Synthesis of Living Systems (ALife IX), Boston, ed. by J. Pollack, M. Bedau, P. Husbands, T. Ikegami, R.A. Watson (MIT Press, Cambridge, 2004), pp. 563–568

  10. I. Couzin, R. James, D. Croft, J. Krause, Social organization and information transfer in schooling fishes, in Fish Cognition and Behavior, ser. Fish and Aquatic Resources, ed. by B.C.K. Laland, J. Krause (Blackwell Publishing, Boston, 2006), pp. 166–185

  11. P. Grassberger, Toward a quantitative theory of self-generated complexity. Int. J. Theor. Phys. 25(9), 907–938 (1986)

  12. P. Grassberger, Long-range effects in an elementary cellular automaton. J. Stat. Phys. 45(1–2), 27–39 (1986)

  13. K. Lindgren, M.G. Nordahl, Complexity measures and cellular automata. Complex Syst. 2(4), 409–440 (1988)

  14. C.R. Shalizi, Causal architecture, complexity and self-organization in time series and cellular automata. Ph.D. Dissertation, University of Wisconsin-Madison, 2001

  15. J.P. Crutchfield, D.P. Feldman, Regularities unseen, randomness observed: levels of entropy convergence. Chaos 13(1), 25–54 (2003)

  16. M. Wójtowicz, Java Cellebration v. 1.50, Online software (2002), http://psoup.math.wisc.edu/mcell/mjcell/mjcell.html

  17. N. Ay, N. Bertschinger, R. Der, F. Güttler, E. Olbrich, Predictive information and explorative behavior of autonomous robots. Eur. Phys. J. B 63(3), 329–339 (2008)

  18. M. Lungarella, T. Pegors, D. Bulwinkle, O. Sporns, Methods for quantifying the informational structure of sensory and motor data. Neuroinformatics 3(3), 243–262 (2005)

  19. T.M. Cover, J.A. Thomas, Elements of Information Theory (Wiley, New York, 1991)

  20. K. Marton, P.C. Shields, Entropy and the consistent estimation of joint distributions. Ann. Probab. 22(2), 960–977 (1994)

  21. J.T. Lizier, M. Prokopenko, A.Y. Zomaya, Information modification and particle collisions in distributed computation. Chaos 20(3), 037109 (2010)

  22. W. Hordijk, C.R. Shalizi, J.P. Crutchfield, Upper bound on the products of particle interactions in cellular automata. Physica D 154(3–4), 240–258 (2001)

  23. D.P. Feldman, J.P. Crutchfield, Synchronizing to periodicity: the transient information and synchronization time of periodic sequences. Adv. Complex Syst. 7(3–4), 329–355 (2004)

  24. J.E. Hanson, J.P. Crutchfield, Computational mechanics of cellular automata: an example. Physica D 103(1–4), 169–189 (1997)

  25. J.E. Hanson, J.P. Crutchfield, The attractor-basin portrait of a cellular automaton. J. Stat. Phys. 66, 1415–1462 (1992)

  26. C.R. Shalizi, R. Haslinger, J.-B. Rouquier, K.L. Klinkner, C. Moore, Automatic filters for the detection of coherent structure in spatiotemporal systems. Phys. Rev. E 73(3), 036104 (2006)

  27. J.T. Lizier, M. Prokopenko, A.Y. Zomaya, Coherent information structure in complex computation. Theory Biosci. 131(3), 193–203 (2012), doi:10.1007/s12064-011-0145-9


Corresponding author

Correspondence to Joseph T. Lizier.


Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Lizier, J.T. (2013). Information Storage. In: The Local Information Dynamics of Distributed Computation in Complex Systems. Springer Theses. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-32952-4_3

  • DOI: https://doi.org/10.1007/978-3-642-32952-4_3

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-32951-7

  • Online ISBN: 978-3-642-32952-4
