Measuring the Dynamics of Information Processing on a Local Scale in Time and Space

Part of the Understanding Complex Systems book series (UCS)

Abstract

Studies of how information is processed in natural systems, in particular in nervous systems, are rapidly gaining attention. Less well known, however, is that the local dynamics of such information processing in space and time can be measured. In this chapter, we review the mathematics of how to measure local entropy and mutual information values at specific observations of time-series processes. We then review how these techniques are used to construct measures of local information storage and transfer within a distributed system, and we describe how these measures can reveal far more intricate details about the dynamics of complex systems than the more well-known "average" measures do. This is done by examining their application to cellular automata, a classic complex system, where these local information profiles have provided quantitative evidence for long-held conjectures regarding the information transfer and processing roles of gliders and glider collisions. Finally, we outline the prospects for broad application of these local measures of information processing in computational neuroscience.
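
To make the approach described above concrete, the sketch below computes local (pointwise) mutual information and local transfer entropy for short discrete time series using simple plug-in (frequency-count) estimators. This is a minimal illustration under stated assumptions, not the authors' own implementation: the function names, the binary toy data, and the choice of plug-in estimation are choices made here for demonstration. The local values are the pointwise log-ratios whose averages over all observations recover the familiar mutual information and transfer entropy.

```python
"""Minimal sketch of local information measures for discrete time series,
using plug-in (frequency-count) probability estimates. Illustrative only."""
from collections import Counter
import math


def local_mutual_information(x, y):
    """Local MI per sample: i(x_n; y_n) = log2 p(x_n, y_n) / (p(x_n) p(y_n))."""
    n = len(x)
    p_x = Counter(x)
    p_y = Counter(y)
    p_xy = Counter(zip(x, y))
    return [math.log2((p_xy[(a, b)] / n) / ((p_x[a] / n) * (p_y[b] / n)))
            for a, b in zip(x, y)]


def local_transfer_entropy(source, dest, k=1):
    """Local TE from source to dest with destination history length k:
    t_n = log2 p(x_{n+1} | x_n^{(k)}, y_n) / p(x_{n+1} | x_n^{(k)})."""
    joint = Counter()      # counts of (next, past, src)
    cond = Counter()       # counts of (past, src)
    past_next = Counter()  # counts of (next, past)
    past = Counter()       # counts of (past,)
    samples = []
    for n in range(k, len(dest)):
        h = tuple(dest[n - k:n])   # destination history x_n^{(k)}
        s = source[n - 1]          # source value preceding the prediction
        x_next = dest[n]
        samples.append((x_next, h, s))
        joint[(x_next, h, s)] += 1
        cond[(h, s)] += 1
        past_next[(x_next, h)] += 1
        past[h] += 1
    # Pointwise log-ratio of the two conditional probabilities for each sample.
    return [math.log2((joint[obs] / cond[obs[1:]]) /
                      (past_next[obs[:2]] / past[obs[1]]))
            for obs in samples]


if __name__ == "__main__":
    # Toy example: dest copies source with a one-step delay, so the source
    # strongly predicts the destination's next value and local TE is positive.
    src = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1]
    dst = [0] + src[:-1]
    print(local_transfer_entropy(src, dst, k=1))
```

Averaging the returned lists over all samples gives the corresponding average measure; the per-sample values are what allow the dynamics to be profiled locally in time (and, for spatially extended systems such as cellular automata, in space).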

Copyright information

© Springer-Verlag Berlin Heidelberg 2014

Authors and Affiliations

  1. CSIRO Computational Informatics, Marsfield, Australia
