Journal of Statistical Physics, Volume 161, Issue 2, pp 404–451

Computational Mechanics of Input–Output Processes: Structured Transformations and the \(\epsilon\)-Transducer

Article

Abstract

Computational mechanics quantifies structure in a stochastic process via its causal states, leading to the process's minimal, optimal predictor: the \(\epsilon\)-machine. We extend computational mechanics to communication channels coupling two processes, obtaining an analogous optimal model of the stochastic mapping between them: the \(\epsilon\)-transducer. Here, we lay the foundation for a structural analysis of communication channels, treating joint processes and processes with input. The result is a principled account of the mechanisms that support information flow between processes. This is the first in a series on the structural information theory of memoryful channels, channel composition, and allied conditional information measures.
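
The central construction is the causal-state equivalence relation: histories are grouped into the same state whenever they induce identical conditional distributions over futures; for an \(\epsilon\)-transducer the histories are joint input–output pasts and the futures are outputs conditioned on inputs. The Python sketch below is only a rough, finite-history illustration of that grouping for a hypothetical unit-delay channel; the channel, the function names, and the fixed history length are assumptions made for exposition, not the paper's construction.

    # Minimal sketch (not the paper's algorithm): approximate the causal states of a
    # memoryful input-output channel by grouping finite joint histories that share
    # the same conditional distribution over the next output symbol. The unit-delay
    # channel, all names, and the history length are illustrative assumptions.

    from collections import defaultdict
    from itertools import product

    def next_output_dist(history):
        # P(next output | joint history) for a unit-delay channel: the next output
        # simply repeats the most recent input symbol, so only that symbol matters.
        last_input = history[-1][0]  # history is a tuple of (input, output) pairs
        return (1.0, 0.0) if last_input == 0 else (0.0, 1.0)

    def causal_state_partition(history_length=3):
        # Enumerate every binary joint history of the given length and merge those
        # with identical predictive distributions into one approximate causal state.
        states = defaultdict(list)
        for history in product(product((0, 1), repeat=2), repeat=history_length):
            states[next_output_dist(history)].append(history)
        return states

    if __name__ == "__main__":
        partition = causal_state_partition()
        # The delay channel needs only two causal states: last input 0 vs. last input 1.
        print(len(partition), "causal states over", 4 ** 3, "joint histories")

For a genuinely stochastic channel one would instead estimate the conditional output distributions from data and merge histories whose distributions agree, which is where the minimality and optimality properties of the \(\epsilon\)-transducer become relevant.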

Keywords

Sequential machine · Communication channel · Finite-state transducer · Statistical complexity · Causal state · Minimality · Optimal prediction · Subshift endomorphism

Copyright information

© Springer Science+Business Media New York 2015

Authors and Affiliations

  1. Complexity Sciences Center, University of California at Davis, Davis, USA
  2. Mathematics Department, University of California at Davis, Davis, USA
  3. Physics Department, University of California at Davis, Davis, USA
