Abstract

The technical concept of information developed after Shannon [22] has fueled advances in many fields, but its quantitative precision and its breadth of application have come at a cost. Its formal abstraction from issues of reference and significance has reduced its usefulness in fields such as biology, cognitive neuroscience, and the social sciences, where such issues are most relevant. I argue that explaining these nonintrinsic properties requires focusing on the physical properties of the information medium with respect to those of its physical context, and specifically on the relationship between the thermodynamic and information entropies of each. Reference is shown to be a function of the thermodynamic openness of the information medium: interactions between an informing medium and its physical context that drive the medium to a less probable state create intrinsic constraints that indirectly reflect the form of this extrinsic influence. This susceptibility of an informing medium to the effects of physical work is also relevant to assessing the significance or usefulness of information. Significance can be measured in terms of the work “saved” by access to information about contextual factors relevant to achieving a preferred target condition.
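As a point of reference for the entropy comparison invoked above (these are the standard textbook definitions, not notation introduced by the paper): Boltzmann’s thermodynamic entropy counts the microstates W compatible with a macrostate, while Shannon’s information entropy averages over the probabilities pᵢ of a source’s possible states.

$$S = k_B \ln W \qquad \text{(Boltzmann)}$$

$$H = -\sum_i p_i \log_2 p_i \qquad \text{(Shannon)}$$

$$p_i = 1/W \;\Rightarrow\; H = \log_2 W, \qquad S = (k_B \ln 2)\, H$$

In the equiprobable case the two quantities thus differ only by the constant k_B and the base of the logarithm; the abstract’s argument concerns how they come apart for physically open media.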

References

  1. Bateson, G.: Upside-Down Gods and Gregory Bateson’s World of Difference. Fordham University Press, New York (1968)
  2. Ben-Naim, A.: A Farewell to Entropy: Statistical Thermodynamics Based on Information. World Scientific Publishing Co., Singapore (2008)
  3. Boltzmann, L.: The second law of thermodynamics. Populäre Schriften, Essay 3; address to a formal meeting of the Imperial Academy of Science, 29 May 1886. Reprinted in: Brush, S.G. (trans.) Ludwig Boltzmann: Theoretical Physics and Philosophical Problems. Reidel, Boston (1974)
  4. Brentano, F.: Psychology from an Empirical Standpoint, pp. 88–89. Routledge & Kegan Paul, London (1874)
  5. Brillouin, L.: Science and Information Theory. Academic Press, New York (1962)
  6. Chaitin, G.: Algorithmic information theory. IBM J. Res. Dev. 21, 350–359, 496 (1977)
  7. Clausius, R.: The Mechanical Theory of Heat: With its Applications to the Steam Engine and to Physical Properties of Bodies. John van Voorst, London (1865)
  8. Deacon, T.: Shannon-Boltzmann-Darwin: redefining information. Part 1. Cogn. Semiot. 1, 123–148 (2007)
  9. Deacon, T.: Shannon-Boltzmann-Darwin: redefining information. Part 2. Cogn. Semiot. 2, 167–194 (2008)
  10. Deacon, T.: Incomplete Nature: How Mind Emerged from Matter. W. W. Norton & Co., New York (2012)
  11. Deacon, T., Koutroufinis, S.: Complexity and dynamical depth. Information 5, 404–423 (2014)
  12. Deacon, T., Srivastava, A., Bacigalupi, J.A.: The transition from constraint to regulation at the origin of life. Front. Biosci. 19, 945–957 (2014)
  13. Fano, R.: Quoted in: Aftab, Cheung, Kim, Thakkar, Yeddanapudi (compilers): Information Theory and the Digital Age. 6.933 Project History, Massachusetts Institute of Technology (2001). http://web.mit.edu/6.933/www/Fall2001/Shannon2.pdf
  14. Jaynes, E.T.: Information theory and statistical mechanics. Phys. Rev. 106, 620 (1957)
  15. Kline, S.J.: The Low-Down on Entropy and Interpretive Thermodynamics. DCW Industries, Lake Arrowhead (1999)
  16. Kolmogorov, A.N.: Three approaches to the quantitative definition of information. Probl. Inf. Transm. 1, 1–7 (1965)
  17. Landauer, R.: Irreversibility and heat generation in the computing process. IBM J. Res. Dev. 5, 183–191 (1961)
  18. Leff, H.S., Rex, A.F. (eds.): Maxwell’s Demon: Entropy, Information, Computing. Princeton University Press, Princeton, NJ (1990)
  19. Maxwell, J.C.: Theory of Heat. Longmans, Green and Co., London (1871)
  20. Mirowski, P.: Machine Dreams: Economics Becomes a Cyborg Science. Cambridge University Press, New York (2002)
  21. Seife, C.: Decoding the Universe: How the New Science of Information is Explaining Everything in the Cosmos, from Our Brains to Black Holes. Penguin Books, London (2007)
  22. Shannon, C.: A mathematical theory of communication. Bell Syst. Tech. J. 27, 379–423, 623–656 (1948)
  23. Szilard, L.: On the decrease of entropy in a thermodynamic system by the intervention of intelligent beings (1929). Reprinted in: The Collected Works of Leo Szilard: Scientific Papers, pp. 120–129. MIT Press, Cambridge (1972)
  24. Ter Haar, D.: Elements of Statistical Mechanics. Rinehart Press, Boulder (1954)
  25. Thims, L.: Thermodynamics ≠ information theory: science’s greatest Sokal affair. J. Hum. Thermodyn. 8(1), 1–120 (2012)
  26. Wicken, J.: Entropy and information: suggestions for a common language. Philos. Sci. 54, 176–193 (1987)

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. University of California, Berkeley, USA
