Abstract
Information, entropy, probability: these three notions are closely interconnected in the prevalent understanding of statistical mechanics, both when the field is taught to students at an introductory level and in advanced research into its foundations. This paper examines the interconnection between these three notions in light of recent research in the foundations of statistical mechanics. It disentangles these concepts and highlights their differences, while also explaining why they came to be so closely linked in the literature. There, the term ‘information’ is typically tied to entropy and probability in discussions of Maxwell’s Demon and its attempted exorcism by the Landauer-Bennett thesis, and in analyses of the spin echo experiments. The present paper takes a different direction: it discusses the statistical mechanical underpinning of the notions of probability and entropy, and this constructive approach shows that information plays no fundamental role in these concepts, although it can be conveniently used in a sense that we shall specify.
Notes
The paper has a second part, see Jaynes (1957b).
In recent decades a substantial amount of work has been done in the field of the foundations of statistical mechanics, in which the soundness of (alleged) theorems of mechanics has been re-examined: examples of such re-examined theorems include the Landauer-Bennett thesis (see Hemmo and Shenker 2013), Maxwell’s Demon (see Hemmo and Shenker 2012, 2016), and, more generally, various correlates of the second law of thermodynamics, such as Lanford’s theorem (see Uffink and Valente 2010, 2015). The task of this foundational work was, and still is, to examine whether these alleged theorems can be derived from the first principles of mechanics, or whether – in order to establish them – one needs to make additional assumptions, and if so, what exactly those are (for the distinction between what can be derived from mechanics and what requires auxiliary hypotheses see Shenker 2017a, b). These efforts require an analysis of the subject matter of statistical mechanics and thermodynamics, e.g. a clarification of what exactly the second law of thermodynamics says (see Uffink 2001, 2007; Frigg 2008). As part of this foundational work both mainstream and non-mainstream approaches have been put forward and are constantly being studied and debated. For overviews of mainstream ideas about the foundations of statistical mechanics see, e.g., Sklar (1993), Albert (2000), Uffink (2007), Frigg (2008).
For a detailed overview of versions of the second law see Uffink (2001). What is the second law about? It is often said that the second law is about the degree to which energy is exploitable to produce work, and that this degree is quantified by the term ‘entropy’. However, it should be noted that this is not a matter of mere definition: in thermodynamics the term ‘entropy’ can be associated with the degree of un-exploitability of energy only if the second law is already assumed to be valid. See Hemmo and Shenker 2012 Ch. 1.
In circumstances where fluctuations are observed, the probabilities account for them as well.
We agree with Ladyman and Ross (2007) and Wallace (2001) that it is a serious mistake to carry out metaphysical investigation assuming that the world is as classical mechanics describes it, when physics tells us that this is clearly not the case. The use of classical terminology and laws is legitimate only if they preserve essential features of the phenomena and fundamental facts being addressed. On the differences between classical and quantum statistical mechanics see Shenker 2018.
The idea that the microstates need to be partitioned into sets appears in the two major approaches to statistical mechanics, namely the Boltzmannian one and the Gibbsian one, albeit in different ways and with different aims. In the former it was initially part of Boltzmann’s combinatorial argument (see Uffink 2014) and in the latter it forms part of Gibbs’s coarse-graining argument (see Sklar 1993; Uffink 2007; Frigg 2008). The two major approaches to the foundations of statistical mechanics are notoriously problematic: Boltzmann’s approach is not dynamical and is empirically inadequate as far as the approach to equilibrium is concerned (see Hemmo and Shenker 2012 Sec. 7.10, 2015b; Allori 2013, 2015). On the well-known conceptual problems with Gibbs’s approach see Callender (1999). On how to interpret Gibbs’s approach in order to make it conceptually reasonable, see Hemmo and Shenker 2012 Ch. 11. The approach described in this paper is neither completely Boltzmannian nor completely Gibbsian, but takes the good ideas from each of them; and both can be understood as approximations in the present framework, along the lines described in Hemmo and Shenker 2012 Ch. 11.
Such an aspect may be understood as a ‘property’ of the microstate of the gas, or of the gas, at that moment; we shall not address this notion here. See the overview of the notion of ‘property’ in Orilia and Swoyer (2016).
In other cases of partial description, the labels may play a role, for example when the partial description means focusing on the properties of only some of the particles and not others. I am grateful to an anonymous referee for EJPS for bringing this interesting point to my attention.
For more on this notion see Shenker 2017a.
This set resembles a Poincaré section, typically used to describe quasi-periodic systems.
On these shortcuts see Hemmo and Shenker 2012 s. 6.5.
Non-negativity, null empty set, and sigma-additivity.
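These three measure-theoretic requirements can be illustrated on a finite toy space, where sigma-additivity reduces to finite additivity. The following is a minimal sketch; the weights and the helper names are our own illustrative choices, not anything from the paper:

```python
from itertools import chain, combinations

# Toy finite sample space with an illustrative weight for each point;
# the measure of a set is the sum of the weights of its members.
weights = {'a': 0.2, 'b': 0.3, 'c': 0.5}

def mu(A):
    return sum(weights[x] for x in A)

def subsets(s):
    # All subsets of s (the power set).
    return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))

space = set(weights)

# Null empty set and non-negativity:
assert mu(set()) == 0
assert all(mu(A) >= 0 for A in subsets(space))

# Additivity on disjoint sets (on a finite space, sigma-additivity
# reduces to finite additivity):
assert abs(mu({'a', 'b'}) - (mu({'a'}) + mu({'b'}))) < 1e-12
```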
See Hemmo and Shenker 2015a.
See Hemmo and Shenker 2015a.
See Van Fraassen (1989) on this point.
See Hemmo and Shenker 2012 Sec. 6.5 and Ch. 11 for outlines of such shortcuts.
In the Gibbsian tradition the concept of entropy appears to be different (see Uffink 2007, Frigg 2008), and the considerations for choosing a measure seem to be different; essentially, however, those considerations too are empirical and based on the second law. Moreover, within the conceptual framework employed here, the Gibbsian account reduces to the Boltzmannian notion in the cases that are interesting for the second law of thermodynamics, and so we can safely proceed with only the Boltzmannian account. See Hemmo and Shenker 2012 Ch. 11.
See Hemmo and Shenker 2012 Ch. 7.
\( \Delta S={C}_V\ln \left(\frac{T_B}{T_A}\right)+ R\ln \left(\frac{V_B}{V_A}\right) \), where C_V is the molar heat capacity at constant volume, R is the gas constant, and V and T are the volume and temperature of the gas in equilibrium states A and B.
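As a rough numerical illustration of this formula (the values are our own; C_V = (3/2)R is assumed, as for a monatomic ideal gas):

```python
import math

R = 8.314          # molar gas constant, J/(mol K)
C_V = 1.5 * R      # heat capacity at constant volume, monatomic ideal gas

def delta_S(T_A, V_A, T_B, V_B):
    """Entropy difference per mole between equilibrium states A and B:
    Delta S = C_V ln(T_B/T_A) + R ln(V_B/V_A)."""
    return C_V * math.log(T_B / T_A) + R * math.log(V_B / V_A)

# Isothermal doubling of the volume gives Delta S = R ln 2 per mole.
print(delta_S(300.0, 1.0, 300.0, 2.0))  # ≈ 5.76 J/(mol K)
```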
This fact is emphasized by Ben-Menahem (2018), who takes this to be an example of a non-reductive thinking in science. I disagree with her, for the reasons given here.
It should be noted that Brillouin’s argument is flawed because he forgets to take into account the fact that when the energy packet enters the Demon’s eye it also leaves the gas, thus changing the total entropy balance; for a different criticism see Leff and Rex (2003), pp. 17–19.
Does the past hypothesis call for an explanation? This question is addressed in Baras and Shenker (forthcoming).
Jaynes noted right from the start that the MaxEnt theory is more general, and indeed it is applied in a variety of contexts. See the variety of articles mentioning MaxEnt in the Stanford Encyclopedia of Philosophy.
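The generality of the MaxEnt recipe can be illustrated numerically. The sketch below (the helper `maxent_dist` and its bisection solver are our own construction, not drawn from Jaynes’s papers) reproduces the flavor of his dice example: given only a reported mean of 4.5 instead of the fair 3.5, maximizing the Shannon entropy yields an exponentially tilted distribution over the faces:

```python
import math

def maxent_dist(values, target_mean, lo=-50.0, hi=50.0):
    """Maximum-entropy distribution over `values` subject to a fixed
    expected value: p_i is proportional to exp(-lam * x_i), with the
    Lagrange multiplier lam found by bisection on the mean."""
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in values]
        z = sum(w)
        return sum(x * wx for x, wx in zip(values, w)) / z
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if mean_for(mid) > target_mean:   # the mean decreases as lam grows
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    w = [math.exp(-lam * x) for x in values]
    z = sum(w)
    return [wx / z for wx in w]

# A die whose reported average is 4.5 rather than the fair 3.5;
# MaxEnt tilts the distribution towards the high faces.
p = maxent_dist([1, 2, 3, 4, 5, 6], 4.5)
```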
References
Albert, D. (2000). Time and chance. Cambridge: Harvard University Press.
Allori, V. (2013). Review of the road to Maxwell’s demon. International Studies in the Philosophy of Science, 27, 453–456.
Allori, V. (2015). Response to authors. International Studies in the Philosophy of Science, 29, 94–98.
Balian, R. (2005). Information in statistical physics. Studies in History and Philosophy of Modern Physics, 36, 323–353.
Ben-Menahem, Y. (2018). Causation in science. Princeton: Princeton University Press.
Bennett, C. (1982). The thermodynamics of computation: A review. International Journal of Theoretical Physics, 21, 905–940.
Bennett, C. (2003). Notes on Landauer’s principle, reversible computation, and Maxwell’s demon. Studies in History and Philosophy of Modern Physics, 34, 501–510.
Brillouin, L. (1962). Science and information theory. London: Academic Press.
Brown, H., & Uffink, J. (2001). The origins of time-asymmetry in thermodynamics: The minus first law. Studies in History and Philosophy of Modern Physics, 32, 525–538.
Bub, J. (2001). Maxwell's demon and the thermodynamics of computation. Studies in History and Philosophy of Modern Physics, 32, 569–579.
Callender, C. (1999). Reducing thermodynamics to statistical mechanics: The case of entropy. Journal of Philosophy, 96, 348–373.
Earman, J., & Norton, J. (1998). Exorcist XIV: The wrath of Maxwell’s demon. Part I. From Maxwell to Szilard. Studies in History and Philosophy of Modern Physics, 29, 435–471.
Eddington, A. (1935). The nature of the physical world. London: Everyman’s Library.
Einstein, A. (1919). Time, space, and gravitation. The Times (London), 28 November 1919, 13–14. Reprinted as: What is the theory of relativity? In A. Einstein (1954), Ideas and opinions. New York: Bonanza Books.
Einstein, A. (1970). Autobiographical notes. In P. A. Schilpp (Ed.), Albert Einstein: Philosopher-scientist (Vol. 2). Cambridge: Cambridge University Press.
Fermi, E. (1936). Thermodynamics. New York: Dover (1956 reprint).
Feynman, R. (1965). The character of physical law. Cambridge: MIT Press.
Feynman, R. P., Leighton, R. B., & Sands, M. L. (1963). The Feynman lectures on physics. Reading, Mass: Addison-Wesley Pub. Co.
Floridi, L. (2010). Information: A very short introduction. Oxford: Oxford University Press.
Floridi, L. (2011). The philosophy of information. Oxford: Oxford University Press.
Frigg, R. (2008). A field guide to recent work on the foundations of statistical mechanics. In D. Rickles (Ed.), The Ashgate companion to contemporary philosophy of physics (pp. 99–196). London: Ashgate.
Frigg, R. (2009). Typicality and the approach to equilibrium in Boltzmannian statistical mechanics. Philosophy of Science, 76, 997–1008.
Frigg, R. (2011). Why typicality does not explain the approach to equilibrium. In M. Suárez (Ed.), Probabilities, causes and propensities in physics. Synthese Library, Dordrecht: Reidel.
Gillies, D. (2000). Philosophical theories of probability. Oxon: Routledge.
Goldstein, S. (2012). Typicality and notions of probability in physics. In Y. Ben-Menahem & M. Hemmo (Eds.), Probability in physics (pp. 59–72). Berlin and Heidelberg: Springer.
Hemmo, M., & Shenker, O. (2006). Von Neumann's entropy does not correspond to thermodynamic entropy. Philosophy of Science, 73, 153–174.
Hemmo, M., & Shenker, O. (2010). Maxwell's demon. The Journal of Philosophy, 107, 389–411.
Hemmo, M., & Shenker, O. (2011). Szilard's perpetuum mobile. Philosophy of Science, 78, 264–283.
Hemmo, M., & Shenker, O. (2012). The road to Maxwell’s demon: Conceptual foundations of statistical mechanics. Cambridge: Cambridge University Press.
Hemmo, M., & Shenker, O. (2013). Entropy and computation: The Landauer-Bennett thesis reexamined. Entropy, 15, 3387–3401.
Hemmo, M., & Shenker, O. (2015a). Probability and typicality in deterministic physics. Erkenntnis, 80, 575–586.
Hemmo, M., & Shenker, O. (2015b). Letter to the editor. International Studies in the Philosophy of Science, 29, 91–92.
Hemmo, M., & Shenker, O. (2016). Maxwell's demon. Oxford Handbooks Online. https://doi.org/10.1093/oxfordhb/9780199935314.013.63.
Hemmo, M., & Shenker, O. (2019). The past hypothesis and the psychological arrow of time. British Journal for the Philosophy of Science, forthcoming. Available online at: https://academic.oup.com/bjps/advance-article/doi/10.1093/bjps/axz038/5543104?guestAccessKey=186297a8-c467-42e3-bfe0-a06100b9633d
Jaynes, E. (1957a). Information theory and statistical mechanics. Physical Review, 106, 620–630.
Jaynes, E. (1957b). Information theory and statistical mechanics II. Physical Review, 108, 171–190.
Keynes, J. M. (1921). A treatise on probability. London: Macmillan.
Kolmogorov, A. N. (1933). Foundations of the theory of probability (English translation 1956). New York: Chelsea.
Ladyman, J., & Ross, D. (2007). Every thing must go: Metaphysics naturalized. Oxford: Oxford University Press.
Landauer, R. (1961). Irreversibility and heat generation in the computing process. IBM Journal of Research and Development, 5, 183–191.
Leff, H. S., & Rex, A. F. (2003). Maxwell’s demon 2: Entropy, classical and quantum information, computing. Bristol: Institute of Physics.
Loewer, B. (2001). Determinism and chance. Studies in History and Philosophy of Modern Physics, 32, 609–620.
Lombardi, O. (2004). What is information? Foundations of Science, 9, 105–134.
Lombardi, O. (2005). Dretske, Shannon’s theory and the interpretation of information. Synthese, 144, 23–39.
Lombardi, O., Fortin, S., & Vanni, L. (2015). A pluralist view about information. Philosophy of Science, 82, 1248–1259.
Lombardi, O., Holik, F., & Vanni, L. (2016a). What is quantum information? Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, 56, 17–26.
Lombardi, O., Holik, F., & Vanni, L. (2016b). What is Shannon information? Synthese, 193, 1983–2012.
Lombardi, O., Fortin, S., & Lopez, C. (2016c). Deflating the deflationary view of information. European Journal for Philosophy of Science, 6, 209–230.
Mellor, H. D. (2005). Probability: A philosophical introduction. Oxon: Routledge.
Norton, J. (2017). The worst thought experiment. In M. T. Stuart, Y. Fehige, & J. R. Brown (Eds.), The Routledge companion to thought experiments (pp. 454–468). Routledge.
Orilia, F., & Swoyer, C. (2016). Properties. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy (Winter 2017 Edition). http://plato.stanford.edu/archives/spr2016/entries/properties/.
Pitowsky, I. (2012). Typicality and the role of the Lebesgue measure in statistical mechanics. In Y. Ben-Menahem & M. Hemmo (Eds.), Probability in physics (pp. 41–58). Berlin and Heidelberg: Springer.
Reif, F. (1965). Fundamentals of statistical and thermal physics. Waveland Press (2009 reprint).
Shannon, C. (1948). A mathematical theory of communication. The Bell System Technical Journal, 27, 379–423, 623–656.
Shenker, O. (1999). Is -kTr(ρlnρ) the entropy in quantum mechanics? British Journal for the Philosophy of Science, 50, 33–48.
Shenker, O. (2017a). Foundations of statistical mechanics: Mechanics by itself. Philosophy Compass, 12. https://doi.org/10.1111/phc3.12465.
Shenker, O. (2017b). Foundations of statistical mechanics: The auxiliary hypotheses. Philosophy Compass, 12. https://doi.org/10.1111/phc3.12464.
Shenker, O. (2018). Foundations of quantum statistical mechanics. In E. Knox & A. Wilson (Eds.), Routledge companion to the philosophy of physics. Oxford: Routledge Forthcoming.
Sklar, L. (1993). Physics and chance. Cambridge: Cambridge University Press.
Szilard, L. (1929). On the decrease in entropy in a thermodynamic system by the intervention of intelligent beings. Reprinted in Leff and Rex (2003), pp. 110–119.
Timpson, C. (2013). Quantum information theory and the foundations of quantum mechanics. Oxford: Oxford University Press.
Tribus, M., & McIrvine, E. C. (1971). Energy and information. Scientific American, 224, 179–186.
Uffink, J. (2001). Bluff your way in the second law of thermodynamics. Studies in History and Philosophy of Modern Physics, 32, 305–394.
Uffink, J. (2007). Compendium to the foundations of classical statistical physics. In J. Butterfield & J. Earman (Eds.), Handbook for the philosophy of physics, Part B (pp. 923–1074). North-Holland: Elsevier.
Uffink, J. (2014). Boltzmann's work in statistical physics. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy (Fall 2014 Edition). http://plato.stanford.edu/archives/fall2014/entries/statphys-Boltzmann/.
Uffink, J., & Valente, G. (2010). Time’s arrow and Lanford’s theorem. Seminaire Poincare XV Le Temps, 141–173.
Uffink, J., & Valente, G. (2015). Lanford’s theorem and the emergence of irreversibility. Foundations of Physics, 45(4), 404–438.
van Fraassen, B. C. (1989). Laws and symmetry. Oxford: Oxford University Press.
von Neumann, J. (1932). Mathematical foundations of quantum mechanics (R. T. Beyer, Trans., 1955). Princeton: Princeton University Press.
Wallace, D. (2001). Implications of quantum theory in the foundations of statistical mechanics. http://philsciarchive.pitt.edu/410/1/wallace.pdf.
Wiener, N. (1948). Cybernetics: Or control and communication in the animal and the machine. Cambridge: MIT Press.
Acknowledgements
I am grateful to Olimpia Lombardi for her useful comments, as well as to anonymous referees for this journal. This research was supported by a grant from Lockheed-Martin.
Cite this article
Shenker, O. Information vs. entropy vs. probability. Euro Jnl Phil Sci 10, 5 (2020). https://doi.org/10.1007/s13194-019-0274-4