Abstract
There is a long tradition of thinking of thermodynamics, not as a theory of fundamental physics (or even a candidate theory of fundamental physics), but as a theory of how manipulations of a physical system may be used to obtain desired effects, such as mechanical work. On this view, the basic concepts of thermodynamics, heat and work, and with them, the concept of entropy, are relative to a class of envisaged manipulations. This article is a sketch and defense of a science of manipulations and their effects on physical systems. I call this science thermo-dynamics (with hyphen), or \({\Theta \Delta }^{\text{cs}}\), for short, to highlight that it may be different from the science of thermodynamics, as the reader conceives it. An upshot of the discussion is a clarification of the roles of the Gibbs and von Neumann entropies. Light is also shed on the use of coarse-grained entropies.
Notes
 1.
The word’s first appearance is in Part VI of Kelvin’s “On the Dynamical Theory of Heat” [1], read before the Royal Society of Edinburgh on May 1, 1854. There he recapitulates what in the previous year [2] he had called the “Fundamental Principles in the Theory of the Motive Power of Heat,” now relabelled “Fundamental Principles of General Thermodynamics.”
 2.
Maxwell used this and related abbreviations in his correspondence with P. G. Tait. See letter to Tait of Dec. 1, 1873 ([3], p. 947).
 3.
The minus first law, which Brown and Uffink also call the Equilibrium Principle, is given by them as,
An isolated system in an arbitrary initial state within a finite fixed volume will spontaneously attain a unique state of equilibrium ([4], p. 528).
They point out that a principle of this sort had been recognized as a law of thermodynamics earlier, by Uhlenbeck and Ford ([5], p. 5).
 4.
Borrowing the apt phrase of J.S. Bell [11].
 5.
 6.
If b cannot be reached from a via any manipulation in \(\mathcal {M}\), or if the set considered has no upper bound, \({S}_{\mathcal{M}}(a \rightarrow b)\) is undefined. To avoid qualifying every formula involving entropies with a proviso that all quantities mentioned therein are defined, we can, if we like, allow \({S}_{\mathcal{M}}(a \rightarrow b)\) to take values in the extended reals, which supplement the reals with \(\pm \infty\). Then, if b cannot be reached from a, \({S}_{\mathcal{M}}(a \rightarrow b) = \infty\).
 7.
An example, from one of the most widely used textbooks:
A description of a thermodynamic system requires the specification of the “walls” that separate it from its surroundings and that provide its boundary conditions. It is by means of manipulations of the walls that the extensive parameters of the system are altered and processes are initiated ([18], p. 15).
 8.
 9.
It should be stressed that we are not defining the statistical mechanical entropy \({S}_{\mathcal{M}}(a \rightarrow b)\) in terms of \(S[\rho _b]\) and \(S[\rho _a]\); it is defined by (19).
 10.
To be clear: I don’t know of anyone who has actually committed this error. Certainly not Gibbs or von Neumann.
 11.
Letter to P. G. Tait, 11 Dec. 1867, in [3], p. 332.
 12.
It is essential to the theorem that the dynamics preserve phase-space volume. That this condition is required to underwrite the second law is illustrated by Earman and Norton, who, building on earlier work by others, exhibit a fictitious system with non-Hamiltonian, energy-conserving, time-reversal invariant dynamics that completely converts heat drawn from a heat reservoir into work [42,43,44].
 13.
See [45] for further discussion of these points.
References
 1.
Thomson, W.: On the dynamical theory of heat part VI: thermoelectric currents. Trans. R. Soc. Edinb. 21, 123 (1857).
 2.
Thomson, W.: On the dynamical theory of heat, with numerical results deduced from Mr Joule’s equivalent of a Thermal Unit, and M Regnault’s Observations on Steam. Trans. R. Soc. Edinb. 20, 261 (1853).
 3.
Harman, P.M. (ed.): The Scientific Letters and Papers of James Clerk Maxwell, Volume II: 1862–1873. Cambridge University Press, Cambridge (1995)
 4.
Brown, H.R., Uffink, J.: The origins of time-asymmetry in thermodynamics: the Minus First Law. Stud. Hist. Philos. Mod. Phys. 32, 525 (2001)
 5.
Uhlenbeck, G.E., Ford, G.W.: Lectures in Statistical Mechanics. American Mathematical Society, Providence, R.I. (1963)
 6.
Gour, G., Müller, M.P., Narasimhachar, V., Spekkens, R.W., Halpern, N.Y.: The resource theory of informational nonequilibrium in thermodynamics. Phys. Rep. 583, 1 (2015)
 7.
Goold, J., Huber, M., Riera, A., del Rio, L., Skrzypczyk, P.: The role of quantum information in thermodynamics—a topical review. J. Phys. A 49, 143001 (2016)
 8.
Ng, N.H.Y., Woods, M.P.: Resource theory of quantum thermodynamics: thermal operations and second laws. In: Binder, F., Correa, L.A., Gogolin, C., Anders, J., Adesso, G. (eds.) Thermodynamics in the Quantum Regime: Fundamental Aspects and New Directions, pp. 625–650. Springer, New York (2018)
 9.
Lostaglio, M.: An introductory review of the resource theory approach to thermodynamics. Rep. Prog. Phys. 82, 114001 (2019)
 10.
Wallace, D.: Thermodynamics as control theory. Entropy 16, 699 (2016)
 11.
Bell, J.S.: Free variables and local causality. Epistemol. Lett. 15, 79 (1977).
 12.
Pearl, J.: Causality: Models, Reasoning, and Inference. Cambridge University Press, Cambridge (2000)
 13.
Woodward, J.: Making Things Happen: A Theory of Causal Explanation. Oxford University Press, Oxford (2003)
 14.
Woodward, J.: Causation and manipulability. In: Zalta, E.N. (ed.) The Stanford Encyclopedia of Philosophy. Metaphysics Research Lab, Stanford University, Stanford (2017)
 15.
Clausius, R.: Ueber verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie. Ann. Phys. 125, 353 (1865). Reprinted in [16], 1–44
 16.
Clausius, R.: Abhandlungen über die mechanische Wärmetheorie, vol. 2. Friedrich Vieweg und Sohn, Braunschweig (1867)
 17.
Norton, J.D.: The impossible process: thermodynamic reversibility. Stud. Hist. Philos. Mod. Phys. 55, 43 (2016)
 18.
Callen, H.B.: Thermodynamics and an Introduction to Thermostatistics, 2nd edn. Wiley, New York (1985)
 19.
Gibbs, J.W.: On the equilibrium of heterogeneous substances, Part I. Trans. Conn. Acad. Arts Sci. 3, 108 (1875). Reprinted in [20], 55–184
 20.
Gibbs, J.W.: The Scientific Papers of J. Willard Gibbs, PhD, LLD, vol. I. Longmans, Green, and Co, New York (1906)
 21.
Boltzmann, L.: Über die Beziehung der Diffusionsphänomene zum zweiten Hauptsatze der mechanischen Wärmetheorie. Sitzungsberichte der Kaiserlichen Akademie der Wissenschaften zu Wien, Mathematisch-Naturwissenschaftliche Classe 78, 733 (1878)
 22.
Planck, M.: Vorlesungen Über Thermodynamik. Verlag Von Veit & Comp, Leipzig (1897)
 23.
Swendsen, R.H.: Statistical mechanics of colloids and Boltzmann’s definition of the entropy. Am. J. Phys. 74, 187 (2006)
 24.
Swendsen, R.H.: Probability, entropy, and Gibbs’ Paradox(es). Entropy 20, 450 (2018)
 25.
Duhem, P.: Sur la Dissociation dans les Systèmes qui Renferment un Mélange de Gaz Parfaits, Travaux et Mémoires des Facultés de Lille, Tome II. Mémoire 8 (1892)
 26.
Wiedeberg, O.: Das Gibbs’sche Paradoxon. Ann. Phys. 289, 684 (1894)
 27.
Denbigh, K., Redhead, M.: Gibbs' Paradox and non-uniform convergence. Synthese 81, 283 (1989)
 28.
Maxwell, J.C.: Theory of Heat, 4th edn. Longmans, Green, and Co, London (1875)
 29.
Helmholtz, H.v.: Die Thermodynamik der chemischen Prozessen, pp. 22–39. Sitzungsberichte der Königlich Preussischen Akademie der Wissenschaften zu Berlin (1882)
 30.
Myrvold, W.C.: Probabilities in statistical mechanics. In: Hitchcock, C., Hájek, A. (eds.) Oxford Handbook of Probability and Philosophy, pp. 573–600. Oxford University Press, Oxford (2016)
 31.
Boltzmann, L.: Einige allgemeine Sätze über Wärmegleichgewichte. Sitzungsberichte der Kaiserlichen Akademie der Wissenschaften zu Wien, Mathematisch-Naturwissenschaftliche Classe 63, 679 (1871)
 32.
Maxwell, J.C.: On Boltzmann’s Theorem of the average distribution of energy in a system of material points. Trans. Camb. Philos. Soc. 12, 547 (1879). Reprinted in [50], 713–741
 33.
Boltzmann, L.: Vorlesungen Über Gastheorie, II. Theil. Verlag von Johann Ambrosius Barth, Leipzig (1898)
 34.
Gibbs, J.W.: Elementary Principles in Statistical Mechanics: Developed with Especial Reference to the Rational Foundation of Thermodynamics. Charles Scribner’s Sons, New York (1902)
 35.
Szilard, L.: On the extension of phenomenological thermodynamics to fluctuation phenomena. In: Feld, B.T., Szilard, G.W., Winsor, K.R. (eds.) The Collected Works of Leo Szilard: Scientific Papers, pp. 70–102. The MIT Press, Cambridge, MA (1972)
 36.
Szilard, L.: Über die Ausdehnung der phänomenologischen Thermodynamik auf die Schwankungserscheinungen. Zeitschrift für Physik 32, 753 (1925)
 37.
Maroney, O.: The physical basis of the Gibbs-von Neumann entropy (2007). arXiv:quant-ph/0701127v2
 38.
Tolman, R.C.: The Principles of Statistical Mechanics. Clarendon Press, Oxford (1938)
 39.
Maxwell, J.C.: Theory of Heat. Longmans, Green, and Co, London (1871)
 40.
Earman, J., Norton, J.D.: Exorcist XIV: the wrath of Maxwell's Demon. Part I. From Maxwell to Szilard. Stud. Hist. Philos. Mod. Phys. 29, 435 (1998)
 41.
Wallace, D.: Thermodynamics as control theory (2018). Lecture delivered June 21, 2018, at conference, Thermodynamics as a Resource Theory: Philosophical and Foundational Implications. The University of Western Ontario. https://www.youtube.com/watch?v=TnZTlZN2LiQ
 42.
Earman, J., Norton, J.D.: Exorcist XIV: the wrath of Maxwell's Demon. Part II. From Szilard to Landauer and beyond. Stud. Hist. Philos. Mod. Phys. 30, 1 (1999)
 43.
Skordos, P.: Compressible dynamics, time reversibility, Maxwell’s Demon, and the second law. Phys. Rev. E 48, 777 (1993)
 44.
Zhang, K., Zhang, K.: Mechanical models of Maxwell’s demon with noninvariant phase volume. Phys. Rev. A 46, 4598 (1992)
 45.
Myrvold, W.C.: Explaining thermodynamics: what remains to be done? In: Allori, V. (ed.) Statistical Mechanics and Scientific Explanation: Determinism, Indeterminism and Laws of Nature, pp. 113–143. World Scientific, Washington (2020)
 46.
Tait, P.G.: Lectures on Some Recent Advances in Physical Science, 2nd edn. Macmillan and Co., London (1876)
 47.
Tait, P.G.: Sketch of Thermodynamics, 2nd edn. David Douglas, Edinburgh (1877)
 48.
Clausius, R.: Ueber eine von Hrn. Tait in der mechanischen Wärmetheorie angewandte Schlussweise. Annalen der Physik und Chemie 238, 130 (1877)
 49.
Maxwell, J.C.: Diffusion. In Encyclopedia Britannica, vol. 7, ninth edn. Adam and Charles Black, Edinburgh, pp. 214–221 (1877)
 50.
Niven, W.D. (ed.): The Scientific Papers of James Clerk Maxwell, vol. Two. Cambridge University Press, Cambridge (1890)
 51.
Maxwell, J.C.: Tait’s thermodynamics II. Nature 17, 278 (1878)
 52.
von Neumann, J.: Proof of the ergodic theorem and the \(H\)-theorem in quantum mechanics. Eur. Phys. J. H 35, 201 (2010). Translation, by R. Tumulka, of [53]
 53.
von Neumann, J.: Beweis des Ergodensatzes und des \(H\)-Theorems in der neuen Mechanik. Zeitschrift für Physik 57, 30 (1929)
 54.
Grad, H.: The many faces of entropy. Commun. Pure Appl. Math. 14, 323 (1961)
 55.
van Kampen, N.G.: The Gibbs Paradox. In: Parry, W.E. (ed.) Essays in Theoretical Physics in Honour of Dirk ter Haar, pp. 303–312. Pergamon Press, Oxford (1984)
 56.
Jaynes, E.T.: The Gibbs paradox. In: Erickson, G., Neudorfer, P., Smith, C.R. (eds.) Maximum Entropy and Bayesian Methods, pp. 1–22. Kluwer Academic Publishers, Dordrecht (1992)
Acknowledgements
I am grateful to a number of people with whom I have discussed these matters over the years. In particular, I thank Owen Maroney for drawing my attention to what I have called the Fundamental Theorem, John Norton for discussions of reversible processes, David Wallace for urging the considerations of Sect. 8 on me, and Carlo Rovelli for getting me to think more about the concept of manipulability. I thank also the discussants at the Summer School on Entropy in Split, Croatia, July 2018, and the Southwestern Ontario Philosophy of Physics Reading Group, for helpful questions and feedback.
Appendix
Proof of the Fundamental Theorem of Statistical Thermodynamics
In this appendix we prove Proposition 2.
As before, \(\rho\) is used ambiguously for either a density function with respect to Liouville measure on classical phase space, or a quantum density operator. Hamiltonian evolution is, in the classical context, evolution according to Hamilton's equations of motion, and, in the quantum context, is implemented by a family of unitary operators U(t). The letter S, without subscript, denotes either the Gibbs entropy or the von Neumann entropy.
In the classical context, the salient fact about Hamiltonian evolution—and, indeed, the only fact that we will use—is that Liouville measure is invariant under evolution of that type. As a consequence, the expectation value, with respect to Liouville measure \(\Lambda\), of any measurable function on phase space is invariant under Hamiltonian evolution; this includes in particular the Gibbs entropy
$$\begin{aligned} S[\rho ] = -k \int \rho \log \rho \, d\Lambda . \end{aligned}$$
In the quantum context, the salient fact about Hamiltonian evolution is that it conserves the inner product of two vectors in Hilbert space. As a consequence, the trace of any operator is invariant; this includes in particular the von Neumann entropy
$$\begin{aligned} S[\rho ] = -k\, \mathrm{Tr}\, \rho \log \rho . \end{aligned}$$
As conservation of phase space volume (classical) and absolute magnitude of inner product (quantum) are the only features of Hamiltonian evolution used, we could expand our repertoire of operations to include fictitious operations, such as an instantaneous velocity reversal, that retain these features, and the theorem would still go through.
The relevant facts about the Gibbs and von Neumann entropies are:

1.
Subadditivity: For a composite system AB,
$$\begin{aligned} S[\rho _{AB}] \le S[\rho _A] + S[\rho _B], \end{aligned}$$
with equality if and only if the subsystems are probabilistically independent.

2.
For any \(T > 0\), let \(\beta = 1/kT\). The canonical distribution \(\tau _\beta\) minimizes
$$\begin{aligned} \langle H \rangle _\rho - T\, S[\rho ]. \end{aligned}$$
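Both facts, together with the unitary invariance noted above, are easy to spot-check numerically. The following sketch (Python with NumPy/SciPy; an illustration of these standard facts, not code from the paper, in units with \(k = 1\)) verifies unitary invariance of the von Neumann entropy, subadditivity for a random two-qubit state, and the fact that the canonical state minimizes \(\langle H \rangle _\rho - T S[\rho ]\):

```python
import numpy as np
from scipy.linalg import expm

def vn_entropy(rho):
    """von Neumann entropy S[rho] = -Tr(rho ln rho), with k = 1."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # 0 log 0 = 0 by convention
    return -np.sum(evals * np.log(evals))

def random_state(dim, rng):
    """A random density operator, via a Wishart-like construction."""
    m = rng.standard_normal((dim, dim)) + 1j * rng.standard_normal((dim, dim))
    rho = m @ m.conj().T
    return rho / np.trace(rho).real

rng = np.random.default_rng(0)

# 1. Unitary invariance: S[U rho U†] = S[rho].
rho = random_state(4, rng)
h = rng.standard_normal((4, 4))
h = h + h.T                                # a random (Hermitian) Hamiltonian
u = expm(-1j * h)                          # the corresponding unitary
assert np.isclose(vn_entropy(u @ rho @ u.conj().T), vn_entropy(rho))

# 2. Subadditivity: S[rho_AB] <= S[rho_A] + S[rho_B].
rho_ab = random_state(4, rng)              # two qubits, generically correlated
r = rho_ab.reshape(2, 2, 2, 2)
rho_a = np.trace(r, axis1=1, axis2=3)      # partial trace over B
rho_b = np.trace(r, axis1=0, axis2=2)      # partial trace over A
assert vn_entropy(rho_ab) <= vn_entropy(rho_a) + vn_entropy(rho_b) + 1e-9

# 3. The canonical state tau_beta minimizes <H>_rho - T S[rho].
T = 1.0
beta = 1.0 / T
H = np.diag([0.0, 0.3, 1.1, 2.0])
tau = expm(-beta * H)
tau = tau / np.trace(tau).real
def free_energy(rho):
    return np.trace(rho @ H).real - T * vn_entropy(rho)
for _ in range(200):
    assert free_energy(tau) <= free_energy(random_state(4, rng)) + 1e-9
```

The tolerances guard against floating-point round-off; the canonical check compares against randomly sampled density operators rather than performing a full minimization.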
With these facts in hand, the proof of the theorem is easy. For brevity, we will write \(S_{AB}(t_0)\) for \(S[\rho _{AB}(t_0)]\), etc. We will consider only interactions with a single heat reservoir, as the extension to successive interactions with multiple heat reservoirs is merely a matter of repeated application of the theorem.
The evolution from \(t_0\) to \(t_1\) does not change the joint entropy \(S_{AB}\). At \(t_0\), since the systems are uncorrelated, \(S_A + S_B\) is at a minimum for the value of \(S_{AB}\) that obtains at both \(t_0\) and \(t_1\). Therefore,
$$\begin{aligned} S_A(t_0) + S_B(t_0) \le S_A(t_1) + S_B(t_1), \end{aligned}$$
or,
$$\begin{aligned} \Delta S_A + \Delta S_B \ge 0. \end{aligned}$$
Since B has canonical distribution \(\tau _\beta\) at time \(t_0\),
$$\begin{aligned} \langle H_B \rangle _{t_0} - T\, S_B(t_0) \le \langle H_B \rangle _{t_1} - T\, S_B(t_1), \end{aligned}$$
or,
$$\begin{aligned} T\, \Delta S_B \le \Delta \langle H_B \rangle . \end{aligned}$$
This gives us,
$$\begin{aligned} \Delta S_B \le \frac{\Delta \langle H_B \rangle }{T} = -\frac{\langle Q \rangle }{T}, \end{aligned}$$
where \(\langle Q \rangle = -\Delta \langle H_B \rangle\) is the expectation value of the heat drawn from the reservoir. From the first of these inequalities,
$$\begin{aligned} \Delta S_A \ge -\Delta S_B, \end{aligned}$$
which, combined with the bound on \(\Delta S_B\), yields,
$$\begin{aligned} \Delta S_A \ge \frac{\langle Q \rangle }{T}, \end{aligned}$$
or,
$$\begin{aligned} \langle Q \rangle \le T\, \Delta S_A, \end{aligned}$$
which is the desired result.
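The theorem admits a similar numerical spot-check. The sketch below (again Python with NumPy/SciPy, units with \(k = 1\); an illustration under the stated assumptions, not code from the paper) prepares a system A uncorrelated with a reservoir B in the canonical state \(\tau _\beta\), applies a random joint unitary, and verifies that the expected heat drawn from the reservoir, \(\langle Q \rangle = -\Delta \langle H_B \rangle\), never exceeds \(T \Delta S_A\):

```python
import numpy as np
from scipy.linalg import expm

def vn_entropy(rho):
    """von Neumann entropy, k = 1."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return -np.sum(evals * np.log(evals))

rng = np.random.default_rng(1)
dA, dB = 2, 8                      # dimensions of system A and reservoir B
T = 1.0
beta = 1.0 / T

H_B = np.diag(np.sort(rng.uniform(0, 3, dB)))    # a random reservoir Hamiltonian
tau = expm(-beta * H_B)
tau = tau / np.trace(tau).real                   # canonical state tau_beta

def marginals(rho):
    """Partial traces of a state on A (x) B."""
    r = rho.reshape(dA, dB, dA, dB)
    return np.trace(r, axis1=1, axis2=3), np.trace(r, axis1=0, axis2=2)

for _ in range(100):
    # A in a random mixed state, uncorrelated with B at t0
    m = rng.standard_normal((dA, dA)) + 1j * rng.standard_normal((dA, dA))
    rho_A = m @ m.conj().T
    rho_A = rho_A / np.trace(rho_A).real
    rho0 = np.kron(rho_A, tau)

    # a random joint unitary, standing in for Hamiltonian evolution t0 -> t1
    g = rng.standard_normal((dA * dB, dA * dB)) + 1j * rng.standard_normal((dA * dB, dA * dB))
    U = expm(-1j * (g + g.conj().T))
    rho1 = U @ rho0 @ U.conj().T

    A0, B0 = marginals(rho0)
    A1, B1 = marginals(rho1)

    Q = np.trace((B0 - B1) @ H_B).real           # expected heat drawn from B
    dS_A = vn_entropy(A1) - vn_entropy(A0)
    assert Q <= T * dS_A + 1e-8                  # <Q> <= T ΔS_A
```

The random unitaries here exercise arbitrary couplings between A and B; the inequality relies only on B being canonical and uncorrelated with A at \(t_0\), exactly as in the proof.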
Some Quotations from the History of \(\Theta \Delta ^{\text{cs}}\)
The science that I am calling \(\Theta \Delta^{\text{cs}}\) is not a new idea. This understanding of the basic concepts of thermodynamics has been present from the very beginning of the subject. In this appendix I provide some relevant quotations, with no pretense to exhaustiveness.
Josiah Willard Gibbs (1875). Part of this has already been quoted above; here is a fuller quotation.
When we say that when two different gases mix by diffusion as we have supposed, the energy of the whole remains constant, and the entropy receives a certain increase, we mean that the gases could be separated and brought to the same volume and temperature which they had at first by means of a certain change in external bodies, for example, by the passages of a certain amount of heat from a warmer to a colder body. But when we say that when two gas-masses of the same kind are mixed under similar circumstances there is no change of energy or entropy, we do not mean that the gases which have been mixed can be separated without change to external bodies. On the contrary, the separation of the gases is entirely impossible. We call the energy and entropy of the gas-masses when mixed the same as when they were unmixed, because we do not recognize any difference in the substance of the two masses. So when gases of different kinds are mixed, if we ask what changes in external bodies are necessary to bring the system to its original state, we do not mean a state in which each particle shall occupy more or less exactly the same position as at some previous epoch, but only a state which shall be undistinguishable from the previous one in its sensible properties. It is to states of systems thus incompletely defined that the problems of thermodynamics relate.
But if such considerations explain why the mixture of gas-masses of the same kind stands on a different footing from the mixture of gas-masses of different kinds, the fact is not less significant that the increase of entropy due to the mixture of gases of different kinds, in such a case as we have supposed, is independent of the nature of the gases.
Now we may without violence to the general laws of gases which are embodied in our equations suppose other gases to exist than such as actually do exist, and there does not appear to be any limit to the resemblance which there might be between two such kinds of gas. But the increase of entropy due to the mixing of given volumes of the gases at a given temperature and pressure would be independent of the degree of similarity or dissimilarity between them. We might also imagine the case of two gases which should be absolutely identical in all the properties (sensible and molecular) which come into play while they exist as gases either pure or mixed with each other, but which should differ in respect to the attractions between their atoms and the atoms of some other substances, and therefore in their tendency to combine with other substances. In the mixture of such gases by diffusion an increase of entropy would take place, although the process of mixture, dynamically considered, might be absolutely identical in its minutest details (even with respect to the precise path of each atom) with processes which might take place without any increase of entropy. In such respects, entropy stands strongly contrasted with energy ([19], pp. 228–229; in [20], pp. 166–167).
Rudolf Clausius (1877). Responding to Tait’s (unfair) charge that the possibility of a demon that could, without expenditure of work, cool a body below the temperature of its surroundings “is absolutely fatal to Clausius’ reasoning,” ([46], pp. 118–120; see also [47], p. 37), Clausius wrote,
Dieses kann ich in keiner Weise zugeben. Wenn die Wärme als eine Molecularbewegung betrachtet wird, so ist dabei zu bedenken, dass die Molecüle so kleine Körpertheilchen sind, dass es für uns unmöglich ist, sie einzeln wahrzunehmen. Wir können daher nicht auf einzelne Molecüle für sich allein wirken, oder die Wirkungen einzelner Molecüle für sich allein erhalten, sondern haben es bei jeder Wirkung, welche wir auf einen Körper ausüben oder von ihm erhalten, gleichzeitig mit einer ungeheuer grossen Menge von Molecülen zu thun, welche sich nach allen möglichen Richtungen und mit allen überhaupt bei den Molecülen vorkommenden Geschwindigkeiten bewegen, und sich an der Wirkung in der Weise gleichmässig betheiligen, dass nur zufällige Verschiedenheiten vorkommen, die den allgemeinen Gesetzen der Wahrscheinlichkeit unterworfen sind. Dieser Umstand bildet gerade die charakteristische Eigenthümlichkeit derjenigen Bewegung, welche wir Wärme nennen, und auf ihm beruhen die Gesetze, welche das Verhalten der Wärme von dem anderer Bewegungen unterscheiden.
Wenn nun Dämonen eingreifen, und diese charakteristische Eigenthümlichkeit zerstören, indem sie unter den Molecülen einen Unterschied machen, und Molecülen von gewissen Geschwindigkeiten den Durchgang durch eine Scheidewand gestatten, Molecülen von anderen Geschwindigkeiten dagegen den Durchgang verwehren, so darf man das, was unter diesen Umständen geschieht, nicht mehr als eine Wirkung der Wärme ansehen und erwarten, dass es mit den für die Wirkungen der Wärme geltenden Gesetzen übereinstimmt ([48], p. 32).
This I can in no way concede. If heat is regarded as a molecular motion, it should be remembered that the molecules are parts of bodies that are so small that it is impossible for us to perceive them individually. We can therefore not act on single molecules by themselves, or obtain effect from individual molecules by themselves, but rather, in every action that we exert on a body or receive from it, we have simultaneously to do with an immensely large collection of molecules, which move in all possible directions and with all the speeds occurring among the molecules, and participate in the action uniformly, in such a way that there occur only random differences, which are subject to the general laws of probability. This circumstance forms precisely the characteristic property of that motion which we call heat, and on it depends the laws that distinguish the behavior of heat from that of other motions.
If now demons intervene, and disturb this characteristic property by distinguishing between the molecules, and molecules of certain speeds are permitted passage through a partition, molecules of other speeds refused passage, then one may no longer regard what happens under these conditions as an action of heat and expect it to agree with the laws valid for the action of heat.
James Clerk Maxwell (1877, 1878).
Available energy is energy which we can direct into any desired channel. Dissipated energy is energy we cannot lay hold of and direct at pleasure, such as the energy of the confused agitation of molecules which we call heat. Now, confusion, like the correlative term order, is not a property of material things in themselves, but only in relation to the mind which perceives them. A memorandum-book does not, provided it is neatly written, appear confused to an illiterate person, or to the owner who understands it thoroughly, but to any other person able to read it appears to be inextricably confused. Similarly the notion of dissipated energy could not occur to a being who could not turn any of the energies of nature to his own account, or to one who could trace the motion of every molecule and seize it at the right moment. It is only to a being in the intermediate stage, who can lay hold of some forms of energy while others elude his grasp, that energy appears to be passing inevitably from the available to the dissipated state ([49], p. 221, in [50], p. 646).
The second law relates to that kind of communication of energy which we call the transfer of heat as distinguished from another kind of communication of energy which we call work. According to the molecular theory the only difference between these two kinds of communication of energy is that the motions and displacements which are concerned in the communication of heat are those of molecules, and are so numerous, so small individually, and so irregular in their distribution, that they quite escape all our methods of observation; whereas when the motions and displacements are those of visible bodies consisting of great numbers of molecules moving altogether, the communication of energy is called work.
Hence we have only to suppose our senses sharpened to such a degree that we could trace the motions of molecules as easily as we now trace those of large bodies, and the distinction between work and heat would vanish, for the communication of heat would be seen to be a communication of energy of the same kind as that which we call work. ([51], p. 279, in [50], p. 669).
John von Neumann (1929).
If we take into account that the observer can measure only macroscopically then we find different entropy values (in fact, greater ones, as the observer is now less skilful and possibly can therefore extract less mechanical work from the system) .... ([52], p. 214, from [53], p. 47).
Harold Grad (1961).
Whether or not a diffusion occurs when a barrier is removed depends not on a difference in physical properties of the two substances but on a decision that we are or are not interested in such a difference (which is what governs the choice of an entropy function) ...A very illuminating example is given by the “spin-echo” effect. In this experiment, it is found that it is possible to produce a highly ordered microscopic state and, at a later time, effectively reverse all velocities. To a person who has access to such equipment, a very high level “reversible” entropy will be appropriate; to one who has not, a lower order entropy will properly describe all phenomena ([54], pp. 326–327)
Nicolaas Godfried van Kampen (1984). With regard to the difference in expression of entropies for a uniform sample of gas and a system composed of two different gases, van Kampen wrote,
The origin of the difference is that two different processes had to be chosen for extending the definition of entropy. They are mutually exclusive; the first one cannot be used for two different gases and the second one does not apply to a single gas. But suppose that A and B are so similar that the experimenter has no physical way of distinguishing between them. Then he does not have the semipermeable walls needed for the second process, but on the other hand the first will look reversible to him. ...The point is, that this is perfectly justified and that he will not be led to any wrong results. If you tell him that ‘actually’ the entropy increased when he opened the channel he will answer that this is a useless statement since he cannot utilize the entropy increase for running a machine. The entropy increase is no more physical to him than the one that could be manufactured by taking a single gas and mentally tagging the molecules by A or B.
In fact, this still holds when the experimenter would be able to distinguish between A and B, by means of a mass spectrograph for instance, but is not interested in the difference because it is not relevant for his purpose. This is precisely what engineers do when they make tables of the entropy of steam, ignoring the fact that it is actually a mixture of normal and heavy water. Thus, whether such a process is reversible or not depends on how discriminating the observer is. The expression for the entropy (which one constructs by one or the other processes mentioned above) depends on whether he is able and willing to distinguish between the molecules A and B. This is a paradox only for those who attach more physical reality to the entropy than is implied by its definition ([55], pp. 306–307).
Edward T. Jaynes (1992).
In the first place, it is necessary to decide at the outset of a problem which macroscopic variables or degrees of freedom we shall measure and/or control; and within the context of the thermodynamic system thus defined, entropy will be some function \(S(X_1,\ldots , X_n)\) of whatever variables we have chosen. We expect this to obey the second law \(T dS \ge dQ\) only as long as all experimental manipulations are confined to that chosen set. If someone, unknown to us, were to vary a macrovariable \(X_{n+1}\) outside that set, he could produce what would appear to us as a violation of the second law, since our entropy function \(S(X_1,\ldots , X_n)\) might decrease spontaneously, while his \(S(X_1,\ldots , X_n, X_{n+1})\) increases ([56], p. 5).
John Goold, Marcus Huber, Arnau Riera, Lídia del Rio, and Paul Skrzypczyk (2016).
If physical theories were people, thermodynamics would be the village witch. Over the course of three centuries, she smiled quietly as other theories rose and withered, surviving major revolutions in physics, like the advent of general relativity and quantum mechanics. The other theories find her somewhat odd, somehow different in nature from the rest, yet everyone comes to her for advice, and no-one dares to contradict her. Einstein, for instance, called her ‘the only physical theory of universal content, which I am convinced, that within the framework of applicability of its basic concepts will never be overthrown.’
Her power and resilience lay mostly on her frank intentions: thermodynamics has never claimed to be a means to understand the mysteries of the natural world, but rather a path towards efficient exploitation of said world. She tells us how to make the most of some resources, like a hot gas or a magnetized metal, to achieve specific goals, be them moving a train or formatting a hard drive. Her universality comes from the fact that she does not try to understand the microscopic details of particular systems. Instead, she only cares to identify which operations are easy and hard to implement in those systems, and which resources are freely available to an experimenter, in order to quantify the cost of state transformations ([7], pp. 1–2).
Cite this article
Myrvold, W.C. The Science of \({\Theta \Delta }^{\text{cs}}\). Found Phys 50, 1219–1251 (2020). https://doi.org/10.1007/s10701-020-00371-3
Keywords
 Thermodynamics
 Statistical mechanics
 Entropy
 Resource theories