
The first law has been shown to provide much insight into how nature functions, but it does not explain why some processes occur spontaneously. A spontaneous process proceeds in one direction and, once initiated, will continue. For instance, a rock dropped from table height falls spontaneously, but a rock on the ground never jumps back to table height spontaneously. If we connect two flasks, one evacuated and the other at 1 bar, the pressure will equalize to 0.5 bar in each flask, but without outside intervention the gas will never redistribute itself so that one flask returns to vacuum and the other to 1 bar. Heat always flows from a warm to a cold reservoir, never the other way around. These spontaneous processes cause a decrease in useable energy: gravitational potential energy for the falling rock, work for the expanding gas, and thermal energy for the heat flow. A process such as the dissolution of sodium chloride in water is spontaneous, yet it is endothermic (\(\Delta H_{r}^{o}\) = +3.88 kJ mol−1), consuming enthalpy and cooling the water. Apparently, the energy changes constrained by the first law are insufficient to explain the spontaneous dissolution of kitchen salt while you are cooking. Another thermodynamic law (the second) and a property called entropy are required. Entropy is a thermodynamic state function denoted by the symbol S, as explained below.

1 The Second Law

There are multiple formulations of the second law that emphasize its various implications. The second law not only makes a statement about the direction of change, but also puts constraints on the efficiency of energy conversions. The latter aspect of the second law is often illustrated with the Carnot cycle of a heat engine and is primarily the domain of physicists and engineers, while chemists give more attention to the direction of reactions. While work can be converted to heat with 100% efficiency, the same is not true for the conversion of heat to work: there is an inherent loss of useable heat when it is used to perform work. This observation underlies Clausius’ statement of the 2nd law: heat can never pass from a colder to a warmer body without some other change, connected therewith, occurring at the same time. Another formulation of the 2nd law, by Kelvin, states that there exists no process in which the sole result is the complete conversion of heat absorbed by a system into work. Clausius’ statement articulates the direction of spontaneous processes, while Kelvin’s statement emphasizes the degradation of energy when it is transferred as heat (not all heat is available for work). While the first law states that the sum of heat and work (= internal energy) is conserved, the second law adds that the interconvertibility of heat and work is asymmetrical. Another, more useable formulation of the second law states that spontaneous processes are those which increase the entropy of the universe. Accordingly, we must properly define the entropy change. Entropy will first be illustrated using statistical thermodynamics (introducing the third law of thermodynamics along the way) and then the macroscopic definition of entropy will be presented and elaborated.

2 Microscopic View of Entropy and the Third Law

The microscopic definition is based on statistical mechanics. Each macroscopic thermodynamic state has a specific number of microstates, W, associated with it. A microstate is a particular way in which the total energy of the system is distributed among the individual molecules of the system. Consider a gas distributed over two bulbs, one empty, the other filled with four molecules (Fig. 3.1). After opening the valve between these bulbs, the molecules are redistributed and 16 possible molecular arrangements can be found (\(=2^{4}\); 2 bulbs and 4 molecules), each configuration being equally likely. The probability of the system returning to its original state is 1/16 \(({=}2^{ - 4})\). Now consider the case of one mole of gas in one of the bulbs and the other bulb again at vacuum. One mole of gas contains \(6.022 \times 10^{23}\) molecules, i.e., Avogadro’s number (NA). It will be clear that the probability for one mole of gas containing \( {\approx}10^{24} \) molecules to return to its original state in only one of the bulbs is extremely low \(( {\approx}2^{{ - 10^{24} }}\)). Accordingly, the number of molecular arrangements, or microstates W, increases during spontaneous changes. In other words, the randomness or disorder of the system increases in spontaneous processes. Boltzmann defined the entropy as follows:

$$ S = k_{B} \ln W. $$
(3.1)
Fig. 3.1

Possible distribution of four molecules before and after connection of two gas bulbs

This equation links entropy (S) with the Boltzmann constant (kB = 1.381 × 10−23 J K−1) and the number of microstates (W). The Boltzmann constant is related to the macroscopic gas constant (R) via Avogadro’s constant (NA):

$$ R = k_{B} \cdot N_{A} $$

This statistical thermodynamic view of entropy as a measure of randomness or disorder is instructive to explain some of its properties (e.g., its dependence on phase and temperature). Gas has more molecular randomness than liquids (because of the larger distance between molecules) and its entropy is therefore higher than that of liquids. Liquids allow movement of individual molecules and therefore have more randomness than solids, in which molecules are in fixed positions. Consequently, the entropy declines when liquids freeze to form a solid phase. The dissolution of a well-organized sodium chloride crystal in water leads to an increase in randomness because of the disruption of the crystal, but the hydration of the chloride and sodium ions somewhat lowers the randomness; the net result is that randomness, and thus entropy, increases, driving the dissolution of kitchen salt in water.

Lowering the temperature decreases the kinetic energy of molecules and atoms. When the absolute temperature of a system approaches 0 K, the motion of individual molecules approaches zero and there is only one microstate (W = 1). Accordingly, application of Boltzmann’s law (Eq. 3.1) then indicates that the entropy should be zero. This constraint is known as the third law of thermodynamics, which states that the entropy of a perfect crystal at zero kelvin is zero.
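The microstate counting of the two-bulb example and Boltzmann's formula (Eq. 3.1) can be checked with a short script. This is an illustrative sketch, not code from the text; the variable names are my own:

```python
import math

k_B = 1.381e-23   # Boltzmann constant, J K^-1
N_A = 6.022e23    # Avogadro's constant, mol^-1

# Four molecules over two bulbs: W = 2^4 = 16 arrangements,
# so the chance of finding all four back in one bulb is 2^-4 = 1/16.
W_four = 2 ** 4
p_return = 1 / W_four

# Boltzmann entropy of the expanded four-molecule state (Eq. 3.1)
S_four = k_B * math.log(W_four)

# For one mole the ratio of microstates after/before expansion is 2^N_A,
# so the molar entropy of expansion is N_A * k_B * ln 2 = R ln 2.
R = k_B * N_A
dS_molar = R * math.log(2)   # about 5.76 J K^-1 mol^-1
```

The per-mole result, R ln 2, illustrates how the astronomically large microstate count \(2^{N_A}\) collapses to a modest macroscopic entropy change through the logarithm.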

3 Macroscopic View of Entropy

The macroscopic definition is based on a detailed analysis of heat–work cycles and the efficiency of heat engines (the Carnot cycle). Carnot showed that a cyclic process consisting of isothermal and adiabatic changes of an ideal gas causes no change in (internal) energy, because the system returns to its initial conditions. Consider a heat engine with two reservoirs at temperatures Th (the source) and Tc (the sink), receiving heat qh and transferring heat qc, respectively (Fig. 3.2). Because Th is larger than Tc, we can extract work (w) from the heat engine.

Fig. 3.2

A hot reservoir at Th supplies 20 kJ of heat (qh) to a heat engine that performs 5 kJ of work (w); the remaining 15 kJ is lost as heat (qc) to the cold reservoir at Tc. The efficiency is the work done (w) divided by the heat supplied (qh)

Carnot showed that the maximum efficiency (e) of such a heat engine with fixed reservoir temperatures is

$$ e = \frac{w}{{q_{h} }} = \frac{{T_{h} - T_{c} }}{{T_{h} }} $$
(3.2)

The efficiency of an engine can thus only be 100% if \(T_{c}\) is at absolute zero. The Carnot, i.e., ideal, efficiency of an automobile engine with a \(T_{h}\) of 1100 K and an ambient temperature \(T_{c}\) of 295 K is about 73%; the average automobile engine is, however, much less efficient because of friction and other non-ideal losses.
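Equation (3.2) is straightforward to evaluate. The sketch below reproduces the automobile example and the engine of Fig. 3.2; the function name is my own:

```python
def carnot_efficiency(T_h, T_c):
    """Maximum (Carnot) efficiency of a heat engine operating
    between reservoirs at T_h and T_c (kelvin), Eq. (3.2)."""
    return (T_h - T_c) / T_h

# Automobile example from the text: T_h = 1100 K, T_c = 295 K
e_car = carnot_efficiency(1100.0, 295.0)   # about 0.73, i.e. 73%

# Fig. 3.2: 5 kJ of work extracted from 20 kJ of heat supplied
e_fig = 5.0 / 20.0                         # 0.25, i.e. 25%
```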

Conservation of energy implies that

$$ w + q_{h} + q_{c} = 0. $$
(3.3)

Isolating w from this equation (note that with the sign convention of Eq. (3.3) the work delivered by the engine is −w, so \(e = -w/q_{h}\)) and inserting the result in (3.2) then yields

$$ \frac{{q_{c} }}{{q_{h} }} = - \frac{{T_{c} }}{{T_{h} }}, $$
(3.4a)

or its equivalent

$$ \frac{{q_{h} }}{{T_{h} }} + \frac{{q_{c} }}{{T_{c} }} = 0. $$
(3.4b)

This implies that the ratio \(\frac{q}{T}\) behaves as a state function whose change over the Carnot cycle is zero.
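A quick numerical check of relations (3.4a) and (3.4b) for a reversible cycle; the reservoir temperatures and the heat absorbed are illustrative values, not taken from the text:

```python
# Reversible Carnot cycle between two assumed reservoir temperatures.
T_h, T_c = 400.0, 300.0   # K (illustrative)
q_h = 1000.0              # J absorbed from the hot reservoir (illustrative)

# Heat rejected to the cold reservoir, from Eq. (3.4a): q_c/q_h = -T_c/T_h
q_c = -q_h * T_c / T_h    # -750 J

# Eq. (3.4b): the reduced heats q/T sum to zero over the cycle
balance = q_h / T_h + q_c / T_c   # 0
```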

Following Clausius, the entropy change accompanying a reversible transfer of heat q at temperature T can then be defined as

$$ \Delta S = \frac{q}{T}. $$
(3.5)

Next, we consider the spontaneous transfer of heat q from a hot reservoir (\(T_{h}\)) to a cold reservoir (\(T_{c}\)). The entropy decrease of the hot reservoir \(\left( {\frac{q}{{T_{h} }}} \right)\) is smaller than the entropy increase of the cold reservoir \(\left( {\frac{q}{{T_{c} }}} \right),\) and thus the overall entropy change \(\Delta S\) is positive, as required by the second law (Fig. 3.3). Clausius’ inequality states that for any spontaneous process

$$ \Delta S > 0. $$
(3.6)
Fig. 3.3

Heat (q) spontaneously flows from a hot reservoir at temperature Th to a cold reservoir at temperature Tc. The total entropy change \(\Delta S\) is positive because the entropy decrease of the hot reservoir \(\left( {\frac{q}{{T_{h} }}} \right)\) is smaller than the entropy increase of the cold reservoir \(\left( {\frac{q}{{T_{c} }}} \right).\)
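The entropy bookkeeping of Fig. 3.3 can be sketched numerically; the temperatures and heat amount below are assumed values chosen for illustration:

```python
# Spontaneous heat flow from a hot to a cold reservoir (Fig. 3.3).
T_h, T_c = 500.0, 300.0   # K (assumed for illustration)
q = 1000.0                # J flowing from hot to cold (assumed)

dS_hot = -q / T_h             # hot reservoir loses entropy
dS_cold = q / T_c             # cold reservoir gains more entropy
dS_total = dS_hot + dS_cold   # positive, as Eq. (3.6) requires
```

Because \(T_c < T_h\), the gain q/Tc always exceeds the loss q/Th, so the total is positive for any q > 0.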

4 Entropy Changes During Reactions

Since entropy is a state function, the change in entropy (\(\Delta_{r} S)\) is the difference between the entropy of products and reactants:

$$ \Delta_{r} S = S_{products} - S_{reactants} $$

Following the same approach as for enthalpy changes, we can obtain the entropy change (\(\Delta_{r} S^{0}\)) for the reaction of α moles of substance A and β moles of substance B to form γ moles of substance C and δ moles of substance D

$$ \alpha A + \beta B \to \gamma C + \delta D $$

from the standard molar entropies at SATP (\(S^{0}\)) as follows

$$ \Delta_{r} S^{0} = \left[ {\gamma \times S^{0 } \left( C \right) + \delta \times S^{0 } \left( D \right)} \right] - \left[ {\alpha \times S^{0 } \left( A \right) + \beta \times S^{0 } \left( B \right)} \right] $$
(3.7)

or more generalized:

$$ \Delta_{r} S^{0} = \sum v_{i} S^{0 } \left( {products} \right) - \sum v_{i} S^{0 } \left( {reactants } \right) $$
(3.8)

The standard molar entropy, \(S^{0 } , \) of a compound is the molar entropy of the substance in its standard state at 25 °C and 1 bar pressure. These values are tabulated at SATP conditions and are normally expressed in J K−1 mol−1 (rather than kJ mol−1, as enthalpies are). Moreover, they are absolute values because the third law sets the entropy of a perfect crystal to zero at 0 K.

Consider the following reaction (Table 3.1):

$$ 2H_{2} \left( g \right) + O_{2} \left( g \right) \to 2H_{2} O\left( l \right) $$
(3.9)
Table 3.1 Thermodynamic data for hydrogen and oxygen gas and liquid water

Substance   ΔfH0 (kJ mol−1)   S0 (J K−1 mol−1)
H2 (g)      0                 130.68
O2 (g)      0                 205.14
H2O (l)     −285.83           69.91

The entropy change of the system (=reaction) is calculated,

$$ \begin{aligned}\Delta_{r} S^{0} &= \left[ {2 \times S^{0 } \left( {H_{2} O \left( l \right)} \right)} \right] - \left[ {2 \times S^{0 } \left( {H_{2} \left( g \right)} \right) + 1 \times S^{0 } \left( {O_{2} \left( g \right)} \right)} \right] \\&= - 326.68 \,{\text{J K}}^{ - 1}\end{aligned} $$

There is a loss of entropy because three moles of gas (disordered) are turned into two moles of liquid water (fewer molecules and less disorder). Taken at face value, this negative entropy change of the system (\(\Delta S_{Sys}\)) suggests that water should not be stable and that the reverse reaction, transforming liquid water into hydrogen and oxygen gas, should be spontaneous. This is incorrect. The issue is that the second law requires spontaneous reactions to increase the total entropy of the universe, not just that of the system. To resolve this, we calculate the entropy change of the surroundings (\(\Delta S_{Surr}\)), realizing that the enthalpy change of the surroundings must be the negative of that of the system (\(\Delta_{r} H^{0} = \Delta H_{Sys}^{o} \)):

$$ \Delta H_{surr}^{o} = - \Delta H_{Sys}^{o} $$
(3.10)

Using the above thermodynamic data and Eq. (2.18),

$$ \Delta_{r} H^{0} = 2\left( { - 285.83} \right) - \left( {2 \times 0 + 1 \times 0} \right) = - 571.66\,{\text{kJ}} $$

For constant P, we can use Eq. (3.5) (the entropy definition):

$$ \Delta S_{surr}^{0} = \frac{q}{T} = \frac{{\Delta H_{surr}^{o} }}{T} = - \frac{{\Delta H_{Sys}^{o} }}{T} = \frac{{571.66 \times 1000}}{{298}} = 1918.3\;{\text{J K}}^{ - 1} $$

The total entropy change (\(\Delta S_{Tot}\)) is thus

$$ \Delta S^{0}_{Tot} = \Delta S^{0}_{Sys} + \Delta S^{0}_{Surr} = - 326.68 + 1918.3 = + 1591.6\,{\text{ J K}}^{ - 1} $$
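The whole worked example can be reproduced in a few lines. The standard molar entropies and the formation enthalpy of liquid water used below are commonly tabulated standard-state values, consistent with the results quoted above:

```python
# Reproduce the worked example for 2 H2(g) + O2(g) -> 2 H2O(l).
# S0 in J K^-1 mol^-1; dHf in kJ mol^-1 (commonly tabulated values).
S0 = {"H2(g)": 130.68, "O2(g)": 205.14, "H2O(l)": 69.91}
dHf_H2O = -285.83   # kJ mol^-1; elements in their standard state have dHf = 0

T = 298.0   # K

# Eq. (3.8): entropy change of the system (the reaction)
dS_sys = 2 * S0["H2O(l)"] - (2 * S0["H2(g)"] + 1 * S0["O2(g)"])  # -326.68 J K^-1

# Eqs. (3.10) and (3.5): entropy change of the surroundings
dH_sys = 2 * dHf_H2O                 # -571.66 kJ
dS_surr = -dH_sys * 1000 / T         # +1918.3 J K^-1

# Eq. (3.11): the total entropy change decides spontaneity
dS_tot = dS_sys + dS_surr            # positive, so the reaction is spontaneous
```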

In conclusion, for a spontaneous process

$$ \Delta S_{Tot} = \Delta S_{Sys} + \Delta S_{Surr} > 0, $$
(3.11)

consistent with the 2nd-law statement at the end of Sect. 3.1: spontaneous processes are those which increase the entropy of the universe. The entropy change of the system may be negative, but the total entropy change must be positive for any spontaneous process. This leads to a few simple statements:

$$ \Delta S_{Tot} > 0\quad{\text{for a spontaneous process}} $$
(3.12a)
$$ \Delta S_{Tot} < 0\quad{\text{for a non-spontaneous process}} $$
(3.12b)
$$ \Delta S_{Tot} = 0\quad{\text{for a system at equilibrium}} $$
(3.12c)

Finally, any process taking place in the universe is spontaneous and leads to an increase of \(S_{Tot}\). Consequently, the \(\Delta S_{Tot} > 0\) criterion implies a unique direction of time and underlies the maximum entropy production concept used in Earth System and Ecosystem sciences.