Entropy

Entropy is one of the most misunderstood concepts in physics: it is usually associated with disorder or ignorance, and it is often confused with distinct concepts that share the same name, such as the Jaynes entropy.

THERMODYNAMIC ENTROPY

The concept of entropy was introduced into physics by Clausius as part of the development of thermodynamics. Its physical meaning is often omitted from textbooks, but we can say it is a measure of the amount of thermal energy per unit of temperature. If we remove the mechanical, chemical, electrical, magnetic, and other non-thermal components from the internal energy of a system, and divide the remaining thermal part by the temperature of the system, we get its entropy \( S \)

\[ S = \frac{ U + pV - \sum_i \mu_i N_i }{ T } + S_0 \]

This is the Euler form and I have added a constant \( S_0 \), which is usually set to zero, because most thermodynamic studies only deal with processes and \( \Delta S \). As you can see, entropy is a physical quantity that depends on physical quantities of the system: internal energy \( U \), volume \( V \), pressure \( p \), temperature \( T \), composition \( N_i \), and others.
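As a sanity check of the Euler form, here is a minimal numerical sketch of my own, assuming a monatomic ideal gas with \( S_0 = 0 \) and the standard ideal-gas chemical potential \( \mu = kT \ln(n \lambda^3) \), where \( n = N/V \) is the number density and \( \lambda \) the thermal de Broglie wavelength; under these assumptions the Euler form reproduces the Sackur-Tetrode entropy:

```python
# Sketch: evaluate the Euler form S = (U + pV - mu*N)/T for a
# monatomic ideal gas and compare with the Sackur-Tetrode formula.
# Assumptions (not from the text): mu = k*T*ln(n*lambda^3), S_0 = 0.
import math

k = 1.380649e-23    # Boltzmann constant (J/K)
h = 6.62607015e-34  # Planck constant (J*s)

def euler_entropy(N, T, V, m):
    """Entropy from the Euler form for N atoms of mass m in volume V at T."""
    U = 1.5 * N * k * T                           # internal energy
    p = N * k * T / V                             # ideal-gas pressure
    lam = h / math.sqrt(2 * math.pi * m * k * T)  # thermal wavelength
    mu = k * T * math.log((N / V) * lam**3)       # chemical potential
    return (U + p * V - mu * N) / T               # Euler form with S_0 = 0

def sackur_tetrode(N, T, V, m):
    """Sackur-Tetrode entropy of a monatomic ideal gas."""
    lam = h / math.sqrt(2 * math.pi * m * k * T)
    return N * k * (2.5 - math.log((N / V) * lam**3))

# One mole of argon at 300 K and 1 atm: both give about 155 J/K,
# close to the tabulated standard molar entropy of argon.
N = 6.02214076e23
T = 300.0
V = N * k * T / 101325.0
m = 39.948 * 1.66053907e-27  # argon atomic mass (kg)
print(euler_entropy(N, T, V, m), sackur_tetrode(N, T, V, m))
```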

IGNORANCE

Entropy is not a measure of ignorance about the microscopic state. Entropy is a physical quantity, so its value is independent of us and of our knowledge. We often hear that the entropy is higher when we know fewer details about the system, but this is not true. The thermodynamic entropy has the same value when we only know the macroscopic quantities in the Euler expression as when we know the detailed microscopic state. In fact, the entropy can be calculated from the microscopic state using molecular dynamics methods, and its value is not zero.

The misconception that entropy is a measure of our ignorance is very common, and you can find Nobel laureates such as Murray Gell-Mann who believe that observers tracking all the microscopic details of the Universe would assign a zero entropy to it and, as a consequence, would conclude that the Universe is reversible. This is absurd! Goldstein, Lebowitz, Tumulka, and Zanghì have recently published an article in which they try to correct this common misconception. They write: "entropy has nothing to do with the knowledge of observers" [1]. They also provide some basic examples showing how this subjective concept of entropy disagrees with the thermodynamic entropy:

The first problem is that in some situations, the subjective entropy does not appropriately reproduce the thermodynamic entropy. For example, suppose an isolated room contains a battery-powered heater, and we do not know whether it is on or off. If it is on, then after ten minutes the air will be hot, the battery empty, and the entropy of the room has a high value \( S_3 \). Not so if the heater is off; then the entropy has the low initial value \( S_1 < S_3 \). In view of our ignorance, we may attribute a subjective probability of 50 percent to each of "on" and "off". After ten minutes, our subjective distribution \( \rho \) over phase space will be spread over two regions with macroscopically different phase points, and its Gibbs entropy \( S(\rho) \) will have a value \( S_2 \) between \( S_1 \) and \( S_3 \) (in fact, slightly above the average of \( S_1 \) and \( S_3 \)). But the correct thermodynamic value is not \( S_2 \), it is either \( S_1 \) (if the heater was off) or \( S_3 \) (if the heater was on). So subjective entropy yields the wrong value.
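The parenthetical remark "slightly above the average" can be made precise (a quick check of my own, not part of the quoted passage). If \( \rho = \frac{1}{2}\rho_1 + \frac{1}{2}\rho_3 \) and the two macroscopically different regions of phase space do not overlap, the Gibbs entropy of the mixture is

\[ S_2 = -k \int \rho \ln(\rho) = \frac{1}{2} S_1 + \frac{1}{2} S_3 + k \ln(2) \]

which exceeds the average of \( S_1 \) and \( S_3 \) by the tiny mixing term \( k \ln(2) \), and in particular never equals either of the two correct thermodynamic values.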

DISORDER

Entropy is not a measure of disorder. The Euler expression shows that the entropy of an ordered room is the same as that of a disordered room. The same thing happens with a collection of coins. The ordered configuration [head][head][head][tail][tail][tail] has the same entropy as [head][tail][head][tail][tail][head]. The thermal energy and the temperature are the same for both the ordered and the disordered configurations, so the value of the entropy cannot be different. Dan Styer traces the origin of the association between entropy and disorder to early work by Boltzmann and Helmholtz [2].
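If one insists on counting states instead, the conclusion is unchanged: both sequences realize the same macrostate of three heads and three tails, hence the same state count \( W \). A minimal sketch of this counting argument (my illustration, not Styer's):

```python
# Sketch: two coin sequences in the same macrostate (same number of
# heads) share the same state count W, hence the same S = k*ln(W).
import math
from math import comb

k = 1.380649e-23  # Boltzmann constant (J/K)

ordered    = ["head", "head", "head", "tail", "tail", "tail"]
disordered = ["head", "tail", "head", "tail", "tail", "head"]

def entropy_of_macrostate(seq):
    """S = k*ln(W), with W the number of sequences sharing the head count."""
    W = comb(len(seq), seq.count("head"))
    return k * math.log(W)

# Both configurations give k*ln(20); 'disorder' plays no role.
print(entropy_of_macrostate(ordered) == entropy_of_macrostate(disordered))
```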

BOLTZMANN AND GIBBS ENTROPIES

The statistical expression \( S = k \ln(W) \) is often regarded as the fundamental microscopic definition of entropy. It is neither fundamental nor microscopic. It is not fundamental because the expression (first obtained by Boltzmann) is only valid for the microcanonical ensemble, not for arbitrary ensembles: the Euler expression for the entropy applies equally to a thermodynamic system interacting with a heat bath, but the Boltzmann expression does not.

The Boltzmann expression is not microscopic. The distinction between microscopic and macroscopic quantities is very clear: the macroscopic quantities are obtained by averaging the microscopic ones

\[ p = \langle p^\text{micro} \rangle, \qquad T = \langle T^\text{micro} \rangle, \qquad U = \langle U^\text{micro} \rangle \]

but the macroscopic entropy is not obtained by averaging any microscopic counterpart: \( k \ln(W) \) is already a macroscopic value, since \( W \) counts the microstates compatible with a given macrostate. The Boltzmann entropy is a macroscopic quantity.

\( S = k \ln(W) \) can be derived from the Gibbs expression \( S = -k \int \rho \ln(\rho) \), when the latter is applied to the microcanonical ensemble. The Gibbs expression is more general because it is valid for the microcanonical, canonical, and grand canonical ensembles, but it cannot be considered a fundamental and microscopic definition of entropy either. The Gibbs expression has its own shortcomings, including that its value is constant in time for an isolated system, which implies that it cannot describe irreversibility. Multiple approaches have been tried to reconcile the Gibbs expression with the second law of thermodynamics, but all of them have flaws.
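The derivation is a one-line computation (standard material, spelled out here for completeness): in the microcanonical ensemble the distribution is uniform over the \( W \) accessible microstates, so writing the Gibbs integral as a sum with \( \rho_i = 1/W \) gives

\[ S = -k \sum_{i=1}^{W} \frac{1}{W} \ln\!\left(\frac{1}{W}\right) = k \ln(W) \]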

BLACK HOLE ENTROPY

Black holes do not really exist, as even Hawking finally admitted some years before passing away, but the problem with black hole entropy is not that there are no spacetime singularities in nature.

The problem is that this entropy was derived from faulty analogies with thermodynamic quantities. Bekenstein and Hawking confused the rest energy \( mc^2 \) of relativity with the internal energy \( U \) used in thermodynamics. Their second error was the interpretation of \( \hbar c^3 / (8 \pi k G M) \) as the temperature of the black hole. As a consequence, the Bekenstein-Hawking entropy does not have the basic properties that characterize a thermodynamic entropy, including its role with respect to stability. The Bekenstein-Hawking entropy cannot produce thermal stability: any perturbation, no matter how small, will drive the system away from thermodynamic equilibrium, rendering any attempt to measure the hypothetical temperature of the nonexistent black hole useless.
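The stability claim can be verified directly (a short computation of my own, taking the Bekenstein-Hawking assignments at face value). With \( E = Mc^2 \) and \( T = \hbar c^3 / (8 \pi k G M) \), the heat capacity of the hypothetical black hole is

\[ C = \frac{dE}{dT} = \frac{dE/dM}{dT/dM} = \frac{c^2}{-\hbar c^3 / (8 \pi k G M^2)} = -\frac{8 \pi k G M^2}{\hbar c} < 0 \]

A negative heat capacity is incompatible with thermal equilibrium in contact with a bath: an object that gets hotter as it loses energy amplifies any temperature fluctuation instead of damping it.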

THE OTHER ENTROPIES

There are dozens of different definitions of entropy: Kolmogorov and Sinai, Shannon, Rényi, Tsallis, Aczél and Daróczy, Varma, Kapur, Havrda and Charvát, Belis and Guiasu, Rathie, Arimoto, Sharma and Taneja, Picard,...

The term entropy is one of the most abused concepts in physics, with people using it for invented quantities with little or no connection to the original concept introduced by Clausius. The origin of this attitude probably goes back to the anecdote about von Neumann and Shannon. Myron Tribus relates this anecdote in the following terms:

The same function appears in statistical mechanics and, on the advice of John von Neumann, Claude Shannon called it 'entropy'. I talked with Dr. Shannon once about this, asking him why he had called his function by a name that was already in use in another field. I said that it was bound to cause some confusion between the theory of information and thermodynamics. He said that Von Neumann had told him: 'No one really understands entropy. Therefore, if you know what you mean by it and you use it when you are in an argument, you will win every time.'

Many of those new 'entropies' are faulty. For example, the Tsallis entropy has been debunked many times (see for example [3]), but some physicists continue to promote it and some even claim this 'entropy' has been experimentally verified. It has not been.

NOTES

  1. Goldstein, Sheldon; Lebowitz, Joel L.; Tumulka, Roderich; Zanghì, Nino. Gibbs and Boltzmann Entropy in Classical and Quantum Mechanics. In: Allori, Valia (ed.), Statistical Mechanics: Scientific Explanation and Determinism, Indeterminism and Laws of Nature. World Scientific Publishing Co. Pte. Ltd., Singapore, 2020.
  2. Styer, Dan. Entropy as Disorder: History of a Misconception. Phys. Teach. 57, 454–458, 2019.
  3. Nauenberg, Michael. Critique of q-entropy for thermal statistics. Phys. Rev. E 67(3), 036114, 2003.