
Let's take an ideal gas as an example. Usually, the gas is parameterized in two different ways: microscopically, by the position and momentum of each of its particles (the kinetic-molecular model), and macroscopically, by its volume $V$, pressure $p$ and temperature $T$. In this case, a microstate is any particular set of values for the positions and momenta of all gas particles, and a macrostate is any $3$-tuple of values for $p$, $V$ and $T$. The entropy of the gas is defined as the Boltzmann constant times the logarithm of the multiplicity $\Omega$ of its macrostate, where the multiplicity of a macrostate is the number of microstates that correspond to it. In other words, $\Omega$ of a particular macrostate $(p,V,T)$ is the number of tuples $(\vec{r}_1,\vec{p}_1,\vec{r}_2,\vec{p}_2,\cdots)$, where $\vec{r}_k$ and $\vec{p}_k$ are the position and momentum of the $k$-th gas particle, that result in those particular values of $p$, $V$ and $T$ in the macrostate.
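To make this counting concrete, here is a minimal sketch (my own toy illustration, not part of the original setup) that replaces the gas with $N$ two-level particles, takes the macrostate to be the number $n$ of particles in the upper level, and computes $S = k_B \ln \Omega$ with multiplicity $\Omega = \binom{N}{n}$:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def multiplicity(N, n):
    """Number of microstates with exactly n of N two-level particles up."""
    return math.comb(N, n)

def boltzmann_entropy(omega):
    """S = k_B * ln(Omega)."""
    return k_B * math.log(omega)

N = 100
for n in (0, 25, 50):
    omega = multiplicity(N, n)
    print(f"n = {n:3d}: Omega = {omega:.3e}, S = {boltzmann_entropy(omega):.3e} J/K")
```

The $n = 50$ macrostate has by far the largest multiplicity, which is why a system left to itself is overwhelmingly likely to be found near it.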

However, the gas does not know and does not care which of its properties we choose to examine - it is still the same physical system whether we study its pressure, volume and temperature, the positions and momenta of its particles, or its color, density and electrical conductivity. For example, suppose we kept the kinetic-molecular parameterization but, instead of looking at $(p, V, T)$, we looked at the average speed $v$ of the gas particles and the average distance $d$ between them. A microstate in this case would be the same as before, but a macrostate would now be a $2$-tuple $(v,d)$. Just as before, multiple microstates $(\vec{r}_1,\vec{p}_1,\vec{r}_2,\vec{p}_2,\cdots)$ can result in the same macrostate $(v,d)$, so every macrostate again has a particular multiplicity. But now we have a different type of entropy, do we not?
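Here is a sketch of that point (again my own illustration, with two arbitrarily chosen macrovariables): coarse-graining the same set of spin microstates in two different ways assigns one and the same microstate two different multiplicities, hence two different entropies:

```python
import math
from collections import Counter
from itertools import product

k_B = 1.380649e-23  # J/K
N = 12

microstates = list(product((0, 1), repeat=N))  # all 2^N spin configurations

# Coarse-graining A: macrostate = total number of up spins.
macro_A = Counter(sum(s) for s in microstates)

# Coarse-graining B: macrostate = number of "domain walls"
# (adjacent pairs that differ) -- a different choice of what to observe.
walls = lambda s: sum(s[i] != s[i + 1] for i in range(N - 1))
macro_B = Counter(walls(s) for s in microstates)

s = microstates[5]  # one and the same microstate
omega_A, omega_B = macro_A[sum(s)], macro_B[walls(s)]
print(f"S_A = {k_B * math.log(omega_A):.3e} J/K")
print(f"S_B = {k_B * math.log(omega_B):.3e} J/K")
```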

My conclusion is then that entropy is not a property of physical systems themselves, but rather a property of the way we choose to study them. But then entropy is not as fundamental a concept as it is usually made out to be - it is simply a mathematical consequence of a choice we make, i.e. which parameterizations of the system we will compare. So why do we bother studying entropy if it is a purely mathematical phenomenon?

4 Answers


My conclusion is then that entropy is not a property of physical systems themselves, but rather a property of the way we choose to study them.

Almost correct. However, some of these ways of studying a system are well established and well defined, and then a particular well-defined function can become widely accepted and (to some) interesting.

For example, in thermodynamics, in some practical cases, we can define what we mean by work and heat quite well. For a continuous change of macrostate from some reference state $\mathbf X_0$ to a final state $\mathbf X$, if this change can be made reversibly (no violent processes allowed) over some path $\gamma$ in the space of macrostates joining $\mathbf X_0$ to $\mathbf X$, we can define a function of the final state $\mathbf X$: $$ S(\mathbf X) = \int_{\gamma}\frac{dU - dW}{T}, $$ where $dW$ is the work done on the system, so the integrand is just $dQ/T$. It turns out that, provided the 2nd law holds, this quantity has interesting properties, such as being a function of the macrostates $\mathbf X_0, \mathbf X$ only (it does not depend on the details of the path $\gamma$). So it is a function of the state, independent of the path we used to get there, just like the internal energy $U$ is. Clausius and others used this to formulate the laws of thermodynamics: reversible processes in closed, thermally insulated systems that are allowed to exchange work are characterized by entropy remaining constant, and irreversible processes in such systems are characterized by entropy increasing.
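As a numeric illustration of this path independence (my own sketch, for one mole of a monatomic ideal gas rather than the general system discussed above), one can integrate $dQ/T$ along two different reversible paths in the $(T,V)$ plane and compare:

```python
import numpy as np

R = 8.314              # gas constant, J/(mol K)
n, Cv = 1.0, 1.5 * R   # one mole of a monatomic ideal gas
T1, V1 = 300.0, 0.010  # initial state (K, m^3)
T2, V2 = 600.0, 0.030  # final state

def trapint(y, x):
    """Trapezoidal rule for the integral of y dx."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def delta_S(path, steps=200_000):
    """Integrate dQ/T = n*Cv*dT/T + n*R*dV/V along a parametrized path."""
    t = np.linspace(0.0, 1.0, steps)
    T, V = path(t)
    return trapint(n * Cv / T, T) + trapint(n * R / V, V)

straight = lambda t: (T1 + (T2 - T1) * t, V1 + (V2 - V1) * t)
crooked  = lambda t: (T1 * (T2 / T1) ** t, V1 + (V2 - V1) * t**3)

print(delta_S(straight), delta_S(crooked))                # agree to numerical precision
print(n * Cv * np.log(T2 / T1) + n * R * np.log(V2 / V1)) # endpoint-only formula
```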

The set of macroscopic variables $\mathbf X$ is what we choose to describe the system with. For a homogeneous gas, we can use $U,V$, but if we have a reason to include an external magnetic field $B_{ext}$, we can. If, in the future, we discover a new way to interact with the gas state, e.g. via some new field $G$, then the entropy function will change to become a function of $G$ as well.

But then entropy is not as fundamental a concept as it is usually made out to be - it is simply a mathematical consequence of a choice we make, i.e. which parameterizations of the system we will compare. So why do we bother studying entropy if it is a purely mathematical phenomenon?

Yes, the arguments of the entropy function and its value depend on the choice of variables. However, in the case of the Clausius entropy, changes of this entropy between two states are given by the integral of $dQ/T$, which is a definite, measurable quantity, so changes of this entropy do not depend on the choice of variables $\mathbf X$.
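For instance, for $n$ moles of an ideal gas (a standard textbook case, not specific to this answer), the integral can be evaluated explicitly: with $dU = nC_V\,dT$, the work done on the gas $dW = -p\,dV$, and $p = nRT/V$, $$\Delta S = \int_\gamma \frac{dU + p\,dV}{T} = nC_V\ln\frac{T_2}{T_1} + nR\ln\frac{V_2}{V_1},$$ which manifestly depends only on the end states $(T_1,V_1)$ and $(T_2,V_2)$, not on the path $\gamma$.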

I think we study it largely for historical reasons: because it was (and is) hard to understand, and because it was written about incorrectly by some authors, it provoked people to write about it, and then other people wrote about why that was wrong. So it populated scientific articles and textbooks. Clausius invented thermodynamic entropy and formulated thermodynamics with it, and then people started to analyze what it means (and they do not seem to be done, especially when its connection to statistical physics is studied, or when deciding how the concept should be taught). This created a lot of confusion, even resistance. I remember reading about a scientist who rejected entropy as a concept to use when formulating thermodynamics, and formulated everything without it. This is possible, but it did not catch on; in a sense, entropy has won societally. It's like a scientific enigma to be decrypted: people accept the trouble with it because they think some big insight or development of our understanding is possible. Clausius and others started this mysticism with statements like "the energy of the universe is constant, the entropy of the universe only ever increases", or something like that, which I think is not scientific - more like a catchy slogan to promote one's ideas, or maybe a first instance of buzzword-fueled marketing.

With the development of statistical physics and the theory of communication (information theory), people introduced new and different concepts and formulae with behavior similar to that of the Clausius entropy (Boltzmann, Gibbs), and later entirely different people named those entropy too, even though the concepts are different (Boltzmann entropy, Gibbs entropy). This definitely confused, and continues to confuse, a lot of people.

Today, we also have the Shannon entropy and, in quantum theory, the von Neumann entropy, and there are even more modern contributions, such as the Kolmogorov entropy or the Rényi entropy.

All of these can be indicted for some anthropomorphism, or dependence on some human choice: we choose the variables we consider a "practical simplified description" of the physical state - either real physical quantities such as $U,V$, or probabilities of microstates $p_i$ - and then the various entropies are functions of these or other preferred variables. We could have chosen the variables differently, but if there is no good reason to, we use the smallest number of the most plausible variables possible, which (hopefully) makes everyone agree on the one entropy function to be used. Sometimes this fails, or people are unclear about which entropy function they are talking about, or use it as if it did not depend on those choices (Landauer), and sometimes other people take issue with that.
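To see how two of these entropies are simply different functions of the same chosen variables (here, microstate probabilities $p_i$), a small sketch of my own:

```python
import numpy as np

def shannon(p):
    """Shannon/Gibbs entropy -sum p_i ln p_i (in nats)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * ln 0 = 0
    return -np.sum(p * np.log(p))

def renyi(p, alpha):
    """Renyi entropy ln(sum p_i^alpha) / (1 - alpha)."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

p = [0.5, 0.25, 0.125, 0.125]
print(shannon(p))           # about 1.213 nats
print(renyi(p, alpha=2.0))  # about 1.068 nats -- a different number
```

Same distribution, two different "entropies"; which one is relevant depends on what we want to do with it.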


This is to answer your last question specifically:

So why do we bother studying entropy if it is a purely mathematical phenomenon?

Numbers aren't real.

All models are wrong. Some are more useful than others.

The reality of something is of less consequence than its usefulness.

There is mystery involved with entropy and it is useful. Whether it is real or a consequence of other phenomena matters less than being able to use it and expanding our ability to use it.

Physics isn't reality. It is a description of reality. The model chosen sets the context of that description, focusing on some aspects while ignoring others.

As a purely mathematical construct, the study of prime numbers was once thought to be a dead end. Now it is a foundation of modern cryptography.

David S

However, the gas does not know and does not care which of its properties we choose to examine - it is still the same physical system whether we study its pressure, volume and temperature, the positions and momenta of its particles, or its color, density and electrical conductivity.
[...]
My conclusion is then that entropy is not a property of physical systems themselves, but rather a property of the way we choose to study them. But then entropy is not as fundamental a concept as it is usually made out to be - it is simply a mathematical consequence of a choice we make, i.e. which parameterizations of the system we will compare. So why do we bother studying entropy if it is a purely mathematical phenomenon?

The world (or the gas in this case) indeed knows nothing about how we describe and study it. This is true not only of entropy, but also of energy, temperature, and other properties. Even the position of an object is but our way of describing its behavior/state, not its intrinsic property. This is the difference between the actual physical world and a theory describing the physical world.

Having said that, entropy was introduced by Clausius and Gibbs to describe the actually observed tendency of physical systems to evolve in a certain direction despite the reversibility of the microscopic laws of motion. Later theories tried to explain this from the microscopic point of view and thus introduced various definitions of entropy as a measure of statistical distribution, chaos, etc. Regardless of whether we understand it or not, physical systems keep evolving in the direction of increasing entropy - a gas let out of a bottle escapes, a fruit left out for a long time rots, an egg breaks... and they never return to their initial state.

Roger V.

Regarding your examples: a $3$-tuple $(p,V,T)$ is "over-counting" the number of independent variables in a single-component, closed, single-phase system (like the ideal gas assumed here). You only need two degrees of freedom, for example $(p,T)$. For constant $N$, the third state variable is known from the other two. This is just the Gibbs phase rule. The $2$-tuple $(v,d)$ can then be used to find $(p,T)$ using the well-known relations of kinetic theory. Notice that the average separation $d$ matters because we are assuming that the volume is not fixed, so that there are two degrees of freedom. For a system with fixed $N$ and fixed $V$ you only need one variable: either $p$ or $T$, or the average speed $v$. The point is: the two tuples carry the same information, up to constants.
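A sketch of the kinetic-theory conversion alluded to here (my own illustration; I take $v$ to be the rms speed and define the mean separation through $V = N d^3$, both of which are modelling assumptions):

```python
k_B = 1.380649e-23  # Boltzmann constant, J/K
m_He = 6.646e-27    # mass of a helium atom, kg

def p_T_from_v_d(v_rms, d, m):
    T = m * v_rms**2 / (3.0 * k_B)  # from (1/2) m <v^2> = (3/2) k_B T
    n_dens = 1.0 / d**3             # number density from mean separation
    p = n_dens * k_B * T            # ideal-gas law p = n k_B T
    return p, T

p, T = p_T_from_v_d(v_rms=1360.0, d=3.4e-9, m=m_He)
print(f"p = {p:.3e} Pa, T = {T:.1f} K")  # roughly atmospheric conditions
```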

The fact that they carry the same information means that the number of microstates $\Omega$ you count in either case will be the same, so the resulting entropy is the same whether you measure $(p,T)$ or $(v,d)$. Ultimately, this is because the macroscopic ideal gas and the "kinetic theory" ideal gas are the same model, so they should give the same results.

If you modeled the same physical system using both the ideal gas model and, say, the van der Waals equation, then you would predict different entropies, because you are using two different models. In that sense, entropy is subjective.

Moreover, there are questions regarding how exactly one can correctly count the number of possible microstates for a given macrostate. Basically, it amounts to dividing phase space correctly, which is done using quantum mechanics.
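One standard outcome of doing this counting (dividing phase space into cells of volume $h^3$ per particle and dividing by $N!$ for indistinguishability; a textbook result, not specific to this answer) is the Sackur-Tetrode entropy of a monatomic ideal gas, which can be checked against tabulated values:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
h = 6.62607015e-34   # Planck constant, J s
N_A = 6.02214076e23  # Avogadro constant, 1/mol

def sackur_tetrode(N, V, T, m):
    """S = N k_B [ln(V / (N lambda^3)) + 5/2], lambda = thermal de Broglie wavelength."""
    lam = h / math.sqrt(2.0 * math.pi * m * k_B * T)
    return N * k_B * (math.log(V / (N * lam**3)) + 2.5)

m_He = 6.646e-27  # mass of a helium atom, kg
S = sackur_tetrode(N=N_A, V=0.0248, T=298.0, m=m_He)  # ~1 mol of He at ~1 bar
print(f"S = {S:.1f} J/K per mole")  # about 126 J/(mol K), close to tabulated values
```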

agaminon