I am trying to learn what entropy actually is, and I read this answer about how entropy is the information needed to specify a full quantum state. However, if entropy is just information, why does it have units related to energy and temperature? So I was wondering if someone could explain how information can have these units.
2 Answers
Historically, entropy was defined in terms of heat transfer and temperature; the connection between entropy and disorder (or information) came much later. You can, in fact, define entropy to be a dimensionless quantity, and many people do, but the historical units are still generally used in thermodynamic calculations.
The First Law of Thermodynamics, which is just conservation of energy, states that the internal energy change of a system is related to the mechanical work done and the heat exchanged: $dU=dW+dQ$. The work can be easily understood in terms of the Work-Energy Theorem, with $dW=-p\,dV$. However, the heat is much more mysterious. The existence of heat has been known forever, but its relationship to mechanical energy was very tricky to work out. Joule got his name attached to the SI unit of energy by demonstrating what the First Law states—that heat and work are ultimately equivalent sources of energy.
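To make the sign conventions concrete, here is a minimal numerical sketch (my own illustration with arbitrary values for the gas amount, temperature, and volumes, not part of the original argument): it integrates $dW=-p\,dV$ along a reversible isothermal expansion of an ideal gas, a path along which $dU=0$, so the First Law forces $Q=-W$.

```python
import numpy as np

# Illustrative sketch: reversible isothermal expansion of 1 mol of an ideal gas
# at T = 300 K. Along an ideal-gas isotherm dU = 0, so dU = dW + dQ gives Q = -W.
R = 8.314          # gas constant, J/(mol K)
n, T = 1.0, 300.0  # amount of gas (mol) and temperature (K), chosen for illustration

V = np.linspace(1.0e-3, 2.0e-3, 100_000)  # volume doubles, in m^3
p = n * R * T / V                          # ideal-gas equation of state

# W = -integral of p dV (work done ON the gas), evaluated with the trapezoid rule
W = -np.sum(0.5 * (p[1:] + p[:-1]) * np.diff(V))
Q = -W                                     # heat absorbed by the gas

print(f"W = {W:.1f} J, Q = {Q:.1f} J")               # W ~ -1729 J, Q ~ +1729 J
print(f"n R T ln 2 = {n * R * T * np.log(2):.1f} J")  # analytic check
```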
Despite the heat transfer being written "$dQ$", there is no state function $Q$, just as there is no state function $W$ meaning work. $dW$ and $dQ$ are only meaningful as energy transfers, not as internal energy contributions themselves. However, we know that by dividing $dW$ by the pressure, we get the differential of something (the volume) that really is just a function of a system's state: $-dV=dW/p$. Integrating this gives the change in volume during a process as $\Delta V=-\int dW/p$.

Analogously, it was realized that there was another state function $S$, whose change was the integral of $dQ/T$. The quantity $\Delta S=\int dQ/T$ was named the entropy, and it was found empirically (using the equations of state for fluids and other substances) that $S$ really was a state function, meaning that its value depends only on the current state of a system, not on its history: however heat and work were added to bring the system to volume $V$ and temperature $T$, its entropy is $S(V,T)$ regardless. According to its definition, $S$ has units of energy divided by temperature (J/K in SI).

Even more important than $S$ being a state function, $S$ was found to be related to reversibility. It was realized that the Second Law of Thermodynamics is equivalent to the statement that, for a complete, closed system, $\Delta S=0$ for a reversible process and $\Delta S>0$ for an irreversible process. (For no process does a closed system have negative $\Delta S$.)
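As a worked example of how the units arise (a standard textbook calculation, not something computed in the original answer): for a reversible isothermal expansion of $n$ moles of an ideal gas from $V_i$ to $V_f$ at temperature $T$, the heat absorbed is $Q=nRT\ln(V_f/V_i)$, so the definition gives

$$\Delta S=\int\frac{dQ}{T}=\frac{Q}{T}=nR\ln\frac{V_f}{V_i}\approx(1\ \mathrm{mol})\,(8.314\ \mathrm{J\,mol^{-1}\,K^{-1}})\ln 2\approx 5.8\ \mathrm{J/K}$$

for $n=1$ mol and $V_f=2V_i$. The J/K units appear automatically from dividing an energy by a temperature.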
In the thermodynamic analyses just described, $S$ was treated as a strictly abstract quantity. Unlike pressure, volume, and temperature, it was not something that could be directly measured about a system. However, it was incredibly useful, because of its connection to the First, Second (and Third, which I haven't mentioned) Laws of Thermodynamics. The notion of entropy had been developing for several decades before its precise definition was set down in the 1850s, and it was not until the 1870s that Boltzmann provided a microscopic interpretation: entropy is related to the number of microscopic states of a system that are consistent with the observed macroscopic state. However (and this was one of the many objections to Boltzmann's ideas), there were problems with the units in his statistical definition of entropy. Only in the twentieth century, with the development of quantum mechanics and Shannon's work on information theory, was it possible to give a precise relation between the number of possible quantum states of a system and its entropy.
By that time, the units of entropy used in thermodynamics were firmly established. People doing practical calculations with fluids and other mechanical systems continue to use the original units of entropy. Other people, who are interested in information content or quantum statistical mechanics, often use a definition of entropy that makes $S$ dimensionless. (For example, if the probability of a system being in each possible state $i$ is $p_{i}$, the most common normalization of the entropy is $S=-\sum_{i}p_{i}\log_{2}p_{i}$.) It is possible to interconvert the definitions using constant scaling factors. For example, some definitions use the natural logarithm instead of the base-2 logarithm, which just changes the normalization of $S$ by an overall factor, since $\ln p_{i}=\log_{2}p_{i}/\log_{2}e$. To rescale $S$ to have its traditional units of energy per temperature, you just insert a factor of Boltzmann's constant $k_{B}=1.380649\times10^{-23}$ J$\cdot$K$^{-1}$ in front of the natural-logarithm form, giving the familiar $S=-k_{B}\sum_{i}p_{i}\ln p_{i}$.
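Here is a small sketch of these conversions (my own illustration; the three-state distribution is an arbitrary choice). The same distribution yields 1.5 bits, $1.5\ln 2\approx 1.04$ nats, or about $1.4\times10^{-23}$ J/K once multiplied by $k_B$; only the scale factor changes, not the underlying quantity.

```python
import numpy as np

# The same probability distribution, with its entropy expressed in three
# common normalizations; ln 2 and k_B are the only conversion factors.
k_B = 1.380649e-23                 # Boltzmann's constant, J/K

p = np.array([0.5, 0.25, 0.25])    # illustrative distribution over 3 states

S_bits = -np.sum(p * np.log2(p))   # S = -sum_i p_i log2 p_i  -> 1.5 bits
S_nats = -np.sum(p * np.log(p))    # natural-log normalization -> S_bits * ln 2
S_SI   = k_B * S_nats              # traditional thermodynamic units, J/K

print(S_bits, S_nats, S_SI)        # 1.5, ~1.04, ~1.4e-23 J/K
```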
It's a quirk of scientific history that energy is quantified with one set of units (joules, ergs, calories) and temperature with another (kelvins, degrees Rankine). In Thermodynamics and an Introduction to Thermostatistics, Callen notes that energy and temperature have the same dimensions: $\mathrm{mass}\cdot\left(\frac{\mathrm{length}}{\mathrm{time}}\right)^2$. It would be reasonable to take entropy as dimensionless and unitless, as you note, being a representation of certain information—namely, the number of microstates consistent with a given macrostate. In this framework, the conversion coefficient of Boltzmann's constant (which would otherwise give entropy its conventional units of $\mathrm{J}/\mathrm{K}$ in SI, for example) is not necessary, and temperature and energy share units.
However, perhaps it's just as well that temperature has different units than energy. For one, it lets us quote everyday temperatures as numbers within a few orders of magnitude of 1 (e.g., 300 K), which wouldn't typically be possible otherwise. In addition, look at all the pedagogical challenges associated with internal energy, thermal energy, heat, and work all having the same units (and thus being routinely conflated by new practitioners, especially thermal energy and heat). If temperature were measured in units of joules, for instance, we could expect still more confusion. But it would nevertheless be a valid (and arguably more logical) choice.
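To put an illustrative number on that (my own example, not from the text above): if temperature were quoted as an energy, room temperature would be

$$k_B\,T\approx(1.380649\times10^{-23}\ \mathrm{J/K})\times(300\ \mathrm{K})\approx 4.1\times10^{-21}\ \mathrm{J},$$

a far less convenient figure to state and remember than 300 K.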