
Consider how entropy is mathematically defined in classical thermodynamics:

$$\delta Q=TdS \;,$$

in which $\delta Q$ stands for an "element" of heat transferred from a classical source to a classical destination (both continuous substances), $T$ for the temperature of both the source and the destination (which implies a reversible form of heat transfer), and $dS$ for the increment of the conjugate variable to the temperature $T$. Note that the formula is only an implicit expression for entropy.

In contrast, the mathematical definition for entropy in statistical mechanics is an explicit one:

$$S=k_B \ln\Omega \;,$$

in which $S$ stands for the statistical entropy, $k_B$ for the Boltzmann constant, and $\Omega$ for the number of microstates (or microscopic configurations).
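For example, for $N$ independent two-state systems in which every microstate is equally accessible, $\Omega = 2^N$, so that $S = N k_B \ln 2$.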

Can these two expressions somehow be connected in a mathematical way? I don't think so (so classical entropy can't be derived from statistical entropy), because if that were the case, incompatible quantities would be compared with each other, which by definition cannot be done with such quantities (although it is said that $k_B$ forms the bridge between the two approaches to thermodynamics).

Or to put it differently: if there are two different mathematical definitions of entropy, doesn't that mean that there are two different kinds (by which I don't mean different interpretations) of entropy, even if they have the same unit? And can't the same be said about, for example, the different notions (mathematical definitions) of gravitational force in the Newtonian and Einsteinian pictures? Or, more generally, about the different notions of whatever quantity in whatever two different theoretical treatments of that quantity?

Edit: some additions were made in the (very clear) answer below. There it is said that the two approaches are indeed equivalent. But my point is whether they are truly equal. The statistical approach is said to be more fundamental, and from it you deduce the classical approach, which can only be used in connection with measurements. Or to put it differently: the classical approach is experimental, from which (by first making postulates) a theory can be deduced, while the statistical approach is theoretical and can be tested by experiments. So I don't think that they are one and the same thing, in concept. Of course, you can say both refer to the one and only true entropy, but I prefer to think that the two approaches refer, in a theory-dependent way, to two truly different kinds of entropy.

I find it rather confusing, nevertheless.

Does anyone have an opinion on this matter? I have the feeling, though, that this question belongs more and more in the philosophy-of-science department and less and less in this one...

Deschele Schilder

3 Answers


One important point is that the statistical entropy is defined as a function of the total energy of the system: $$ S(E) = k_B \ln \Omega(E)\;.$$ Now assume that your system, which starts with total energy $E$, is brought to energy $E' = E + \delta Q$ by heat exchange. The heat exchanged here is $\delta Q$, and for an infinitesimal change you have $$ S(E') - S(E) = \mathrm{d} S = \frac{\mathrm{d} S}{\mathrm{d} E}\,\delta Q\;.$$ The temperature is actually defined in statistical mechanics as $\frac{1}{T} = \frac{\mathrm{d} S}{\mathrm{d} E}$, and you retrieve your classical formula from statistical mechanics:

$$\mathrm{d} S = \frac{\delta Q}{T} \;.$$

Hence both formulas are indeed connected.
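
As a sanity check, here is a minimal numerical sketch assuming a toy system of $N$ independent two-level spins with excitation energy $\varepsilon$ (the values of $N$, $\varepsilon$, and $n$ are arbitrary choices): it computes $S(E) = k_B \ln \Omega(E)$, extracts $T$ from $1/T = \mathrm{d}S/\mathrm{d}E$ by a finite difference, and checks that adding heat $\delta Q = \varepsilon$ changes the entropy by $\delta Q / T$.

```python
from math import lgamma

k_B = 1.380649e-23   # Boltzmann constant, J/K
eps = 1.0e-21        # energy quantum of one excited spin (illustrative)
N = 10**6            # number of two-level spins (illustrative)

def ln_omega(n):
    """ln of the number of microstates with n of N spins excited:
    ln C(N, n), computed via log-gamma to avoid overflow."""
    return lgamma(N + 1) - lgamma(n + 1) - lgamma(N - n + 1)

def S(n):
    return k_B * ln_omega(n)         # S = k_B ln Omega

n = 200_000                          # current macrostate: E = n * eps

# Statistical definition of temperature: 1/T = dS/dE,
# approximated here by a central finite difference in n.
inv_T = (S(n + 1) - S(n - 1)) / (2 * eps)
T = 1.0 / inv_T

# Add a small amount of heat: delta_Q = eps (one more spin excited).
delta_Q = eps
dS_direct = S(n + 1) - S(n)          # entropy change from the statistical formula
dS_clausius = delta_Q / T            # classical prediction dS = dQ/T

print(f"T = {T:.4g} K")
print(f"dS (statistical) = {dS_direct:.6e} J/K")
print(f"dS (dQ/T)        = {dS_clausius:.6e} J/K")
```

The two printed entropy changes agree to high relative accuracy, as the derivation above predicts.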

A point about the difference between thermodynamics and statistical mechanics.

Thermodynamics is about what can be said about the system from the outside: the postulates of thermodynamics assume the existence of certain functions (internal energy, entropy, ...) and state that these functions are enough to describe the exchanges of the system with the exterior. But thermodynamics never provides a way to compute these functions and the associated quantities (such as the heat capacity).

Statistical mechanics, however, is concerned with the computation of such quantities from first principles (you start from the Hamiltonian of the system).

So there is no a priori incompatibility between the definitions of entropy in thermodynamics and statistical mechanics, since thermodynamics never explains how to compute the entropy without measuring things. (If you measure the heat capacity you should be able to retrieve the entropy, but you will have to measure something.)
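
For instance, from a measured heat capacity $C(T)$ the entropy follows by integration,

$$ S(T) = S(T_0) + \int_{T_0}^{T} \frac{C(T')}{T'}\,\mathrm{d}T' \;,$$

but the heat capacity itself has to come from experiment.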

Hadrien

This is how I consider the entropy definitions to be connected:

Classical thermodynamics: Entropy is a measure of the amount of energy which is unavailable to do work.

Statistical mechanics (Boltzmann entropy): Entropy is a measure of the amount of information which is unavailable about the many-particle system (i.e. entropy is a measure of potential information, and the Boltzmann entropy equals the Shannon entropy when the microstates are equiprobable).
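
Indeed, starting from the Gibbs/Shannon form $S = -k_B \sum_i p_i \ln p_i$ and taking all $\Omega$ microstates to be equiprobable, $p_i = 1/\Omega$, gives back the Boltzmann formula:

$$ S = -k_B \sum_{i=1}^{\Omega} \frac{1}{\Omega} \ln\frac{1}{\Omega} = k_B \ln \Omega \;.$$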

So - if this is the same entropy - a measure of unavailable energy or information

then energy must be proportional to information, right?

Sure it is: Landauer's principle, the mathematical connection.
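
As a rough illustration of the scale involved, here is a minimal sketch evaluating Landauer's bound $E \ge k_B T \ln 2$ per erased bit (the temperature is an arbitrary, illustrative choice):

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)
T = 300.0           # temperature in kelvin (illustrative room-temperature value)

# Landauer's principle: erasing one bit of information dissipates
# at least k_B * T * ln(2) of energy as heat into the environment.
E_min = k_B * T * log(2)
print(f"Minimum erasure cost per bit at {T:.0f} K: {E_min:.3e} J")  # ~2.87e-21 J
```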

Mr Anderson

As I understand it, the original question is equivalent to asking whether there is a mathematical proof showing that both definitions give the same value in a general and rigorous way. I suspect the answer is no, but at the same time I do not know whether the question makes sense, given the fundamental concepts at stake.

It is the case that in these situations we can only check that they are not contradictory or inconsistent.

facenian