
I read on Wikipedia how Clausius came to define entropy after studying the Carnot cycle (he found a relation between heat transfer and temperature that was a state function, and named it entropy), but how can this be related to the statistical form of entropy?

Qmechanic

3 Answers


Let's begin by looking at the statistical definition of entropy, and how it comes about.

Consider two systems, A and A', that are brought into thermal contact and isolated from the rest of the surroundings. If the energies of the two systems are denoted E and E' respectively, and the total energy of the pair is $E^{(0)}$, then the probability that, after equilibrium is reached, system A has energy E is given by $$P(E)=C\,\Omega(E)\,\Omega'(E^{(0)}-E)$$ where C is a constant of proportionality. In simple words, the probability that system A has energy E once equilibrium is achieved is proportional to the product of the number of states accessible to system A when it has energy E and the number of states accessible to system A' when it has energy $E'=E^{(0)}-E$.
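To make this concrete (a numerical sketch I am adding, not part of the original argument), one can model A and A' as two small Einstein solids, whose multiplicities have the closed form $\Omega(N,q)=\binom{q+N-1}{q}$, and tabulate $P(E)$ directly; the system sizes below are arbitrary illustrative choices.

```python
from math import comb

# Two Einstein solids: A with N_A oscillators, A' with N_B oscillators.
# Energy is measured in quanta; the total q_total is fixed (isolated pair).
N_A, N_B = 300, 200      # illustrative sizes
q_total = 100            # total energy quanta shared between A and A'

def omega(N, q):
    """Multiplicity of an Einstein solid: ways to place q quanta in N oscillators."""
    return comb(q + N - 1, q)

# P(E) is proportional to Omega(E) * Omega'(E_total - E)
weights = [omega(N_A, q) * omega(N_B, q_total - q) for q in range(q_total + 1)]
Z = sum(weights)
P = [w / Z for w in weights]

q_star = max(range(q_total + 1), key=lambda q: P[q])
print(f"most probable q for A: {q_star}  (expected near q_total*N_A/(N_A+N_B) = "
      f"{q_total * N_A / (N_A + N_B):.1f})")
print(f"P at the peak: {P[q_star]:.4f}")
```

Even for these modest sizes the distribution is strongly peaked around one value of E, which is the sharpness the argument below relies on.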

It is well known that $\Omega$ and $\Omega'$ are rapidly increasing functions of their respective variables E and E'. Therefore the probability $P(E)$ that system A has energy E after equilibrium is achieved has a very sharp peak at a particular value of E, determined by the condition $$\frac{1}{P}\frac{\partial P}{\partial E}=\frac{\partial \ln P}{\partial E}=0$$ Plugging the expression for P into this equation (and using $\partial E'/\partial E=-1$), we get $$\frac{\partial \ln\Omega(E)}{\partial E}-\frac{\partial \ln\Omega'(E')}{\partial E'}=0$$ $$\beta(E)=\beta'(E')$$ where $$\beta\equiv\frac{\partial \ln\Omega}{\partial E}$$ Here we add two more definitions: $$kT\equiv \frac{1}{\beta}$$ and $$S\equiv k\ln\Omega$$ $$\therefore \frac{1}{T}=\frac{\partial S}{\partial E}$$ This new quantity S is the statistical definition of entropy, and the peak condition says precisely that S+S' is maximized at equilibrium.
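Continuing the same illustrative Einstein-solid model (my own sketch, with arbitrary sizes), one can check numerically that the energy which maximizes $S+S'$ is exactly the one at which $\beta=\beta'$:

```python
from math import comb, log

N_A, N_B = 300, 200
q_total = 100

def omega(N, q):
    return comb(q + N - 1, q)

def ln_omega(N, q):
    return log(omega(N, q))

def beta(N, q):
    # central finite-difference estimate of d(ln Omega)/dE, E in units of one quantum
    return (ln_omega(N, q + 1) - ln_omega(N, q - 1)) / 2

# total entropy S + S' in units of k, as a function of A's energy
S_tot = [ln_omega(N_A, q) + ln_omega(N_B, q_total - q) for q in range(q_total + 1)]
q_star = max(range(q_total + 1), key=lambda q: S_tot[q])

print(f"q that maximizes S + S': {q_star}")
print(f"beta of A  at the peak: {beta(N_A, q_star):.4f}")
print(f"beta of A' at the peak: {beta(N_B, q_total - q_star):.4f}")
```

The two finite-difference betas agree at the maximum, which is the statistical-mechanics statement that the systems have reached a common temperature.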

The classical definition of entropy, or thermodynamic entropy as you termed it, is defined along similar lines: $$dS=\frac{\delta Q_{rev}}{T}$$ Notice the similarities between the two definitions. Both describe the same quantity; we have just arrived at it from two different perspectives.
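As a concrete check of this agreement (an example I am adding, under the standard assumption that $\Omega \propto V^N$ for an ideal gas at fixed energy), consider a reversible isothermal expansion: Clausius's definition integrates to $nR\ln(V_2/V_1)$, while counting states gives $Nk\ln(V_2/V_1)$, and the two are numerically identical.

```python
from math import log

k_B = 1.380649e-23    # Boltzmann constant, J/K
N_Av = 6.02214076e23  # Avogadro's number, 1/mol
R = k_B * N_Av        # gas constant, J/(mol K)

n = 1.0               # moles of ideal gas (illustrative)
T = 300.0             # K; isothermal, so dU = 0 and Q_rev = W = nRT ln(V2/V1)
V1, V2 = 1.0, 2.0     # liters; only the ratio matters

# Clausius: dS = dQ_rev / T  =>  Delta S = Q_rev / T = nR ln(V2/V1)
Q_rev = n * R * T * log(V2 / V1)
dS_clausius = Q_rev / T

# Statistical: Omega scales as V^N, so Delta S = k ln(Omega2/Omega1) = N k ln(V2/V1)
N = n * N_Av
dS_statistical = N * k_B * log(V2 / V1)

print(f"Clausius    dS = {dS_clausius:.4f} J/K")
print(f"Statistical dS = {dS_statistical:.4f} J/K")
```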

Both quantities are derived on the basis of equilibrium conditions, and at equilibrium they are equal. Both are non-decreasing for isolated systems, which is what expresses irreversibility, so by construction they are essentially the same quantity.

SK Dash

These two formulations of thermodynamics are developed from completely different initial postulates and reasoning paths. The best exposition I have read of them, and of how they ultimately lead to the same conclusions, is in a book by Giedt called Thermophysics.

niels nielsen

I believe statistical entropy is more related to the mixing of gases, which also implies a volume change by default. Once gases mix, energy is lost to entropy; for example, if two 1-liter vials of pure gas are connected, the gases mix and each now occupies 2 L.
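For what it's worth, the standard worked version of this example (my numbers assume two distinct ideal gases at room temperature and 1 atm, conditions not stated in the answer) is the entropy of mixing, where each gas expanding from 1 L to 2 L contributes $nR\ln 2$:

```python
from math import log

R = 8.314462618   # gas constant, J/(mol K)
T = 298.15        # K (assumed room temperature)
P = 101325.0      # Pa (assumed 1 atm)
V = 1.0e-3        # m^3, one liter per vial

# moles of each pure gas at the assumed T and P (ideal gas law)
n = P * V / (R * T)

# Each gas expands from 1 L to 2 L, so each contributes n R ln 2;
# for two distinct ideal gases the total entropy of mixing is 2 n R ln 2.
dS_mix = 2 * n * R * log(2)
print(f"n per vial = {n:.4f} mol,  Delta S_mix = {dS_mix:.4f} J/K")
```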

PhysicsDave