
My question stems from confusion about the definition of entropy used in the Fluctuation Theorem (see the comments too).

Normally, entropy is defined as an average over an ensemble. In the Canonical Ensemble,

\begin{align} e^{-\frac{F}{kT}} &= \int \ldots \int \frac{1}{h^{n}C}\,e^{-\frac{E}{kT}}\,dp_{1}\ldots dq_{n} \\ S &= -\frac{\partial F}{\partial T} \end{align}

This is for an $n$-particle system, so the integral runs over all possible microstates of the system.
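For concreteness, a standard textbook case (assuming a classical monatomic ideal gas of $n$ identical particles in three dimensions, so $C = n!$ and $E = \sum_i p_i^2 / 2m$) gives

\begin{align} F &= -kT \ln\left[ \frac{V^{n}}{n!\,h^{3n}} \left(2\pi m kT\right)^{3n/2} \right] \approx -nkT\left[ \ln\!\left( \frac{V}{n} \left( \frac{2\pi m kT}{h^{2}} \right)^{3/2} \right) + 1 \right] \\ S &= -\frac{\partial F}{\partial T} = nk\left[ \ln\!\left( \frac{V}{n} \left( \frac{2\pi m kT}{h^{2}} \right)^{3/2} \right) + \frac{5}{2} \right] \end{align}

i.e. the Sackur-Tetrode entropy. The point is that this $S$ is a functional of the whole ensemble, not of any single micro-state.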

Now, assuming identical particles, if I prepare a single system in a micro-state $ \Gamma_0 = (\vec{q}, \vec{p})_{t=0} $ that need not be at equilibrium, then I could conceivably compute a different kind of entropy from the particles of that single system.

For example, imagine I chose a microstate whose particles are distributed according to a single-particle phase-space density $ \rho(q, p) $, meaning each particle's position and momentum is drawn from $ \rho $.

I could then use this $ \rho $ (or an approximation of it) to compute an entropy with the Boltzmann/Shannon formula $ S = - \int dq~dp~\rho \log \rho $. This seems like it could be inherently different from the standard ensemble entropy, which is computed from the p.d.f. $ \rho(\Gamma) $ over full system microstates.
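To make this second notion concrete, here is a minimal numerical sketch of what I have in mind (the sampling distribution, box size, and bin counts are just illustrative assumptions): take one micro-state of $N$ particles, coarse-grain it into phase-space cells to estimate $\rho(q, p)$, and evaluate the Shannon sum over the cells.

```python
import numpy as np

# Minimal sketch: estimate S = -sum rho log(rho) dq dp from a single
# sampled micro-state by coarse-graining phase space into cells.
rng = np.random.default_rng(0)
N = 100_000                          # number of identical particles (1D for simplicity)
q = rng.uniform(0.0, 1.0, N)         # positions in a box of length L = 1 (assumed)
p = rng.normal(0.0, 1.0, N)          # momenta with unit "thermal" width (assumed)

# Histogram the micro-state to get a normalized estimate of rho(q, p).
H, q_edges, p_edges = np.histogram2d(
    q, p, bins=[50, 50], range=[[0.0, 1.0], [-5.0, 5.0]], density=True
)
dq = q_edges[1] - q_edges[0]
dp = p_edges[1] - p_edges[0]

# Shannon/Gibbs-style entropy of the estimated single-particle density,
# in units of k_B; empty cells contribute nothing (0 log 0 -> 0).
nonzero = H > 0
S = -np.sum(H[nonzero] * np.log(H[nonzero])) * dq * dp
print(f"estimated per-particle entropy: {S:.3f} k_B")
```

The bin widths here are an arbitrary choice, and with too few particles per cell the estimate is biased, so this is only meant to illustrate the kind of "micro-state" entropy I am asking about.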

Is there such a notion of micro-state entropy? In what cases is it equivalent to the ensemble (macro-state) entropy? How should it properly be computed?
