The Boltzmann entropy is defined as the logarithm of the phase space volume $\Omega(E)$. Is there a reference, a book or paper, which shows where this definition comes from and how the entropy is equal to the phase space volume?
2 Answers
The phase space volume itself is clearly not equal to its own logarithm, $\Omega\neq\ln\Omega$. The entropy is the logarithm, and the logarithm is relevant here because it is additive, $$\ln(\Omega_1 \Omega_2) = \ln \Omega_1 +\ln \Omega_2,$$ for an argument that is the product of two factors, and the volume of the phase space of two independent subsystems is indeed a product $\Omega_1\Omega_2$ (think about areas of rectangles or other Cartesian products of sets).
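As a quick worked illustration (toy numbers, not from the original answer): if subsystem 1 has access to $\Omega_1 = 4$ phase space cells and subsystem 2 to $\Omega_2 = 8$ cells, the combined system has access to $\Omega_1\Omega_2 = 32$ cells, and the logarithms simply add, $$\ln 32 = \ln(4\cdot 8) = \ln 4 + \ln 8.$$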
A presentation for "true beginners" or lay audiences could have omitted the logarithm because the author may have considered logarithms too indigestible for his or her audience.
In the thermodynamic limit (large number of particles, $N\to\infty$), the logarithm of the volume may also be approximated by the logarithm of the surface area of the region in phase space, and in many other ways.
In quantum mechanics, the phase space is effectively divided into "elementary cells" whose volume is $(2\pi\hbar)^N$, where $N$ is the number of degrees of freedom. The quantum entropy is the logarithm of the number of these phase space cells. This is equivalent to choosing the right "unit" of the volume and to eliminating the ambiguous additive shift in the definition of the entropy.
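To see the cell counting in a simple case (a one-dimensional harmonic oscillator, used here only as an illustration): the region $H = p^2/2m + m\omega^2 x^2/2 \le E$ is an ellipse of area $2\pi E/\omega$, so the number of elementary cells is $$\Omega = \frac{2\pi E/\omega}{2\pi\hbar} = \frac{E}{\hbar\omega},$$ which matches the number of quantum levels $E_n = \hbar\omega(n+\tfrac12)$ below the energy $E$, up to a term of order one.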
There exist generalizations of the notion of "entropy" for which we don't expect additivity ($S_{AB}=S_A+S_B$) but a different relationship. Then the logarithm rule has to be revised, too.
Any book on statistical physics (and/or modern enough book on thermodynamics) discusses these issues.
I am not very sure about this, but here is an attempt.
Well, there is a connection you can try to make with the Shannon entropy. By the a priori equal-probability principle for microstates, the probability of each microstate is
$$ p = \frac{1}{\Omega} $$
In information theory, given a set of events $\{X_1,\dots,X_n\}$ with probabilities $\{P_1,\dots,P_n\}$, the Shannon information of the $i^{\text{th}}$ event is defined by $$ I_i = -\log_2 P_i $$
From the definition you can see that the lower the probability of an event, the greater its information content. For example, an event with probability $1/2$ carries one bit of information, while an event with probability $1/1024$ carries ten bits. This is the motivation for the definition.
Now the Shannon entropy is defined as the average information over the given set of events.
For a probability distribution, the average of a quantity is defined by $$ \langle Q \rangle = \sum\limits_{i=1}^n P_i Q_i $$
So, the average information (or entropy) is given by
$$ S_{\text{Shannon}} = \langle I \rangle = -\sum\limits_{i=1}^n P_i \log_2 P_i $$
As an exercise you can also verify that this average information is maximised when $$ P_i = \frac{1}{n} \qquad \forall \, i = 1,\dots,n, $$ which is the a priori principle stating that the entropy is maximised. (Begin by setting $\delta S = 0$ subject to the constraint $\sum_i P_i = 1$.)
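A small numerical check of this exercise (a sketch in Python; the function name and the choice of $n$ are just for illustration):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits: S = -sum_i p_i * log2(p_i); zero-probability terms contribute 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

n = 4
uniform = np.full(n, 1.0 / n)       # the equal a priori probability distribution
print(shannon_entropy(uniform))     # log2(4) = 2 bits, the maximum for n = 4

# Any other normalized distribution on n outcomes has smaller (or equal) entropy:
rng = np.random.default_rng(0)
for _ in range(5):
    p = rng.random(n)
    p /= p.sum()                    # normalize to a probability distribution
    assert shannon_entropy(p) <= shannon_entropy(uniform) + 1e-12
    print(shannon_entropy(p))
```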
The base-2 logarithm is a convenient choice in information theory.
However, we can translate this into statistical mechanics by using the natural logarithm and the first equation, where $\Omega$ is the total number of microstates (the phase space volume divided by $h$, the unit volume element of phase space).
$$ S_{\text{Boltzmann}} \propto \ln \Omega \implies S = k_B \ln\Omega $$
where $k_B$ is the proportionality constant that fixes the units; switching from the base-2 to the natural logarithm only changes this constant by a factor of $\ln 2$.
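Explicitly, the change of logarithm base only rescales the constant in front: $$\log_2\Omega = \frac{\ln\Omega}{\ln 2}, \qquad S = k_B\ln\Omega = (k_B\ln 2)\,\log_2\Omega .$$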
[EDIT 1]
The detailed explanation is quite involved, but to give you a first idea: a microstate is a particular set of values $(p,q)$ of momentum and position in phase space. One set $(p,q)$ describes one physical state of the system. Now, since the phase space is continuous, we cannot count every single point (there are infinitely many of them). From quantum mechanics, we have
$$ \Delta x\,\Delta p \gtrsim \hbar $$ From this we deduce that the smallest area element in phase space (i.e. in the $(x,p)$ plane) is of order $h$ (Planck's constant). Having discretised the space this way, we can count states by counting the number of smallest boxes (i.e. cells of area $h$) within the region allowed by the energy condition.
In proper mathematical language (for a 2D phase space):
$$ \Omega = \int_{H\le E} \frac{dp\,dq}{h}, $$ where $E$ is the energy of the system and $H$ the Hamiltonian; the integration runs over the region of phase space with $H \le E$.
In much simpler language, this is just the familiar information-theoretic picture with $n \rightarrow \Omega$, which also gives you the a priori principle of maximum entropy.
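As a rough numerical sketch of this counting (a one-dimensional harmonic oscillator is chosen purely as an example system, in units where $\hbar = 1$; all names and numbers are illustrative):

```python
import numpy as np

# Illustrative system: 1D harmonic oscillator, H(p, q) = p^2/(2m) + m*w^2*q^2/2
m, w, E = 1.0, 1.0, 100.0
hbar = 1.0                      # units with hbar = 1, so h = 2*pi
h = 2 * np.pi * hbar

# Monte Carlo estimate of the phase space area with H <= E
p_max = np.sqrt(2 * m * E)                  # the region fits inside |p| <= p_max, |q| <= q_max
q_max = np.sqrt(2 * E / (m * w**2))
rng = np.random.default_rng(1)
N = 1_000_000
p = rng.uniform(-p_max, p_max, N)
q = rng.uniform(-q_max, q_max, N)
inside = p**2 / (2 * m) + 0.5 * m * w**2 * q**2 <= E
area = inside.mean() * (2 * p_max) * (2 * q_max)

omega_numeric = area / h            # number of cells of size h inside H <= E
omega_exact = E / (hbar * w)        # exact ellipse area 2*pi*E/w divided by h
print(omega_numeric, omega_exact)   # both close to 100
```

Both numbers agree, which is just the statement that $\Omega$ counts how many cells of size $h$ fit into the region $H\le E$.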