
I am concerned with the correct derivation of conclusions about a two-level system when the probabilities $p$ and $q$ are unequal.

Suppose there are two levels with energies $0$ and $\epsilon$ (this is the energy of a given level, not the total energy of the system), but the probability for a given particle out of $N$ favours the state with zero energy, i.e. $p$ is greater than $q$.

In such cases one can derive results about the system by finding the entropy from the number of accessible microstates.

Since this system follows a binomial distribution, the number of microstates with $n$ particles in the upper level is $N!/[n!(N-n)!]$.

If we find the entropy $S$ using Boltzmann's formula $S = k\ln(\text{number of microstates})$, then in this case the entropy is maximal at $n = N/2$, and similarly for the energy.

But notice that this does not include the different probabilities, i.e. the different weights $p$ and $q$ of the two states, which essentially tell us which macrostate is favourable, namely the one with zero energy.

But if we instead take the probability approach, using the distribution $$P(n) = {N\choose n} p^n q^{N-n},$$ this does include $p$ and $q$, and the maximum of this function occurs at $n = Np$ (a standard property of the binomial distribution), not at $n = N/2$ as in the microstate-counting approach above.
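As a minimal numerical illustration of this discrepancy (the values $N = 50$, $p = 0.7$ are arbitrary choices):

```python
from math import comb

# Where does the bare microstate count C(N, n) peak,
# versus the binomial probability P(n) = C(N, n) p^n q^(N - n)?
# N = 50 and p = 0.7 are arbitrary illustrative values.
N = 50
p, q = 0.7, 0.3

omega = [comb(N, n) for n in range(N + 1)]                     # microstate counts
prob = [comb(N, n) * p**n * q**(N - n) for n in range(N + 1)]  # binomial P(n)

print(max(range(N + 1), key=lambda n: omega[n]))  # 25 = N/2
print(max(range(N + 1), key=lambda n: prob[n]))   # 35 = N*p
```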

I am confused because many texts directly use the first approach, while some use the probability approach but assume $p = q$, which cannot always be true for real systems.

What is the best way to introduce this topic so that one never draws a wrong conclusion?

And is there any resolution for the first approach: how can one count the microstates so that they automatically take care of $p$ and $q$?

1 Answer


Let us consider an ensemble of two-level systems (the title of this and some preceding questions about a two-level system is misleading). Let $N$ be the total number of TLSs/atoms, each of which can be either in the ground state, with energy $E_g=0$, or in the excited state, with energy $E_e=\epsilon$.

Microcanonical ensemble
Working in the microcanonical ensemble, we fix the system energy to be $U=n\epsilon$, which corresponds to $n$ atoms being in their excited states.

Probabilities
Then the probabilities for an atom to be in its excited or ground state are: $$ p=\frac{n}{N}, \qquad q=\frac{N-n}{N}.$$
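For instance, with $N=100$ atoms and total energy $U=25\epsilon$, we have $n=25$ excited atoms, so $p=0.25$ and $q=0.75$.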

Entropy
There are many different ways to choose $n$ atoms among $N$ to be in the excited state. Each particular choice is a microstate, and there are in total $$ \Omega = {N\choose n}=\frac{N!}{n!(N-n)!} $$ different choices. The Boltzmann entropy (not to be confused with other kinds of entropy) is then $$ S=k_B\log\Omega. $$ This can be expressed as a Shannon/information entropy: since we assume that all the microstates are equally probable, the probability of each microstate is $p_i=1/\Omega$, so the information entropy is $$ H=-\sum_ip_i\log p_i =-\sum_i\Omega^{-1}\log\Omega^{-1}=\log\Omega, $$ which differs from the traditional definition of the Boltzmann entropy only by the factor of the Boltzmann constant $k_B$.
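As a consistency check (a standard step, using Stirling's approximation $\log n!\approx n\log n - n$ for large $n$), this counting entropy can be written directly in terms of the probabilities defined above: $$ \frac{S}{k_B}=\log{N\choose n}\approx N\log N-n\log n-(N-n)\log(N-n) =-N\left(p\log p+q\log q\right), $$ with $p=n/N$ and $q=(N-n)/N$. To leading order in $N$, the Boltzmann entropy per atom is the Shannon entropy of the occupation probabilities $(p,q)$, so the microstate count does encode $p$ and $q$ once $n/N$ is identified with the occupation probability.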

Canonical ensemble
Temperature
Let us now assume that we know the probabilities $p$, $q$ ad hoc (note that here $p$ denotes the probability of the excited state). They can be parametrized in terms of temperature as $$ p=Z^{-1}e^{-\beta \epsilon}=\frac{e^{-\beta \epsilon}}{1+e^{-\beta \epsilon}}=\frac{1}{e^{\beta \epsilon}+1},\\ q=Z^{-1}e^{-\beta \cdot 0}=Z^{-1}=\frac{1}{1+e^{-\beta \epsilon}},\\ Z =1+e^{-\beta \epsilon},\qquad\beta=\frac{1}{k_BT}. $$
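A quick numerical sketch of this parametrization (the function name and the values $\epsilon=1$, $k_BT=0.5$ are illustrative choices, not part of the answer):

```python
import math

# Occupation probabilities of a two-level system at temperature T.
def occupations(epsilon: float, kT: float):
    beta = 1.0 / kT
    Z = 1.0 + math.exp(-beta * epsilon)        # partition function Z = 1 + e^(-beta*eps)
    p_excited = math.exp(-beta * epsilon) / Z  # excited-state probability
    q_ground = 1.0 / Z                         # ground-state probability
    return p_excited, q_ground

p, q = occupations(epsilon=1.0, kT=0.5)
print(p, q, p + q)  # ~0.12, ~0.88, 1.0
```

For any $\epsilon>0$ and finite temperature this gives $q>p$, i.e. the ground state is favoured, which is exactly the asymmetry raised in the question.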

Number of excited atoms
Note that now the number of atoms in the excited state is not fixed, i.e., the microstates differ not only by the particular choice of atoms to be excited, but also by the number $n$ of excited atoms. The probability of having a specific number $n$ of excited atoms is given by the binomial distribution $$ P_n={N\choose n}p^nq^{N-n}. $$ For a given $n$ we have ${N\choose n}$ possible configurations of atoms, so each configuration has conditional (on $n$) probability $p_i(n)=1/{N\choose n}$.

Entropy
We can now calculate the Shannon entropy as $$ H=-\sum_{n=0}^N\sum_iP_np_i(n)\log\left[P_np_i(n)\right]= -\sum_{n=0}^N\sum_iP_np_i(n)\left[\log P_n+ \log p_i(n)\right]=\\ -\sum_{n=0}^NP_n\log P_n \sum_ip_i(n) - \sum_{n=0}^NP_n\sum_ip_i(n)\log p_i(n)=\\ -\sum_{n=0}^NP_n\log P_n + \sum_{n=0}^NP_n\log{N\choose n}. $$ This can be simplified somewhat further (using the Stirling formula), but it is in general different from the microcanonical result.

We get back to the microcanonical case if we note that the binomial distribution is strongly peaked for large $N$. Indeed, it has mean $\langle n\rangle=pN$ and standard deviation $\sigma_n=\sqrt{Np(1-p)}$, so that $\sigma_n/\langle n\rangle\propto 1/\sqrt{N}$. The limit $N\rightarrow\infty$ is called the thermodynamic limit, and in this limit only one value of $n$ is relevant, so only one term in the sum above needs to be kept. The resulting entropy is then $\propto \log{N\choose \langle n\rangle}$ (up to additive and multiplicative constants).
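Here is a minimal numerical check of this convergence (the value $p=0.3$ and the range of $N$ are arbitrary illustrative choices): the per-atom entropy of the full sum approaches the single-peak value as $N$ grows.

```python
from math import comb, log

# Full Shannon entropy of the canonical two-level ensemble,
#   H = -sum_n P_n log P_n + sum_n P_n log C(N, n),
# versus the single-peak approximation log C(N, <n>).
p = 0.3  # arbitrary illustrative value
q = 1.0 - p

for N in (10, 100, 1000):
    P = [comb(N, n) * p**n * q**(N - n) for n in range(N + 1)]
    H_full = sum(-Pn * log(Pn) + Pn * log(comb(N, n))
                 for n, Pn in enumerate(P) if Pn > 0.0)
    H_peak = log(comb(N, round(p * N)))  # keep only the dominant term n = <n>
    # Per-atom entropies: the two columns approach each other as N grows.
    print(N, H_full / N, H_peak / N)
```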


Roger V.