I recently learned about the entropy of entanglement and am now asking myself: what is the difference between classical entropy and the entropy of entanglement? Is there any difference at all, or is entropy really just a measure of how entangled a system is? I thought I understood what entanglement and entropy are, and now I'm confused as hell.
2 Answers
If you are a believer of the "church of the larger Hilbert space", whose central belief is that any mixed quantum state can always be seen as part of a larger pure state (whether you believe in this is, to quite some extent, a matter of interpretation rather than hard facts), then: Yes, every mixed state is part of a larger entangled pure state, and then, its entropy is a measure of its entanglement with the other part of that pure state.
Some more details can also be found in this older answer of mine (thanks to @Rococo for digging this out!).
TL; DR: the basic formula is the same, but the content and the domain of application are totally different. Entropy in statistical physics is not a measure of entanglement.
Let us consider what the linked Wikipedia article proposes as the von Neumann entanglement entropy:
- we start with a two-component system, described by the density matrix $$\rho_{AB}=|\Psi_{AB}\rangle\langle\Psi_{AB}|$$
- we trace this density matrix over one of the components (say, $B$): $$\rho_A=\operatorname{Tr}_B[\rho_{AB}]$$
- we calculate the entropy using the expression for Shannon's entropy ($S=-\sum_ip_i\log p_i$), generalized for use with the density matrix: $$S(\rho_A)=-\operatorname{Tr}[\rho_A\log \rho_A]$$
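The three steps above can be sketched numerically. This is a minimal illustration (not from the original answer), using a two-qubit Bell state as the pure state $|\Psi_{AB}\rangle$:

```python
import numpy as np

# Pure two-qubit state |Psi_AB> = (|00> + |11>)/sqrt(2), a Bell state.
psi = np.zeros(4)
psi[0] = psi[3] = 1 / np.sqrt(2)

# Step 1: density matrix rho_AB = |Psi><Psi|.
rho_ab = np.outer(psi, psi.conj())

# Step 2: partial trace over B. Reshape to (dim_A, dim_B, dim_A, dim_B)
# and sum over the two B indices.
rho_a = np.einsum('ijkj->ik', rho_ab.reshape(2, 2, 2, 2))

# Step 3: von Neumann entropy S = -Tr[rho_A log rho_A], computed from
# the eigenvalues of rho_A (0 log 0 is taken as 0).
evals = np.linalg.eigvalsh(rho_a)
evals = evals[evals > 1e-12]
S = -np.sum(evals * np.log2(evals))   # log base 2: entropy in bits

print(S)  # 1.0 — one full bit of entanglement for a Bell pair
```

For a maximally entangled pair of qubits the reduced state is $\rho_A=\tfrac{1}{2}\mathbb{1}$, so the entropy takes its maximal value of one bit.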
This is a rather special situation compared to (equilibrium) statistical mechanics, where the density matrix has a well-defined form (e.g., in the canonical ensemble): $$ \rho=\frac{e^{-\beta H}}{Z},\qquad Z=\operatorname{Tr}\,e^{-\beta H}. $$ Moreover, in most cases we will not even use this expression, but rather define the entropy as the logarithm of the phase volume occupied by the system (assuming that the system has equal probability of being in each state): $$S=k_B\log\Omega.$$ Finally, in thermodynamics the entropy is introduced via $$ dS = \frac{\delta Q}{T}. $$
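For contrast with the entanglement case, here is a minimal sketch of the canonical-ensemble entropy for a two-level system (the energies and temperature are arbitrary illustrative values, not from the original answer):

```python
import numpy as np

# Canonical ensemble rho = exp(-beta H)/Z for a two-level system
# with energies 0 and eps; units chosen so that k_B = 1.
beta, eps = 1.0, 1.0
energies = np.array([0.0, eps])

weights = np.exp(-beta * energies)
Z = weights.sum()          # partition function
p = weights / Z            # Gibbs probabilities (rho is diagonal here)

# Gibbs entropy S = -sum_i p_i log p_i.
S = -np.sum(p * np.log(p))

# Thermodynamic consistency check: S = beta*<E> + log Z.
E_mean = np.sum(p * energies)
print(S, beta * E_mean + np.log(Z))
```

Note that this entropy is strictly positive at any finite temperature: the thermal state is mixed, never pure.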
To recapitulate: although the basic formula for calculating the entropy is the same (except in the thermodynamic approach), the content is different:
- in statistical physics we take the statistical matrix of a system with many degrees of freedom, which has a well-known form (at least in equilibrium). In particular, this entropy is never zero, because an equilibrium statistical ensemble is not in a pure state.
- the entanglement entropy can be calculated for any bipartite system, even a single particle with two quantum numbers. No assumptions about the size of the system or the particular form of the density matrix are made. This entropy vanishes when the two parts are not entangled (the reduced state is then pure), which is why it can be used as a measure of entanglement.
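The last point can be checked directly: a product state gives exactly zero entanglement entropy, while an entangled state gives a positive value. A minimal sketch (the helper function name is my own, not from the original answer):

```python
import numpy as np

def entanglement_entropy(psi, dim_a, dim_b):
    """Von Neumann entropy of subsystem A for a bipartite pure state psi."""
    rho_ab = np.outer(psi, psi.conj())
    # Partial trace over B, then entropy from the eigenvalues of rho_A.
    rho_a = np.einsum('ijkj->ik', rho_ab.reshape(dim_a, dim_b, dim_a, dim_b))
    evals = np.linalg.eigvalsh(rho_a)
    evals = evals[evals > 1e-12]
    return -np.sum(evals * np.log2(evals))

# Product state |0>|0>: unentangled, so the entropy is exactly zero.
product = np.array([1.0, 0.0, 0.0, 0.0])

# Bell state (|00> + |11>)/sqrt(2): maximally entangled two qubits.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

print(entanglement_entropy(product, 2, 2))  # 0.0
print(entanglement_entropy(bell, 2, 2))     # 1.0
```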
Update:
@Rococo brought up this article in the comments, where both concepts are used: a complex entangled system is thermalized, and the entanglement entropy becomes thermodynamic entropy (as the entanglement vanishes).
Update 2:
Jaynes, in his article The minimum entropy production principle, points out the confusion that results from using the word "entropy" for quantities that actually differ mathematically or have different physical content:
By far the most abused word in science is "entropy". Confusion over the different meanings of this word, already serious 35 years ago, reached disaster proportions with the 1948 advent of Shannon's information theory, which not only appropriated the same word for a new set of meanings; but even worse, proved to be highly relevant to statistical mechanics.
Jaynes himself distinguishes at least 6 different types of entropy.