Questions tagged [entropy]
For questions about the various kinds of entropies, as defined in the context of quantum information theory and quantum statistical mechanics.
160 questions
13
votes
0 answers
What is the Generalized Quantum Stein's Lemma and why is it important?
I'm sensing a lot of buzz about potential re-proofs of the Generalized Quantum Stein's Lemma, a generalization of the quantum counterpart to the classical Stein's Lemma, which is of some importance in statistical inference and hypothesis…
Mark Spinelli
- 15,378
- 3
- 26
- 83
10
votes
3 answers
What is a "maximally mixed state"?
What is meant by maximally mixed states? Does this mean that there are partially mixed states?
For example, consider $\rho_{GHZ} = \left| {GHZ} \right\rangle \left\langle {GHZ} \right|$ and $\rho_W = \left| {W} \right\rangle \left\langle {W}…
Bekaso
- 305
- 2
- 6
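As a quick numerical illustration of the question above (a minimal numpy sketch; the `vn_entropy` helper and the example states are my own choices): the single-qubit maximally mixed state $I/2$ has the maximal entropy of 1 bit, a "partially mixed" state sits strictly between 0 and 1 bit, and tracing $|GHZ\rangle$ down to one qubit again gives $I/2$.

```python
import numpy as np

def vn_entropy(rho):
    """Von Neumann entropy in bits: S(rho) = -Tr(rho log2 rho)."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]                 # drop numerical zeros
    return float(-np.sum(ev * np.log2(ev)))

# Maximally mixed single-qubit state I/2: entropy = 1 bit, the maximum.
max_mixed = np.eye(2) / 2

# A "partially mixed" state: mixed, but not maximally so (0 < S < 1).
partial = np.diag([0.9, 0.1])

# Reduced state of one qubit of |GHZ> = (|000> + |111>)/sqrt(2):
# tracing out the other two qubits also yields I/2.
ghz = np.zeros(8); ghz[0] = ghz[7] = 1 / np.sqrt(2)
rho_ghz = np.outer(ghz, ghz)
rho_A = np.trace(rho_ghz.reshape(2, 4, 2, 4), axis1=1, axis2=3)

print(vn_entropy(max_mixed))   # 1.0
print(vn_entropy(partial))     # ~0.469
print(vn_entropy(rho_A))       # 1.0
```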
10
votes
1 answer
Proof of a Holevo information inequality for a classical-classical-quantum channel
Suppose I have a classical-classical-quantum channel $W : \mathcal{X}\times\mathcal{Y} \rightarrow \mathcal{D}(\mathcal{H})$, where $\mathcal{X},\mathcal{Y}$ are finite sets and $\mathcal{D}(\mathcal{H})$ is the set of density matrices on finite…
Stephen Diadamo
- 155
- 5
9
votes
1 answer
Building Intuition for Relative Von Neumann Entropy
This is how I think about classical relative entropy: There is a variable that has distribution P, that is, outcome $i$ has probability $p_i$ of occurring, but someone mistakes it to be of a distribution Q instead, so when outcome $i$ occurs, instead…
Mahathi Vempati
- 1,731
- 10
- 21
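The coding interpretation described in that excerpt can be checked numerically. A minimal sketch (the distributions P and Q below are arbitrary examples of my own): the relative entropy $D(P\|Q)$ equals the cross-entropy minus the entropy, i.e. the extra expected code length paid when a code optimized for Q is used on data drawn from P.

```python
import numpy as np

# Classical relative entropy D(P||Q) in bits, read as the coding penalty:
# extra expected code length when outcomes from P are encoded with a code
# optimized for Q (cross-entropy minus entropy).
P = np.array([0.5, 0.25, 0.25])
Q = np.array([0.25, 0.25, 0.5])

entropy_P     = -np.sum(P * np.log2(P))       # optimal code length under P
cross_entropy = -np.sum(P * np.log2(Q))       # length paid using Q's code
rel_entropy   =  np.sum(P * np.log2(P / Q))   # D(P||Q)

print(cross_entropy - entropy_P)   # equals D(P||Q)
print(rel_entropy)                 # 0.25 bits for this example
```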
8
votes
4 answers
Maximally mixed states for more than 1 qubit
For 1 qubit, the maximally mixed state is $\frac{\mathrm{I}}{2}$.
So, for two qubits, I assume the maximally mixed state is $\frac{\mathrm{I}}{4}$?
Which is:
$\frac{1}{4} (|00\rangle \langle 00| + |01\rangle \langle…
Mahathi Vempati
- 1,731
- 10
- 21
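A short sanity check of that identification (plain numpy; the construction below is just one way to write it out): the equal mixture of the four computational-basis projectors is exactly $I/4$, and its entropy is 2 bits, the maximum for two qubits.

```python
import numpy as np

# The two-qubit maximally mixed state I/4, written as the equal mixture
# (1/4)(|00><00| + |01><01| + |10><10| + |11><11|).
basis = np.eye(4)                                  # rows are |00>, |01>, |10>, |11>
rho = sum(np.outer(b, b) for b in basis) / 4

print(np.allclose(rho, np.eye(4) / 4))             # True
evals = np.linalg.eigvalsh(rho)
print(-np.sum(evals * np.log2(evals)))             # 2.0 bits, the maximum for 2 qubits
```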
8
votes
1 answer
How does the conditional min-entropy $H_{\rm min}(A|B)_\rho$ relate to the conditional entropy $H(X|Y)_\rho$?
Suppose we have a classical-quantum state $\sum_x |x\rangle \langle x|\otimes \rho_x$; one can define the min-entropy $H_{\rm min}(A|B)_\rho$ via the best probability of guessing the outcome $x$ given $\rho_x$. How does this quantity relate to…
john_smith
- 81
- 1
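A hedged numerical sketch of the relation being asked about, for a simple two-symbol classical-quantum state (the Helstrom-bound shortcut below only applies to two hypotheses; in general the guessing probability needs an SDP, and smoothing is not treated here): the min-entropy $H_{\rm min}(A|B) = -\log_2 p_{\rm guess}$ comes out no larger than the von Neumann conditional entropy $H(A|B) = S(AB) - S(B)$.

```python
import numpy as np

def vn_entropy(rho):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

# A two-symbol cq-state: x = 0, 1 with p = 1/2 and conditional states rho_x.
p = [0.5, 0.5]
rho0 = np.array([[1.0, 0.0], [0.0, 0.0]])          # |0><0|
rho1 = np.array([[0.5, 0.5], [0.5, 0.5]])          # |+><+|

# Optimal guessing probability for two hypotheses (Helstrom bound):
# p_guess = 1/2 (1 + || p0*rho0 - p1*rho1 ||_1).
diff = p[0] * rho0 - p[1] * rho1
trace_norm = np.sum(np.abs(np.linalg.eigvalsh(diff)))
p_guess = 0.5 * (1 + trace_norm)
h_min = -np.log2(p_guess)                          # H_min(A|B)

# Von Neumann conditional entropy H(A|B) = S(AB) - S(B) of the cq-state.
rho_AB = sum(px * np.kron(np.outer(e, e), rho_x)
             for px, e, rho_x in zip(p, np.eye(2), [rho0, rho1]))
rho_B = p[0] * rho0 + p[1] * rho1
h_cond = vn_entropy(rho_AB) - vn_entropy(rho_B)

print(h_min, h_cond)      # H_min(A|B) <= H(A|B) for this cq-state
```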
8
votes
1 answer
What is "linear" in linear entropy?
Why is the linear entropy, defined by $S_L = 1 - \textrm{Tr} \rho^2$, called linear?
Rob
- 401
- 2
- 5
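One common explanation is that $S_L$ comes from linearizing the logarithm in the von Neumann entropy: $-\operatorname{Tr}\rho\ln\rho \approx -\operatorname{Tr}\rho(\rho - I) = 1 - \operatorname{Tr}\rho^2$. A small numpy comparison along those lines (natural log used on purpose so it matches the expansion; the family of states is my own example):

```python
import numpy as np

def vn_entropy_nats(rho):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log(ev)))      # -Tr(rho ln rho)

def linear_entropy(rho):
    return float(1 - np.trace(rho @ rho).real)  # S_L = 1 - Tr(rho^2)

# Compare on the single-qubit family rho(p) = p|0><0| + (1-p) I/2.
for p in [0.0, 0.3, 0.7, 1.0]:
    rho = p * np.diag([1.0, 0.0]) + (1 - p) * np.eye(2) / 2
    print(f"p={p:.1f}  S_L={linear_entropy(rho):.3f}  S={vn_entropy_nats(rho):.3f}")
```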
8
votes
1 answer
What are the thermodynamic limits of Shor's algorithm?
The asymptotic time complexity of Grover's algorithm is the square root of the time of a brute force algorithm. However, according to Perlner and Liu, the thermodynamic behavior (theoretical minimum on energy consumption) is asymptotically the same…
Nic
- 183
- 4
7
votes
1 answer
Degradable channels and their quantum capacity
Note: I'm reposting this question, which was deleted by the original author, so that we do not lose the existing answer there by Prof. Watrous. Further answers are obviously welcome.
I have two questions:
What are degradable channels?
Given…
Sanchayan Dutta
- 17,945
- 8
- 50
- 112
7
votes
1 answer
Understanding classical vs. quantum channel capacities
The classical channel capacity ($C_{ea}$) and the quantum channel capacity ($Q$) as defined here (eqs. 1 and 2) are given by
\begin{equation}
C_{ea} = \sup_{\rho} \Big[S(\rho) + S(\Phi_t \rho) - S(\rho,t)\Big],
\end{equation}
and…
Tobias Fritzn
- 721
- 4
- 11
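For concreteness, the bracketed quantity can be evaluated for a specific channel and input. A minimal sketch, assuming $S(\rho,t)$ denotes the entropy exchange, which can be computed as the entropy of the matrix $W_{ij} = \operatorname{Tr}(K_i\rho K_j^\dagger)$ for Kraus operators $\{K_i\}$ of the channel (the depolarizing channel and the maximally mixed input are arbitrary choices; the capacity itself takes a supremum over inputs):

```python
import numpy as np

def vn_entropy(m):
    ev = np.linalg.eigvalsh(m)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

# Qubit depolarizing channel Phi(rho) = (1-p) rho + p I/2, via Kraus operators.
p = 0.25
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1.0, -1.0]).astype(complex)
kraus = [np.sqrt(1 - 3 * p / 4) * I, *(np.sqrt(p / 4) * P for P in (X, Y, Z))]

rho = np.eye(2) / 2                               # maximally mixed input
out = sum(K @ rho @ K.conj().T for K in kraus)    # Phi(rho)

# Entropy exchange: entropy of W with W_ij = Tr(K_i rho K_j^dagger).
W = np.array([[np.trace(Ki @ rho @ Kj.conj().T) for Kj in kraus] for Ki in kraus])
s_exchange = vn_entropy(W)

# The bracketed quantity from the question, evaluated at this single input.
print(vn_entropy(rho) + vn_entropy(out) - s_exchange)
```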
7
votes
2 answers
Is the set of all states with negative conditional Von Neumann entropy convex?
I have read somewhere / heard that the set of all states that have non-negative conditional Von Neumann entropy forms a convex set. Is this true? Is there a proof for it?
Can anything be said about the reverse: the set of all states that have negative…
Mahathi Vempati
- 1,731
- 10
- 21
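A small numerical experiment related to that question (the `cond_entropy` helper and the Bell-state-plus-white-noise family are my own illustration, not a proof of convexity either way): the conditional entropy $S(A|B) = S(AB) - S(B)$ is $-1$ for a Bell state, $+1$ for the maximally mixed state, and changes sign somewhere along the line between them.

```python
import numpy as np

def vn_entropy(rho):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

def cond_entropy(rho_AB, dA=2, dB=2):
    """S(A|B) = S(AB) - S(B) for a bipartite state given as a dA*dB matrix."""
    rho_B = np.trace(rho_AB.reshape(dA, dB, dA, dB), axis1=0, axis2=2)
    return vn_entropy(rho_AB) - vn_entropy(rho_B)

bell = np.zeros(4); bell[0] = bell[3] = 1 / np.sqrt(2)
rho_bell = np.outer(bell, bell)          # S(A|B) = -1 (negative)
rho_mix = np.eye(4) / 4                  # S(A|B) = +1 (positive)

# Conditional entropy along the line between the two states.
for lam in np.linspace(0, 1, 6):
    rho = lam * rho_bell + (1 - lam) * rho_mix
    print(f"lambda={lam:.1f}  S(A|B)={cond_entropy(rho):+.3f}")
```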
7
votes
2 answers
How is the quantum relative entropy $S(\rho\|\sigma)$ defined when $\sigma$ is a pure state?
I want to evaluate the quantum relative entropy $S(\rho\|\sigma) = -\operatorname{tr}(\rho \log \sigma) - S(\rho)$, where $\sigma=|\Psi\rangle\langle\Psi|$ is a density matrix corresponding to a pure state and $\rho$ is a density matrix corresponding to…
Confinement
- 167
- 6
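A numerical way to see what goes wrong, sketched below (the regularization by a small multiple of the identity is my own device for illustration): when $\sigma$ is pure and $\rho$ has support outside it, $-\operatorname{tr}(\rho\log\sigma)$ grows without bound as the regularization is removed, which is why $S(\rho\|\sigma)$ is conventionally taken to be $+\infty$ in that case.

```python
import numpy as np
from scipy.linalg import logm

def rel_entropy(rho, sigma):
    """S(rho||sigma) = Tr(rho log2 rho) - Tr(rho log2 sigma)."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    s_rho = float(np.sum(ev * np.log2(ev)))                 # Tr(rho log2 rho)
    cross = float(np.trace(rho @ logm(sigma)).real) / np.log(2)
    return s_rho - cross

psi = np.array([1.0, 0.0])
sigma_pure = np.outer(psi, psi)                             # |0><0|, rank 1
rho = np.diag([0.9, 0.1])                                   # support outside sigma

# sigma is singular, so log(sigma) is ill-defined; regularize with eps*I/2
# and watch S(rho || sigma_eps) diverge as eps -> 0.
for eps in [1e-2, 1e-4, 1e-8]:
    sigma_eps = (1 - eps) * sigma_pure + eps * np.eye(2) / 2
    print(f"eps={eps:.0e}  S={rel_entropy(rho, sigma_eps):.2f}")
```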
7
votes
1 answer
How can the Holevo bound be used to show that $n$ qubits cannot transmit more than $n$ classical bits?
The inequality $\chi \le H(X)$ gives the upper bound on accessible information.
This much is clear to me. However, what isn't clear is how this tells me I cannot transmit more than $n$ bits of information.
I understand that if $\chi < H(X)$, then…
GaussStrife
- 1,193
- 8
- 14
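A quick numerical companion to this question (the four-state ensemble is an arbitrary example of my own): the Holevo quantity $\chi = S\big(\sum_x p_x\rho_x\big) - \sum_x p_x S(\rho_x)$ is bounded by $\log_2 d = n$ for states of $n$ qubits, even when the source entropy $H(X)$ is larger.

```python
import numpy as np

def vn_entropy(rho):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

def holevo_chi(probs, states):
    """chi = S(sum_x p_x rho_x) - sum_x p_x S(rho_x)."""
    avg = sum(p * r for p, r in zip(probs, states))
    return vn_entropy(avg) - sum(p * vn_entropy(r) for p, r in zip(probs, states))

# Four single-qubit pure states (n = 1 qubit) with H(X) = 2 bits of source
# entropy; chi still cannot exceed log2(dim) = 1 bit.
kets = [np.array([1, 0]), np.array([0, 1]),
        np.array([1, 1]) / np.sqrt(2), np.array([1, -1]) / np.sqrt(2)]
states = [np.outer(k, k) for k in kets]
probs = [0.25] * 4

print(holevo_chi(probs, states))    # 1.0 <= n = 1, even though H(X) = 2
```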
7
votes
0 answers
Schumacher compression - comparing with Shannon compression
Background
Shannon's source coding theorem tells us the following. We shall consider a binary alphabet for simplicity. Suppose Alice has $n$ independent and identically distributed instances of a random variable $X\in \{0, 1\}$. Let us call this…
user1936752
- 3,311
- 1
- 9
- 24
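A tiny worked number for the classical side of this comparison (binary alphabet; the bias $q$ is chosen arbitrarily): Shannon's source coding theorem says roughly $nH(q)$ bits suffice for $n$ i.i.d. biased bits.

```python
import numpy as np

# Shannon rate for a biased bit: n i.i.d. copies of X with Pr[X=1] = q
# compress (asymptotically, over the typical set) to about n*H(q) bits.
def binary_entropy(q):
    return float(-q * np.log2(q) - (1 - q) * np.log2(1 - q))

n, q = 1000, 0.1
print(f"H(X) = {binary_entropy(q):.3f} bits/symbol")
print(f"~{n * binary_entropy(q):.0f} bits suffice for n = {n} symbols "
      f"(vs {n} bits uncompressed)")
```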
6
votes
1 answer
Does computing the quantum mutual information $I(\rho^{AB})$ require full tomographic information of $\rho^{AB}$?
In discussions about quantum correlations, particularly beyond entanglement (discord, dissonance, etc.), one often encounters two definitions of the mutual information of a quantum system $\rho^{AB}$:
$$
I(\rho^{AB}) = S(\rho^A) + S(\rho^B) -…
Ilya
- 163
- 4
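A minimal sketch of the first definition mentioned in that excerpt (helper names are mine; the Bell-state-plus-noise example is arbitrary): $I(\rho^{AB}) = S(\rho^A) + S(\rho^B) - S(\rho^{AB})$, computed directly from the joint density matrix via partial traces, which is what makes the tomography question relevant.

```python
import numpy as np

def vn_entropy(rho):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

def mutual_information(rho_AB, dA=2, dB=2):
    """I(A:B) = S(A) + S(B) - S(AB), from the full joint density matrix."""
    r = rho_AB.reshape(dA, dB, dA, dB)
    rho_A = np.trace(r, axis1=1, axis2=3)
    rho_B = np.trace(r, axis1=0, axis2=2)
    return vn_entropy(rho_A) + vn_entropy(rho_B) - vn_entropy(rho_AB)

# A mixture of a Bell state with white noise.
bell = np.zeros(4); bell[0] = bell[3] = 1 / np.sqrt(2)
rho = 0.75 * np.outer(bell, bell) + 0.25 * np.eye(4) / 4

print(mutual_information(rho))    # between 0 (product state) and 2 (maximally entangled)
```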