For questions about the (quantum) relative entropy, the quantum version of the classical Kullback-Leibler divergence.
Questions tagged [relative-entropy]
30 questions
8
votes
1 answer
How to derive the quantum Fisher information from the relative entropy?
The quantum relative entropy (QRE) between two states $\rho$ and $\sigma$ is given by
$$
S(\rho\|\sigma)=\operatorname{Tr}(\rho\ln\rho)-\operatorname{Tr}(\rho\ln\sigma)
$$
Now if $\rho$ and $\sigma$ are infinitesimally related, i.e.,…
m1rohit
- 103
- 6
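A minimal numerical sketch of the QRE definition quoted above, assuming both states are full rank so the matrix logarithms exist (the example matrices are arbitrary, illustrative values):

```python
import numpy as np
from scipy.linalg import logm

def rel_entropy(rho, sigma):
    # S(rho||sigma) = tr(rho ln rho) - tr(rho ln sigma), in nats
    return np.real(np.trace(rho @ (logm(rho) - logm(sigma))))

# illustrative full-rank single-qubit states
rho = np.array([[0.7, 0.2],
                [0.2, 0.3]])
sigma = np.eye(2) / 2   # maximally mixed state
print(rel_entropy(rho, sigma))   # >= 0, and 0 iff rho == sigma
```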
5
votes
1 answer
How to calculate the conditional min-entropy via a semidefinite program?
I am trying to formulate the calculation of conditional min-entropy as a semidefinite program. However, so far I have not been able to do so. Different sources formulate it differently. For example, in this highly influential paper, it has been…
QuestionEverything
- 1,837
- 12
- 23
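One standard primal formulation is $H_{\min}(A|B)_\rho = -\log_2 \min\{\operatorname{tr}(\sigma_B) : \mathbb{1}_A\otimes\sigma_B \ge \rho_{AB}\}$. A hedged sketch of how this could be set up with cvxpy (assuming cvxpy with an SDP-capable solver such as SCS is installed; the helper name and example state are my own, illustrative choices):

```python
import numpy as np
import cvxpy as cp

def cond_min_entropy(rho_AB, dA, dB):
    # H_min(A|B) = -log2 min{ tr(sigma_B) : I_A (x) sigma_B >= rho_AB }
    # (real/symmetric case for brevity; complex states would need hermitian=True)
    sigma_B = cp.Variable((dB, dB), symmetric=True)
    constraints = [cp.kron(np.eye(dA), sigma_B) >> rho_AB]
    prob = cp.Problem(cp.Minimize(cp.trace(sigma_B)), constraints)
    prob.solve()
    return -np.log2(prob.value)

# maximally entangled two-qubit state: expect H_min(A|B) = -1
psi = np.array([1, 0, 0, 1]) / np.sqrt(2)
print(cond_min_entropy(np.outer(psi, psi), 2, 2))
```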
5
votes
1 answer
Is the quantum min-relative entropy $D_{\min}(\rho\|\sigma)=-\log(F(\rho, \sigma)^2)$ or $D_{\min}(\rho\|\sigma)=-\log(tr(\Pi_\rho\sigma))$?
In John Watrous' lectures, he defines the quantum min-relative entropy as
$$D_{\min}(\rho\|\sigma) = -\log(F(\rho, \sigma)^2),$$
where $F(\rho,\sigma) = tr(\sqrt{\rho\sigma})$. Here, I use this question and answer to make the definition simpler…
James
- 51
- 1
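The two candidate definitions in the title generally give different values, which a small numerical example makes concrete. The sketch below (illustrative, not from the question) uses the numerically friendlier root-fidelity form $F=\operatorname{tr}\sqrt{\sqrt{\rho}\,\sigma\sqrt{\rho}}$, which agrees with $\operatorname{tr}\sqrt{\rho\sigma}$ for states:

```python
import numpy as np
from scipy.linalg import sqrtm

def d_min_fid(rho, sigma):
    # -log2 F(rho,sigma)^2 with root-fidelity F = tr sqrt(sqrt(rho) sigma sqrt(rho))
    s = sqrtm(rho)
    F = np.real(np.trace(sqrtm(s @ sigma @ s)))
    return -np.log2(F ** 2)

def d_min_proj(rho, sigma):
    # -log2 tr(Pi_rho sigma), with Pi_rho the projector onto the support of rho
    w, v = np.linalg.eigh(rho)
    supp = v[:, w > 1e-12]
    return -np.log2(np.real(np.trace(supp @ supp.conj().T @ sigma)))

rho = np.eye(2) / 2              # full-rank state
sigma = np.diag([0.75, 0.25])
print(d_min_fid(rho, sigma), d_min_proj(rho, sigma))   # the two values differ
```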
5
votes
1 answer
Questions about the relation between max-relative entropy $D_{\max}(\rho||\sigma)$ and max-information
The max-relative entropy between two states is defined as
$$D_{\max }(\rho \| \sigma):=\log \min \{\lambda: \rho \leq \lambda \sigma\},$$
where $\rho\leq \sigma$ should be read as $\sigma - \rho$ is positive semidefinite. In other words, $D_{\max}$…
user1936752
- 3,311
- 1
- 9
- 24
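When $\sigma$ is full rank (so a finite feasible $\lambda$ exists), the minimization in this definition reduces to the largest eigenvalue of $\sigma^{-1/2}\rho\,\sigma^{-1/2}$, which gives a direct way to evaluate $D_{\max}$ numerically. A minimal sketch under that assumption (example states are arbitrary):

```python
import numpy as np
from scipy.linalg import fractional_matrix_power

def d_max(rho, sigma):
    # D_max(rho||sigma) = log2 of the largest eigenvalue of sigma^{-1/2} rho sigma^{-1/2}
    # (valid when sigma is full rank)
    s = fractional_matrix_power(sigma, -0.5)
    return np.log2(np.max(np.linalg.eigvalsh(s @ rho @ s)))

rho = np.diag([0.9, 0.1])
sigma = np.eye(2) / 2
print(d_max(rho, sigma))   # log2(1.8) ~ 0.848
```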
4
votes
1 answer
Subsystem dimension bound for the quantum relative entropy
I'm curious as to whether a statement of the following form can be proven:
$$
D(\rho_{AB} || \tau_{AB}) \leq D(\rho_{A}|| \tau_{A}) + |B|
$$
where $D(\cdot || \cdot )$ is the standard quantum relative entropy, and $\rho, \tau$ are both density…
loplo
- 83
- 3
4
votes
2 answers
Clarification about inverses in sandwiched Rényi divergence
The sandwiched Rényi divergence is defined as
$$
\tilde{D}_\alpha(\rho\|\sigma):=\frac{1}{\alpha-1}\log \operatorname{tr}\left[\left(\sigma^{\frac{1-\alpha}{2\alpha}}\rho\,\sigma^{\frac{1-\alpha}{2\alpha}}\right)^{\alpha}\right]
$$
The divergence measure takes on finite values when…
Peeveey
- 103
- 6
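As a quick numerical companion to the definition above, the sketch below (illustrative, not from the question) evaluates $\tilde{D}_\alpha$ with matrix fractional powers; it assumes $\sigma$ is full rank so that the negative powers appearing for $\alpha>1$ exist:

```python
import numpy as np
from scipy.linalg import fractional_matrix_power as mpow

def sandwiched_renyi(rho, sigma, alpha):
    # (1/(alpha-1)) log2 tr[(sigma^((1-alpha)/(2 alpha)) rho sigma^((1-alpha)/(2 alpha)))^alpha]
    s = mpow(sigma, (1 - alpha) / (2 * alpha))
    inner = s @ rho @ s
    return np.log2(np.real(np.trace(mpow(inner, alpha)))) / (alpha - 1)

rho = np.diag([0.8, 0.2])
sigma = np.eye(2) / 2
for a in (0.5, 2.0):
    print(a, sandwiched_renyi(rho, sigma, a))
```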
4
votes
1 answer
Does the quantum relative entropy have a direct operational interpretation?
Consider the quantum relative entropy, defined as
$$D(\rho\|\sigma) = \operatorname{tr}(\rho\log\rho)-\operatorname{tr}(\rho\log\sigma),$$
for all $\rho,\sigma\ge0$ such that $\operatorname{im}(\rho)\subseteq\operatorname{im}(\sigma)$.
It is…
glS
- 27,510
- 7
- 37
- 125
4
votes
1 answer
Data processing inequality for relative entropy in the presence of an amplitude damping channel
Consider the single qubit quantum depolarizing channel, given by
$$T(\rho) = (1- p)\rho + p \frac{\mathbb{I}}{2}. $$
For an $n$ qubit state $\rho$, according to Definition 6.1 of this paper, the channel satisfies a strong data processing inequality,…
BlackHat18
- 1,527
- 9
- 22
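For the channel written above, the (weak) data processing inequality $D(T(\rho)\,\|\,T(\sigma)) \le D(\rho\,\|\,\sigma)$ is easy to probe numerically. A minimal single-qubit sketch (arbitrary example states, assumed full rank so the logarithms exist):

```python
import numpy as np
from scipy.linalg import logm

def rel_ent(rho, sigma):
    # S(rho||sigma) in nats; assumes full-rank states
    return np.real(np.trace(rho @ (logm(rho) - logm(sigma))))

def depolarize(rho, p):
    # single-qubit depolarizing channel T(rho) = (1-p) rho + p I/2
    return (1 - p) * rho + p * np.eye(2) / 2

rho = np.array([[0.8, 0.3],
                [0.3, 0.2]])
sigma = np.diag([0.6, 0.4])
p = 0.3
print(rel_ent(rho, sigma), rel_ent(depolarize(rho, p), depolarize(sigma, p)))
# the second value never exceeds the first (data processing)
```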
3
votes
1 answer
Proof that the relative entropy satisfies $S(\rho\|\sigma)=S(T\rho\|T\sigma)$ iff $\hat TT\rho=\rho$, $\hat TT\sigma=\sigma$ for some $\hat T$
To prove the saturation condition for the strong subadditivity of the von Neumann entropy, the authors of [HJPW2004] make use of the following characterisation of when the monotonicity of the relative entropy is saturated: we…
glS
- 27,510
- 7
- 37
- 125
3
votes
0 answers
What are examples of states saturating the strong subadditivity of the von Neumann entropy?
A well-known property of classical distributions is that they satisfy strong subadditivity, meaning that for any tripartite joint probability distribution $p(x,y,z)$, we have the inequality
$$H(AB)+H(BC) - H(B) - H(ABC) \ge 0.$$
This can be…
glS
- 27,510
- 7
- 37
- 125
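One family of saturating examples already shows up classically (and hence for diagonal quantum states): whenever $X\!-\!Y\!-\!Z$ forms a Markov chain, the left-hand side equals the conditional mutual information $I(X;Z|Y)=0$. A small numerical illustration of that classical case (the distributions below are arbitrary):

```python
import numpy as np

def H(p):
    # Shannon entropy (base 2) of a flattened probability array
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

# Markov chain X - Y - Z: p(x,y,z) = p(x,y) p(z|y)
p_xy = np.array([[0.3, 0.2],
                 [0.1, 0.4]])
p_z_given_y = np.array([[0.6, 0.4],
                        [0.1, 0.9]])          # rows indexed by y
p = p_xy[:, :, None] * p_z_given_y[None, :, :]  # p[x, y, z]

lhs = (H(p.sum(axis=2).ravel()) + H(p.sum(axis=0).ravel())
       - H(p.sum(axis=(0, 2))) - H(p.ravel()))
print(lhs)   # ~ 0: strong subadditivity is saturated for this Markov chain
```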
3
votes
0 answers
On the use of $\log(P\otimes Q)= \log P\otimes I+I\otimes\log Q$ for relations between entropic quantities. What if $P,Q$ are only semidefinite?
Many properties of entropic quantities are shown by resorting to related properties of the relative entropy of suitable quantities. For instance, subadditivity of entropy may follow from non-negativity of relative entropy by observing…
atlantropa
- 61
- 4
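For positive-definite (full-rank) $P$ and $Q$ the identity is straightforward to verify numerically; the subtlety the question raises is precisely that $\log$ is not defined on singular matrices. A quick check of the full-rank case (the matrices are arbitrary):

```python
import numpy as np
from scipy.linalg import logm

P = np.diag([0.6, 0.4])   # positive definite
Q = np.diag([0.7, 0.3])   # positive definite
lhs = logm(np.kron(P, Q))
rhs = np.kron(logm(P), np.eye(2)) + np.kron(np.eye(2), logm(Q))
print(np.allclose(lhs, rhs))   # True; for singular P or Q, logm is not even defined
```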
3
votes
1 answer
Which quantum entropies are meaningful with respect to continuous distributions of states?
When using a quantum channel to transmit classical information, we consider an ensemble $\mathcal{E} = \{(\rho_x, p(x))\}$ consisting of states $\rho_x$ labelled with a symbol $x$ from a finite alphabet $\Sigma$, each of which is associated with a…
forky40
- 7,988
- 2
- 12
- 33
3
votes
2 answers
What is the conditional min-entropy for diagonal ("classical") matrices?
The conditional min-entropy, discussed e.g. in these notes by Watrous, as well as in this other post, can be defined as
$$\mathsf{H}_{\rm min }(\mathsf{X} \mid \mathsf{Y})_{\rho}\equiv -\inf _{\sigma \in \mathsf{D}(\mathcal Y)} \mathsf{D}_{\rm max…
glS
- 27,510
- 7
- 37
- 125
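For diagonal (classical) $\rho_{XY}$ with joint distribution $p(x,y)$, the optimization collapses to the guessing probability, $H_{\min}(X|Y) = -\log_2\sum_y \max_x p(x,y)$. A one-function sketch (the example distribution is arbitrary):

```python
import numpy as np

def h_min_classical(p_xy):
    # p_xy[x, y] = joint probability; H_min(X|Y) = -log2 sum_y max_x p(x, y)
    return -np.log2(p_xy.max(axis=0).sum())

p = np.array([[0.4, 0.1],
              [0.1, 0.4]])
print(h_min_classical(p))   # -log2(0.8) ~ 0.322
```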
3
votes
1 answer
Quasi-concavity of max-relative entropy?
The max-relative entropy between two states is defined as
$$D_{\max }(\rho \| \sigma):=\log \min \{\lambda: \rho \leq \lambda \sigma\}.$$
It is known that the max-relative entropy is quasi-convex. That is, for $\rho=\sum_{i \in I} p_{i} \rho_{i}$…
user1936752
- 3,311
- 1
- 9
- 24
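Quasi-convexity, $D_{\max}\!\left(\sum_i p_i\rho_i \,\big\|\, \sum_i p_i\sigma_i\right) \le \max_i D_{\max}(\rho_i\|\sigma_i)$, is easy to probe numerically using the eigenvalue form of $D_{\max}$ (valid for full-rank $\sigma$; the example states below are arbitrary):

```python
import numpy as np
from scipy.linalg import fractional_matrix_power as mpow

def d_max(rho, sigma):
    # D_max(rho||sigma) = log2 lambda_max(sigma^{-1/2} rho sigma^{-1/2}), sigma full rank
    s = mpow(sigma, -0.5)
    return np.log2(np.max(np.linalg.eigvalsh(s @ rho @ s)))

p = [0.5, 0.5]
rhos = [np.diag([0.9, 0.1]), np.diag([0.2, 0.8])]
sigmas = [np.eye(2) / 2, np.diag([0.4, 0.6])]
mix = lambda ops: sum(w * o for w, o in zip(p, ops))
print(d_max(mix(rhos), mix(sigmas)), max(d_max(r, s) for r, s in zip(rhos, sigmas)))
# quasi-convexity: the first number does not exceed the second
```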
2
votes
1 answer
What are non-standard ways to describe the distance between states?
I understand that when comparing two arbitrary quantum states, one may use various measures to encapsulate the difference between states, such as trace distance, fidelity or quantum relative entropy. I was wondering whether there are other…
milo
- 39
- 5