
In discussions of quantum correlations, particularly those beyond entanglement (discord, dissonance, etc.), one often encounters two definitions of the mutual information of a quantum state $\rho^{AB}$: $$ I(\rho^{AB}) = S(\rho^A) + S(\rho^B) - S(\rho^{AB}) $$ and $$ J(\rho^{AB}) = S(\rho^A)-S_{\{\Pi^B_j\}}(\rho^{A|B}), $$ where $S$ is the von Neumann entropy, $\rho^A$ and $\rho^B$ are the reduced states of the individual subsystems of $\rho^{AB}$, and the second term in $J$ is the quantum analogue of the conditional entropy, $$ S_{\{\Pi^B_j\}}(\rho^{A|B}) = \sum_j p_j S(\rho^{A|\Pi^B_j}). $$ Here $\rho^{A|\Pi^B_j} = \text{Tr}_B[\rho^{AB} (\mathbb{I}^A \otimes \Pi^B_j )]/p_j $ is the state of subsystem $A$ after obtaining the outcome associated with the projector $\Pi^B_j$ on $B$, which happens with probability $p_j = \text{Tr}[\rho^{AB} (\mathbb{I}^A \otimes \Pi^B_j ) ]$. While $I$ characterizes the total correlations between $A$ and $B$, the second expression involves a measurement process in which the non-classical features of $\rho^{AB}$ are lost, so $J$ characterizes the classical correlations in $\rho^{AB}$.
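For concreteness, here is a minimal numpy sketch evaluating $I$ for a two-qubit Werner state; the `entropy` and `partial_trace` helpers and the choice of state are illustrative, not part of the definitions above:

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy S(rho) in bits, from the eigenvalues of rho."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]                    # discard numerical zeros
    return -np.sum(ev * np.log2(ev))

def partial_trace(rho_ab, keep):
    """Reduced state of a two-qubit density matrix; keep = 'A' or 'B'."""
    r = rho_ab.reshape(2, 2, 2, 2)         # indices (a, b, a', b')
    return np.einsum('ijkj->ik', r) if keep == 'A' else np.einsum('ijik->jk', r)

# Example: two-qubit Werner state rho = p |Phi+><Phi+| + (1 - p) I/4
p = 0.5
phi = np.zeros(4); phi[0] = phi[3] = 1 / np.sqrt(2)
rho_ab = p * np.outer(phi, phi) + (1 - p) * np.eye(4) / 4

rho_a, rho_b = partial_trace(rho_ab, 'A'), partial_trace(rho_ab, 'B')
I = entropy(rho_a) + entropy(rho_b) - entropy(rho_ab)
print(I)    # I(rho_AB) ~ 0.451 bits for p = 0.5
```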

While measuring $J$ is relatively straightforward (for two qubits one can simply measure the four probabilities $p(\Pi^A_i \Pi^B_j)$, $i,j = 1,2$, and compute the mutual information of the resulting probability distribution), I can't think of an easy way of estimating $I$. So my question is: is it possible to measure $I$ without performing a full tomography of $\rho^{AB}$?
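For illustration, a sketch of that classical post-measurement step; the counts below are made up, standing in for measured data:

```python
import numpy as np

def classical_mutual_information(p_joint):
    """Mutual information (in bits) of a joint outcome distribution p(i, j)."""
    px, py = p_joint.sum(axis=1), p_joint.sum(axis=0)
    return sum(p_joint[i, j] * np.log2(p_joint[i, j] / (px[i] * py[j]))
               for i, j in np.ndindex(p_joint.shape) if p_joint[i, j] > 0)

# Hypothetical counts for the four outcomes (Pi^A_i, Pi^B_j), i, j = 1, 2
counts = np.array([[4800., 200.],
                   [250., 4750.]])
print(classical_mutual_information(counts / counts.sum()))
```

(In the discord literature $J$ is additionally maximized over the choice of $\{\Pi^B_j\}$, but that only adds a classical optimization on top of the same measurements.)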

glS
Ilya

1 Answer


The mutual information can be written in terms of the relative entropy; see Nielsen and Chuang (the entropy Venn diagram, Figure 11.2). In the question's notation: $$I(\rho^{AB}) = S(\rho^{AB}\,\|\,\rho^{A} \otimes \rho^{B})$$ The relative entropy can be estimated without full tomography. The procedure is described in Bengtsson and Życzkowski (equations 12.55–12.59), based on Lindblad's work:
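As a quick numerical check of this identity, here is a sketch assuming full-rank states (so that `scipy.linalg.logm` is well defined), reusing the Werner-state example from the question:

```python
import numpy as np
from scipy.linalg import logm

def entropy(rho):
    """Von Neumann entropy in bits."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return -np.sum(ev * np.log2(ev))

def relative_entropy(rho, sigma):
    """S(rho || sigma) in bits; assumes rho and sigma are full rank."""
    return np.trace(rho @ (logm(rho) - logm(sigma))).real / np.log(2)

# Werner state and its (maximally mixed) marginals
p = 0.5
phi = np.zeros(4); phi[0] = phi[3] = 1 / np.sqrt(2)
rho_ab = p * np.outer(phi, phi) + (1 - p) * np.eye(4) / 4
r = rho_ab.reshape(2, 2, 2, 2)
rho_a, rho_b = np.einsum('ijkj->ik', r), np.einsum('ijik->jk', r)

lhs = entropy(rho_a) + entropy(rho_b) - entropy(rho_ab)   # I(rho_AB)
rhs = relative_entropy(rho_ab, np.kron(rho_a, rho_b))     # S(rho_AB || rho_A x rho_B)
print(lhs, rhs)                                           # both ~ 0.451
```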

The estimation of $S(\rho\,\|\,\sigma)$ proceeds as follows:

  1. Prepare a composite system of $N$ copies: $$\rho^N = \rho^{\otimes N}$$
  2. Measure a POVM $\{E_i\}$ on the copies of each state: $$p_i = \operatorname{Tr}(\rho^N E_i)$$ $$q_i = \operatorname{Tr}(\sigma^N E_i)$$
  3. Compute the "classical" relative entropy of the outcome distributions: $$S_N(\rho\,\|\,\sigma) = \frac{1}{N} \sum_i p_i \log{\frac{p_i}{q_i}}$$

The relative entropy is then recovered by optimizing over POVMs and taking a large number of copies $N$, owing to the result: $$ S(\rho\,\|\,\sigma) = \lim_{N\rightarrow \infty}\,\sup_{\{E_i\}} S_N(\rho\,\|\,\sigma) $$
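As a toy illustration of steps 2–3 for $N = 1$ with a fixed projective measurement, continuing the Werner-state example from the question: since that state commutes with $\rho^A \otimes \rho^B$, measuring in their common (Bell) eigenbasis already attains the supremum here, whereas in general collective measurements on many copies are required:

```python
import numpy as np

def measured_relative_entropy(rho, sigma, basis):
    """S_N for N = 1: Born probabilities in an orthonormal basis (columns),
    then the classical relative entropy sum_i p_i log2(p_i / q_i)."""
    p = np.array([np.real(v.conj() @ rho @ v) for v in basis.T])
    q = np.array([np.real(v.conj() @ sigma @ v) for v in basis.T])
    mask = p > 1e-12
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

# Werner state, its product of marginals, and the Bell basis (as columns)
pw = 0.5
phi = np.zeros(4); phi[0] = phi[3] = 1 / np.sqrt(2)
rho_ab = pw * np.outer(phi, phi) + (1 - pw) * np.eye(4) / 4
sigma = np.eye(4) / 4                     # rho_A x rho_B for this state
bell = np.array([[1, 0, 0, 1],
                 [1, 0, 0, -1],
                 [0, 1, 1, 0],
                 [0, 1, -1, 0]], dtype=float).T / np.sqrt(2)

print(measured_relative_entropy(rho_ab, sigma, bell))   # ~ 0.451 = I(rho_AB)
```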

Of course, as in any statistical estimation, there are errors due to finite samples; however, I don't know how to obtain these error bounds.

David Bar Moshe