
Quantum discord of a bipartite system can be determined as:

$${D_A}({\rho _{AB}}) = I({\rho _{AB}}) - {J_A}({\rho _{AB}}),$$

where the subscript $A$ denotes that the measurement is performed on subsystem $A$. The mutual information is defined as

$$I({\rho _{AB}}) = S({\rho _A}) + S({\rho _B}) - S({\rho _{AB}}),$$

and the classical correlation as

$${J_A}({\rho _{AB}}) = S({\rho _B}) - \mathop {\min }\limits_{\{ \Pi _i^A\} } \sum\limits_i {p_i}\,S({\rho _{B|i}}),$$

where the post-measurement conditional states and outcome probabilities are

$${\rho _{B|i}} = \frac{1}{p_i}\,\mathrm{tr}_A\!\left[ \left( \Pi _i^A \otimes {I_B} \right){\rho _{AB}}\left( \Pi _i^A \otimes {I_B} \right) \right], \qquad {p_i} = \mathrm{tr}\!\left[ \left( \Pi _i^A \otimes {I_B} \right){\rho _{AB}} \right].$$
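The mutual-information part of the definitions above is straightforward to compute numerically. Below is a minimal NumPy sketch (QuTiP's `entropy_vn` and `Qobj.ptrace` could play the same roles); the function names and the two-qubit dimensions are my own choices for illustration, not from any particular package.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -tr(rho log2 rho), computed from the eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop (near-)zero eigenvalues to avoid log(0)
    return float(-np.sum(evals * np.log2(evals)))

def partial_trace(rho, keep, dims=(2, 2)):
    """Trace out one subsystem of a bipartite density matrix.
    keep=0 returns rho_A, keep=1 returns rho_B."""
    dA, dB = dims
    r = rho.reshape(dA, dB, dA, dB)       # indices: (i, j | k, l)
    if keep == 0:
        return np.einsum('ijkj->ik', r)   # sum over the B index
    return np.einsum('ijik->jk', r)       # sum over the A index

def mutual_information(rho_AB, dims=(2, 2)):
    """I(rho_AB) = S(rho_A) + S(rho_B) - S(rho_AB)."""
    S_A = von_neumann_entropy(partial_trace(rho_AB, 0, dims))
    S_B = von_neumann_entropy(partial_trace(rho_AB, 1, dims))
    return S_A + S_B - von_neumann_entropy(rho_AB)
```

For a maximally entangled two-qubit state this gives $I = 2$, and for any product state it gives $0$, as it should.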

How does one calculate this quantity computationally, say using QuTiP or some other package/software? Is there existing code for this? Can anyone suggest an algorithm? Doing this analytically for an arbitrary density matrix $\rho_{AB}$ is not feasible.

Paranoid

1 Answer


I'm surprised this hasn't been answered. QuTiP does appear to have the components for computing discord already in place, in the form of the mutual and conditional entropies. However, its conditional entropy is not the expression in the OP. You can write the classical correlation as

$$ J(A|B) = S(\rho) - \min_{\{\pi_j\}}\sum_j p_j\, S(\rho_{B|\pi_j}), $$

where $S(\rho)$ is the von Neumann entropy. The last term is a variational minimum of the average entropy over the set of POVM operators acting on the $A$ subspace, with conditional states

$$ \rho_{B|\pi_j} = \frac{1}{p_j}\,\mathrm{tr}_A(\pi_j\rho\,\pi_j). $$

The $\pi_j$ operators carry the constraints

$$ \sum_j \pi_j = I, \qquad \pi_j \ge 0. $$

$S(\rho_{B|\pi_j})$ is then the von Neumann entropy of $\rho_{B|\pi_j}$, and $p_j = \mathrm{tr}(\pi_j\rho)$ is the probability of outcome $j$. Since the $p_j$'s are probabilities, they automatically satisfy

$$ \sum_j p_j = 1. $$
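The minimization above can be sketched concretely for two qubits. Here is one possible implementation using NumPy and `scipy.optimize.minimize`; it restricts the measurement to rank-one projectors on $A$ parametrized by Bloch angles (a common simplification rather than the full POVM search), and the multi-start loop is my own guard against local minima, not part of any standard API.

```python
import numpy as np
from scipy.optimize import minimize

def _entropy(rho):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

def _avg_cond_entropy(angles, rho_AB):
    """Average entropy of B after a projective measurement on qubit A,
    with the measurement basis parametrized by Bloch angles (theta, phi)."""
    th, ph = angles
    ket = np.array([np.cos(th / 2), np.exp(1j * ph) * np.sin(th / 2)])
    ket_perp = np.array([-ket[1].conj(), ket[0].conj()])   # orthonormal partner
    total = 0.0
    for v in (ket, ket_perp):
        P = np.kron(np.outer(v, v.conj()), np.eye(2))      # Pi_j^A (x) I_B
        p = np.real(np.trace(P @ rho_AB))                  # outcome probability
        if p > 1e-12:
            sub = (P @ rho_AB @ P).reshape(2, 2, 2, 2)
            rho_B = np.einsum('ijik->jk', sub) / p         # conditional state of B
            total += p * _entropy(rho_B)
    return total

def classical_correlation(rho_AB, n_starts=8):
    """J_A = S(rho_B) - min over measurements of the average conditional entropy."""
    rho_B = np.einsum('ijik->jk', rho_AB.reshape(2, 2, 2, 2))
    rng = np.random.default_rng(0)
    best = np.inf
    for _ in range(n_starts):                              # multi-start against local minima
        x0 = rng.uniform([0, 0], [np.pi, 2 * np.pi])
        res = minimize(_avg_cond_entropy, x0, args=(rho_AB,), method='Nelder-Mead')
        best = min(best, res.fun)
    return _entropy(rho_B) - best
```

A quick sanity check: for a product state every conditional state of $B$ equals $\rho_B$, so $J = 0$; for a Bell state any measurement basis leaves $B$ pure, so $J = 1$.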

The problem is that the sum in $J(A|B)$ requires sampling over measurement operators and then optimizing. This is fairly easy for a bipartite system. For example, for a two-level system you could take $\pi_1 = |n\rangle\langle n|$ and $\pi_2 = |n_\perp\rangle\langle n_\perp|$ for some orthonormal basis $\{|n\rangle, |n_\perp\rangle\}$, with $\lambda = \mathrm{tr}(\pi_1\rho)$ the probability for outcome 1 and $(1-\lambda)$ for outcome 2. Plug these into the expression and minimize over the choice of basis.
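Putting the pieces together, a complete discord calculation for two qubits might look like the sketch below. The Werner-state helper and all function names are my own for illustration; the optimization again uses Bloch-angle-parametrized projectors on $A$ rather than general POVMs.

```python
import numpy as np
from scipy.optimize import minimize

def _S(rho):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-(ev * np.log2(ev)).sum())

def _ptrace(rho, keep):
    r = rho.reshape(2, 2, 2, 2)
    return np.einsum('ijkj->ik', r) if keep == 0 else np.einsum('ijik->jk', r)

def discord(rho_AB, n_starts=8):
    """D_A = I(rho_AB) - J_A(rho_AB), minimizing over projective
    measurements on qubit A parametrized by Bloch angles."""
    I = _S(_ptrace(rho_AB, 0)) + _S(_ptrace(rho_AB, 1)) - _S(rho_AB)

    def avg_cond_S(x):
        th, ph = x
        k = np.array([np.cos(th / 2), np.exp(1j * ph) * np.sin(th / 2)])
        out = 0.0
        for v in (k, np.array([-k[1].conj(), k[0].conj()])):
            P = np.kron(np.outer(v, v.conj()), np.eye(2))
            p = np.real(np.trace(P @ rho_AB))
            if p > 1e-12:
                out += p * _S(_ptrace(P @ rho_AB @ P, 1) / p)
        return out

    rng = np.random.default_rng(1)
    best = min(minimize(avg_cond_S, rng.uniform([0, 0], [np.pi, 2 * np.pi]),
                        method='Nelder-Mead').fun
               for _ in range(n_starts))
    J = _S(_ptrace(rho_AB, 1)) - best
    return I - J

def werner(p):
    """Werner state: p |Phi+><Phi+| + (1 - p) I/4."""
    phi = np.zeros(4); phi[0] = phi[3] = 1 / np.sqrt(2)
    return p * np.outer(phi, phi) + (1 - p) * np.eye(4) / 4
```

At the endpoints the answer is known exactly: `discord(werner(1.0))` is 1 (a Bell state) and `discord(werner(0.0))` is 0 (the maximally mixed state), which makes a useful check of the optimizer.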

So, my PhD student suggested defining a "classical" mutual entropy: take the full density matrix, strip off the off-diagonal parts, and compute the von Neumann entropies from what remains. It works, generalizes to any dimension, and gives a pretty good estimate of the classical correlation.
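That diagonal-stripping estimate is a one-liner on top of the entropy and partial-trace helpers; here is a sketch (function names my own), which implicitly fixes the computational basis as the measurement basis, so it is an estimate rather than the true minimized classical correlation.

```python
import numpy as np

def _S(rho):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-(ev * np.log2(ev)).sum())

def _ptrace(rho, keep, dims=(2, 2)):
    dA, dB = dims
    r = rho.reshape(dA, dB, dA, dB)
    return np.einsum('ijkj->ik', r) if keep == 0 else np.einsum('ijik->jk', r)

def classical_mutual_info_diag(rho_AB, dims=(2, 2)):
    """Estimate the classical correlation by keeping only the diagonal of
    rho_AB (in the computational basis) and computing its mutual information."""
    diag = np.diag(np.diag(rho_AB))   # strip off-diagonal coherences
    return (_S(_ptrace(diag, 0, dims)) + _S(_ptrace(diag, 1, dims))
            - _S(diag))
```

For a Bell state this returns 1, matching the true classical correlation in that case.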

E R Bittner