
On Wikipedia, the max-entropy for classical systems is defined as

$$H_{0}(A)_{\rho}=\log \operatorname{rank}\left(\rho_{A}\right)$$

The term max-entropy in quantum information is reserved for the following definition

$$H_{\max }(A)_{\rho}=2 \cdot \log \operatorname{tr}\left[\rho_{A}^{1 / 2}\right]$$

While these are just definitions, they go by the same name so is there a relationship between them?

What I know

The only thing I managed to prove was that $H_0(A)_\rho \geq H_{\max}(A)_\rho$. The proof is below. Let $\lambda_1, \dots, \lambda_r$ be the nonzero eigenvalues of $\rho_A$, where $r$ is the rank of $\rho_A$. By Cauchy–Schwarz (or the concavity of the square root), $\sum_{i=1}^r \lambda_i^{1/2} \leq r^{1/2}\left(\sum_{i=1}^r \lambda_i\right)^{1/2} = r^{1/2}$, so

\begin{align} H_{\max}(A)_\rho &= 2\log\left(\lambda_1^{1/2} + \dots + \lambda_r^{1/2}\right)\\ &\leq 2\log r^{1/2}\\ &= \log r\\ &= H_0(A)_\rho. \end{align}
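The inequality above is easy to check numerically. Below is a minimal sketch (using NumPy, with entropies in bits; the dimensions and the rank-3 construction are just illustrative choices) that builds a random rank-deficient density matrix and compares the two quantities:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random rank-deficient density matrix on a 4-dimensional system:
# rho = G G† / tr(G G†) with G of shape (4, 3) has rank at most 3.
G = rng.normal(size=(4, 3)) + 1j * rng.normal(size=(4, 3))
rho = G @ G.conj().T
rho /= np.trace(rho).real

lam = np.linalg.eigvalsh(rho)
lam = lam[lam > 1e-12]          # keep the r nonzero eigenvalues

H0 = np.log2(len(lam))                      # H_0 = log rank(rho)
Hmax = 2 * np.log2(np.sum(np.sqrt(lam)))    # H_max = 2 log tr(rho^{1/2})

assert Hmax <= H0 + 1e-9        # H_max <= H_0, as proved above
```

Here $H_0 = \log_2 3$, and $H_{\max}$ falls below it whenever the spectrum is not flat on its support.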

Is there perhaps a reverse version of this inequality

$$H_{\max}(A)_\rho\geq H_0(A)_\rho + \text{something}$$

which would justify using the same name for both quantities?

glS
user1936752

1 Answer


The term max-entropy in quantum information is reserved for the following definition

No, it's not: many papers, e.g. https://arxiv.org/abs/0803.2770, use the term to refer to the quantity $\log \mathrm{rank}(\rho)$. Your first definition is the Rényi entropy of order $0$, while the second is the Rényi entropy of order $\frac{1}{2}$; you should always check which one a given author is referring to.
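To make the Rényi connection concrete: both definitions are special cases of $H_\alpha(\rho) = \frac{1}{1-\alpha}\log \operatorname{tr}(\rho^\alpha)$. A small sketch (entropies in bits; the example spectrum is arbitrary) showing that $\alpha \to 0$ recovers $\log\operatorname{rank}(\rho)$ and $\alpha = \frac{1}{2}$ recovers $2\log\operatorname{tr}(\rho^{1/2})$:

```python
import numpy as np

def renyi_entropy(eigs, alpha):
    """Rényi entropy H_alpha = log2(sum_i lambda_i^alpha) / (1 - alpha),
    computed over the nonzero eigenvalues of a density matrix."""
    eigs = eigs[eigs > 1e-12]
    return np.log2(np.sum(eigs ** alpha)) / (1 - alpha)

lam = np.array([0.7, 0.2, 0.1])   # spectrum of a rank-3 state

H0 = renyi_entropy(lam, 1e-9)     # alpha -> 0: each lambda_i^alpha -> 1,
                                  # so H_alpha -> log2(rank) = log2(3)
Hhalf = renyi_entropy(lam, 0.5)   # alpha = 1/2: 2 * log2(tr rho^{1/2})

assert np.isclose(H0, np.log2(3), atol=1e-6)
assert np.isclose(Hhalf, 2 * np.log2(np.sum(np.sqrt(lam))))
```

Since $H_\alpha$ is non-increasing in $\alpha$, the inequality $H_0 \geq H_{1/2} = H_{\max}$ you proved is a special case of the monotonicity of the Rényi family.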

user13507