
It is known that the coherent information, defined in terms of von Neumann entropies, is a lower bound on the quantum channel capacity. If we define the coherent information in terms of $\alpha$-Rényi entropies instead, would it still be a lower bound?

The $\alpha$-Rényi entropy is defined as $$S_{\alpha}(\rho) = \frac{1}{1-\alpha}\log \operatorname{Tr}(\rho^{\alpha}).$$ In the limit $\alpha \rightarrow 1$ we recover $S_{1}(\rho)$, the von Neumann entropy.
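For concreteness, here is a minimal numerical sketch of $S_{\alpha}$ and its $\alpha \rightarrow 1$ limit (assuming NumPy is available; the state `rho` below is just an arbitrary example):

```python
import numpy as np

def renyi_entropy(rho, alpha):
    """S_alpha(rho) = 1/(1-alpha) * log Tr(rho^alpha), in bits."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]                          # drop numerical zeros
    if np.isclose(alpha, 1.0):                   # alpha -> 1: von Neumann entropy
        return float(-np.sum(ev * np.log2(ev)))
    return float(np.log2(np.sum(ev ** alpha)) / (1.0 - alpha))

# Example: a slightly mixed qubit state (arbitrary choice for illustration)
rho = np.array([[0.7, 0.2],
                [0.2, 0.3]])
print(renyi_entropy(rho, 1.0))       # von Neumann entropy
print(renyi_entropy(rho, 1.0001))    # close to the alpha -> 1 limit
print(renyi_entropy(rho, 2.0))       # alpha = 2 (collision) entropy
```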

The coherent information of a quantum channel $\mathcal{N}$ with input state $\rho$ is defined as $$Q_{\alpha}(\rho, \mathcal{N}) = S_{\alpha}(\mathcal{N}(\rho)) - S_{\alpha}(\rho, \mathcal{N}), \qquad \alpha = 1,$$ where the second term is the entropy exchange (the entropy of the environment after the channel acts on a purification of $\rho$), and for $\alpha = 1$ it lower-bounds the quantum channel capacity.
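As a sanity check, both terms can be evaluated numerically from a Kraus decomposition $\{K_i\}$ of $\mathcal{N}$: the first term is the entropy of $\sum_i K_i \rho K_i^{\dagger}$, and the entropy exchange equals the entropy of the matrix $W_{ij} = \operatorname{Tr}(K_i \rho K_j^{\dagger})$. The sketch below uses an amplitude-damping channel and a maximally mixed input purely as an illustrative assumption:

```python
import numpy as np

def entropy(rho, alpha=1.0):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    if np.isclose(alpha, 1.0):
        return float(-np.sum(ev * np.log2(ev)))
    return float(np.log2(np.sum(ev ** alpha)) / (1.0 - alpha))

def coherent_information(rho, kraus, alpha=1.0):
    out = sum(K @ rho @ K.conj().T for K in kraus)        # N(rho)
    W = np.array([[np.trace(Ki @ rho @ Kj.conj().T)        # entropy exchange matrix
                   for Kj in kraus] for Ki in kraus])
    return entropy(out, alpha) - entropy(W, alpha)

# Amplitude-damping channel with damping probability gamma (example values)
gamma = 0.2
kraus = [np.array([[1, 0], [0, np.sqrt(1 - gamma)]]),
         np.array([[0, np.sqrt(gamma)], [0, 0]])]
rho = np.eye(2) / 2                                        # maximally mixed input
print(coherent_information(rho, kraus, alpha=1.0))         # von Neumann case
print(coherent_information(rho, kraus, alpha=2.0))         # the alpha = 2 variant asked about
```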

Are there any results for $Q_{\alpha}$ with $\alpha > 1$? I am mostly interested in the special case $\alpha = 2$.

MonteNero

0 Answers