
Suppose one of two fixed density matrices $\rho_1$ and $\rho_2$ is prepared, each with probability $1/2$. Can we say anything about the minimum number of measurements required to distinguish the two states?

One approach I was thinking of uses the fact that the maximum probability of guessing correctly is $p_{success} = 1/2 + \Vert \rho_1 - \rho_2 \Vert_1/4$ (the Helstrom bound; $\frac{1}{2}\Vert \rho_1 - \rho_2 \Vert_1$ is the trace distance). Since the trace distance between two random density matrices is generically exponentially small, the success probability will be only exponentially close to $1/2$. From here, I'm lost as to how to connect this to the number of measurements. Is there a rigorous way to say something about the (minimum) number of measurements? This is a single-copy measurement setting, by the way.
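For concreteness, here is a quick numpy sketch of the effect I mean (the function names and the ensemble choice, states induced by tracing out a tunable environment, are mine and just one illustration): the bias of $p_{success}$ above $1/2$ shrinks as the environment grows.

```python
import numpy as np

def induced_random_state(d, d_env, rng):
    # Partial trace over a d_env-dimensional environment of a Haar-random
    # pure state, i.e. G G^dag / Tr(G G^dag) with G a d x d_env Ginibre matrix.
    g = rng.standard_normal((d, d_env)) + 1j * rng.standard_normal((d, d_env))
    m = g @ g.conj().T
    return m / np.trace(m).real

def helstrom_p(rho1, rho2):
    # Optimal single-shot success probability: 1/2 + ||rho1 - rho2||_1 / 4,
    # with the trace norm computed from the eigenvalues of the Hermitian difference.
    return 0.5 + np.abs(np.linalg.eigvalsh(rho1 - rho2)).sum() / 4

rng = np.random.default_rng(0)
d = 16
for d_env in [16, 256, 4096]:
    rho1 = induced_random_state(d, d_env, rng)
    rho2 = induced_random_state(d, d_env, rng)
    print(f"d_env = {d_env:5d}: p_success = {helstrom_p(rho1, rho2):.4f}")
```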


1 Answer


Settings

In your setting, you assume you are given $n$ copies of some state $\rho$, which is either $\rho_1$ or $\rho_2$ with equal probability. The setting you have chosen is also symmetric in another way: you treat the error of misidentifying $\rho_1$ as $\rho_2$ the same as the error of misidentifying $\rho_2$ as $\rho_1$. You are not allowed joint measurements on $\rho^{\otimes n}$. You could also consider adaptive strategies, where the results of earlier measurements are used to choose subsequent ones, but in my answer I assume this is not allowed either. So you make $n$ independent measurements.

Many of these assumptions can be relaxed, and some of the possibilities are covered in Chapter 7 of Tomamichel's book, Quantum Information Processing with Finite Resources.

Experimental procedure

You perform the Helstrom measurement independently on each of the $n$ copies, obtaining $s$ outcomes that point to one of the two states (say $\rho_1$) and $n-s$ outcomes that point to the other.
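To make this concrete, here is a minimal simulation sketch (function names are mine, and the qubit states are arbitrary illustrative choices): it builds the Helstrom measurement as the projector onto the positive eigenspace of $\rho_1 - \rho_2$ and draws $n$ independent outcomes.

```python
import numpy as np

def helstrom_povm(rho1, rho2):
    # Two-outcome Helstrom measurement: E1 is the projector onto the
    # positive eigenspace of rho1 - rho2; outcome 1 means "guess rho1".
    vals, vecs = np.linalg.eigh(rho1 - rho2)
    pos = vecs[:, vals > 0]
    e1 = pos @ pos.conj().T
    return e1, np.eye(rho1.shape[0]) - e1

def count_outcomes(rho_true, e1, n, rng):
    # n independent single-copy measurements; returns s, the number of
    # outcomes pointing to rho1. Each shot is a Bernoulli(Tr(E1 rho_true)) trial.
    return rng.binomial(n, np.trace(e1 @ rho_true).real)

# Arbitrary single-qubit example:
rho1 = np.array([[0.6, 0.2], [0.2, 0.4]], dtype=complex)
rho2 = np.array([[0.5, 0.0], [0.0, 0.5]], dtype=complex)
e1, _ = helstrom_povm(rho1, rho2)
rng = np.random.default_rng(1)
print(count_outcomes(rho1, e1, n=1000, rng=rng))  # concentrates near n * Tr(E1 rho1)
```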

Probability of correct guess

Your probability of correctly identifying the state, given these outcomes, is the posterior probability $P_w$; the probability of guessing wrongly is $P_l = 1 - P_w$. Writing $p = \frac{1}{2} + \frac{1}{4}\Vert\rho_1 - \rho_2\Vert_1$ for the single-shot success probability of the Helstrom measurement, and taking $s$ to count the majority outcome, Bayes' rule with equal priors gives

$$P_{w} = \frac{p^s(1-p)^{n-s}}{p^s(1-p)^{n-s} + (1-p)^sp^{n-s}}.$$
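Numerically, it is safer to evaluate this posterior in log space, since $p^s(1-p)^{n-s}$ underflows for large $n$; a small sketch (names mine):

```python
import numpy as np

def posterior_correct(p, n, s):
    # Posterior probability (equal priors) that the majority hypothesis is
    # the true state, given s of n outcomes in its favor.
    s = max(s, n - s)                       # let s count the majority outcome
    log_num = s * np.log(p) + (n - s) * np.log1p(-p)
    log_alt = (n - s) * np.log(p) + s * np.log1p(-p)
    return 1.0 / (1.0 + np.exp(log_alt - log_num))

print(posterior_correct(p=0.51, n=10_000, s=5_100))  # ~0.9997
```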

How many measurements?

By the law of large numbers, $s \approx pn$ with high probability for sufficiently large $n$. Substituting this and doing some algebra, you obtain

\begin{align} \frac{1 - P_{w}}{P_{w}} &= \left(\frac{(1-p)^pp^{(1-p)}}{p^p(1-p)^{(1-p)}}\right)^n\\ \log\frac{P_{l}}{P_{w}} &= n\log\left(\frac{q^pp^{q}}{p^pq^{q}}\right) = -n\left(p\log\frac{p}{q} + q\log\frac{q}{p}\right) = -n\,D(p\|q)\\ P_{w} &= \frac{1}{1+2^{-nD(p\|q)}} \end{align}

Here I have set $q = 1-p$, all logarithms are base $2$, and $D(p\|q) = p\log\frac{p}{q} + (1-p)\log\frac{1-p}{1-q}$ is the binary Kullback-Leibler divergence; because $q = 1-p$, it reduces to $p\log\frac{p}{q} + q\log\frac{q}{p}$ and is symmetric, $D(p\|q) = D(q\|p)$.
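Inverting the last line answers the question in the title: to reach a target confidence $P_w$ you need $n \approx \log_2\!\big(P_w/(1-P_w)\big)/D(p\|q)$ measurements. For $p = \frac{1}{2} + \varepsilon$ we have $D(p\|q) \approx 8\varepsilon^2/\ln 2$ to leading order, so an exponentially small bias $\varepsilon$ forces $n \sim 1/\varepsilon^2$, i.e. exponentially many single-copy measurements. A short sketch of this inversion (names mine):

```python
import numpy as np

def binary_kl_bits(p, q):
    # Binary Kullback-Leibler divergence D(p||q) in bits.
    return p * np.log2(p / q) + (1 - p) * np.log2((1 - p) / (1 - q))

def measurements_needed(p, target_pw):
    # Invert P_w = 1 / (1 + 2^(-n D(p||q))) with q = 1 - p:
    # n = log2(P_w / (1 - P_w)) / D(p||q).
    return np.log2(target_pw / (1 - target_pw)) / binary_kl_bits(p, 1 - p)

for p in [0.6, 0.51, 0.501]:
    print(f"p = {p}: n ~ {measurements_needed(p, 0.99):.0f}")
# Shrinking the bias above 1/2 by a factor k increases the required n by roughly k^2.
```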