5

In their article on quantum money, Molina et al. (2012) use the fact that the proof of security of Wiesner's quantum money scheme depends on the density operator

$$Q = \frac{1}{4}\sum_{k \in \{0, 1, +, -\}}\left|kkk\right>\left<kkk\right|$$

has spectral norm $\|Q\|_{spectral}=3/8$.

The article presents this without proof, but I was able to show it by noticing that

$$\left|v_0 \right>=2 \left|000\right> + \sqrt{2}\left(\left|+++\right> + \left|---\right>\right)$$ $$\left|v_1 \right>=2 \left|111\right> + \sqrt{2}\left(\left|+++\right> - \left|---\right>\right)$$

are both eigenvectors of $Q$ with eigenvalue $3/8$. Since $Q$ is a density operator, its eigenvalues all lie in $[0, 1]$ and sum to one; the remaining six eigenvalues therefore sum to $1 - 2\cdot\frac{3}{8} = \frac{1}{4}$, so none of them can exceed $1/4 < 3/8$, and $3/8$ is indeed the maximum eigenvalue.
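(The eigenvector claim is easy to verify numerically; here is a quick sketch in Python/NumPy — the variable names are mine:)

```python
import numpy as np

# single-qubit states |0>, |1>, |+>, |->
s0 = np.array([1.0, 0.0])
s1 = np.array([0.0, 1.0])
sp = np.array([1.0, 1.0]) / np.sqrt(2)
sm = np.array([1.0, -1.0]) / np.sqrt(2)

def kkk(s):
    # |kkk> = |k> tensor |k> tensor |k>
    return np.kron(np.kron(s, s), s)

# Q = (1/4) * sum_k |kkk><kkk|
Q = sum(np.outer(kkk(s), kkk(s)) for s in (s0, s1, sp, sm)) / 4

v0 = 2 * kkk(s0) + np.sqrt(2) * (kkk(sp) + kkk(sm))
v1 = 2 * kkk(s1) + np.sqrt(2) * (kkk(sp) - kkk(sm))

print(np.allclose(Q @ v0, (3 / 8) * v0))  # True
print(np.allclose(Q @ v1, (3 / 8) * v1))  # True
```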

However, I only found these eigenvectors through a tedious numerical search, and since $\|Q\|_{spectral}=3/8$ was stated without proof I wonder: is there a more obvious way to show that this is true?

glS
  • 27,510
  • 7
  • 37
  • 125
Malper
  • 153
  • 4

2 Answers

4

The eigenvalues of a matrix are independent of the basis in which we represent it, and this remains true for bases that are not orthogonal. Consider then a matrix $A=\sum_k P_k$, where $P_k\equiv\lvert u_k\rangle\!\langle u_k\rvert$ and $\{|u_k\rangle\}$ is a set of normalized (not necessarily orthogonal) vectors. Observe that $A=UU^\dagger$, where $U\equiv \sum_k \lvert u_k\rangle\!\langle k|$, and furthermore that $$U^{-1}AU=(U^\dagger U)^{-1}U^\dagger AU=(U^\dagger U)^{-1}(U^\dagger U)^2=U^\dagger U,$$ where I used the formula for the pseudoinverse to write $U^{-1}=U^+=(U^\dagger U)^{-1}U^\dagger$.

I should point out that I'm abusing notation here, as $U$ might not be invertible. This is not a problem, however, because I'm only interested in the restriction of the inverse of $U$ to its range, which equals the domain of $A$; this is why I can safely read $U^{-1}$ as the pseudoinverse $U^+$. Moreover, $U^\dagger U$ is invertible, provided the vectors $|u_k\rangle$ are linearly independent.

You can actually simplify the above by observing that $BC$ and $CB$ have the same nonzero eigenvalues (and thus the same spectral radius) for any $B$ and $C$ for which both products are defined.

The gist of the above observations is that, whenever a matrix $A$ is a sum of unit-trace projections like the case at hand, one can replace computing the eigenvalues of $A$ with computing the eigenvalues of the matrix of overlaps of the component vectors. This can significantly simplify the calculations.
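As a numerical illustration of this general fact (a sketch in Python/NumPy with randomly chosen, generally non-orthogonal vectors of my own), the nonzero eigenvalues of $A=UU^\dagger$ coincide with the eigenvalues of the overlap (Gram) matrix $U^\dagger U$:

```python
import numpy as np

rng = np.random.default_rng(0)
# three normalized, generally non-orthogonal vectors in C^5
vs = rng.normal(size=(3, 5)) + 1j * rng.normal(size=(3, 5))
vs /= np.linalg.norm(vs, axis=1, keepdims=True)

A = sum(np.outer(v, v.conj()) for v in vs)  # A = U U^dagger (5x5, rank 3)
G = vs.conj() @ vs.T                        # Gram matrix U^dagger U (3x3 overlaps)

# the three nonzero eigenvalues of A equal the eigenvalues of G
top = np.sort(np.linalg.eigvalsh(A))[-3:]
print(np.allclose(top, np.sort(np.linalg.eigvalsh(G))))  # True
```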

In the case at hand, we use the basis vectors $\{|kkk\rangle\}_{k\in\{0,1,+,-\}}$, and thus $$U^\dagger U=\frac{1}{2\sqrt2}\begin{pmatrix}2\sqrt2 & 0 & 1 & 1 \\0 & 2\sqrt2 & 1 & -1 \\ 1 & 1 & 2\sqrt2 & 0\\ 1 & -1 & 0 & 2\sqrt2\end{pmatrix}.$$ Finding the largest eigenvalue of $U^\dagger U$ amounts to maximising the quantity $\langle v|U^\dagger U|v\rangle$ over all $|v\rangle$ with $\langle v|v\rangle=1$. This calculation can be simplified by observing that (1) $U^\dagger U$ is real symmetric, so we can stick to real vectors, and (2) by the same symmetry, we only need to take into account the elements above the diagonal and double their contribution. We thus easily find $$\langle v|U^\dagger U|v\rangle = 1 + \frac{2}{2\sqrt2}\left( v_1(v_3+v_4) + v_2 (v_3 - v_4) \right).$$

Maximising this quantity thus reduces to maximising $$C(\vec v)\equiv v_1(v_3+v_4) + v_2 (v_3 - v_4)$$ under the constraint $v_1^2+v_2^2+v_3^2+v_4^2=1$. From this it's not hard to guess that the maximum is achieved with $v_2=0$ and $v_3=v_4$, or with $v_1=0$ and $v_3=-v_4$. But if one wants to make sure of it, one can apply standard Lagrange multipliers, which in this case amounts to imposing the condition on the gradients $\nabla C(\vec v)=\lambda \nabla(\vec v\cdot\vec v)$, i.e. $$v_3+v_4= 2\lambda v_1, \qquad v_3-v_4= 2\lambda v_2, \\ v_1 + v_2 = 2\lambda v_3, \qquad v_1 - v_2 = 2\lambda v_4.$$ This system has solutions for $\lambda=1/\sqrt2$, with solution space $(\sqrt2 v_1,\sqrt2 v_2,v_1+v_2,v_1-v_2)$ (you don't actually need this for the largest eigenvalue, but you might verify that it matches your eigenvectors).

In conclusion, the largest eigenvalue of $U^\dagger U$ is $1+\frac{1}{\sqrt2}\times \frac{1}{\sqrt2}=3/2$. Dividing by $4$, you again find that the largest eigenvalue of $Q$ is $3/8$.
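This conclusion is easy to double-check numerically; the following Python/NumPy sketch (my own, not part of the original derivation) builds the overlap matrix above and confirms its spectrum:

```python
import numpy as np

r = 1 / (2 * np.sqrt(2))  # overlap <000|+++> = <000|---> = <111|+++> = 1/(2*sqrt(2))
# Gram matrix U^dagger U of |000>, |111>, |+++>, |--->
G = np.array([[1,  0, r,  r],
              [0,  1, r, -r],
              [r,  r, 1,  0],
              [r, -r, 0,  1]])

evals = np.linalg.eigvalsh(G)  # ascending: 1/2, 1/2, 3/2, 3/2
print(np.isclose(evals.max(), 1.5))      # True
print(np.isclose(evals.max() / 4, 3/8))  # True: dividing by 4 recovers ||Q||
```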

Of course, this whole last paragraph about finding the largest eigenvalue of $U^\dagger U$ is completely equivalent to the standard ways of diagonalising a Hermitian matrix. I just used this one because I enjoy reinventing wheels.

glS
  • 27,510
  • 7
  • 37
  • 125
3

Personally, I'd jump straight to Mathematica. It took me all of a minute:

(* |000><000| and |111><111| written out as explicit 8x8 matrices *)
zero = ({
    {1, 0, 0, 0, 0, 0, 0, 0},
    {0, 0, 0, 0, 0, 0, 0, 0},
    {0, 0, 0, 0, 0, 0, 0, 0},
    {0, 0, 0, 0, 0, 0, 0, 0},
    {0, 0, 0, 0, 0, 0, 0, 0},
    {0, 0, 0, 0, 0, 0, 0, 0},
    {0, 0, 0, 0, 0, 0, 0, 0},
    {0, 0, 0, 0, 0, 0, 0, 0}
   });
one = ({
    {0, 0, 0, 0, 0, 0, 0, 0},
    {0, 0, 0, 0, 0, 0, 0, 0},
    {0, 0, 0, 0, 0, 0, 0, 0},
    {0, 0, 0, 0, 0, 0, 0, 0},
    {0, 0, 0, 0, 0, 0, 0, 0},
    {0, 0, 0, 0, 0, 0, 0, 0},
    {0, 0, 0, 0, 0, 0, 0, 0},
    {0, 0, 0, 0, 0, 0, 0, 1}
   });
plus = Table[1, {8}, {8}]/8;  (* |+++><+++| *)
minus = KroneckerProduct[{{1, -1}, {-1, 1}}, {{1, -1}, {-1, 1}}, {{1, -1}, {-1, 1}}]/8;  (* |---><---| *)
Q = (zero + one + plus + minus)/4;
Eigenvalues[Q]

{3/8, 3/8, 1/8, 1/8, 0, 0, 0, 0}
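(If you don't have Mathematica to hand, the same one-minute check works in Python/NumPy; this port is mine, not part of the original answer:)

```python
import numpy as np

# single-qubit states |0>, |1>, |+>, |->
states = [np.array([1.0, 0.0]),
          np.array([0.0, 1.0]),
          np.array([1.0, 1.0]) / np.sqrt(2),
          np.array([1.0, -1.0]) / np.sqrt(2)]

def triple(v):
    # |kkk> = |k> tensor |k> tensor |k>
    return np.kron(np.kron(v, v), v)

Q = sum(np.outer(triple(v), triple(v)) for v in states) / 4
eigs = np.sort(np.linalg.eigvalsh(Q))[::-1]
print(eigs)  # 3/8, 3/8, 1/8, 1/8, 0, 0, 0, 0 (up to numerical noise)
```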

DaftWullie
  • 62,671
  • 4
  • 55
  • 140