DaftWulie's answer to Extending a square matrix to a unitary matrix says that extending a matrix into a unitary cannot be done unless there are constraints on the matrix. What are the constraints?
3 Answers
The necessary and sufficient condition is the following: given an $n\times n$ matrix $M$, you can construct a $2n\times 2n$ unitary matrix $U$ containing $M$ as a block if and only if the singular values of $M$ are all upper bounded by 1.
Sufficiency
To see this, express the singular value decomposition of $M$ as $$ M=RDV $$ where $D$ is diagonal and $R$, $V$ are unitary. Now define $$ U=\left(\begin{array}{cc} M & R\sqrt{\mathbb{I}-D^2}V \\ R\sqrt{\mathbb{I}-D^2}V & -M \end{array}\right), $$ which we can only do if the singular values are no larger than 1. Let's verify that it's unitary \begin{align*} UU^\dagger&=\left(\begin{array}{cc} RDV & R\sqrt{\mathbb{I}-D^2}V \\ R\sqrt{\mathbb{I}-D^2}V & -RDV \end{array}\right)\left(\begin{array}{cc} V^\dagger DR^\dagger & V^\dagger\sqrt{\mathbb{I}-D^2}R^\dagger \\ V^\dagger\sqrt{\mathbb{I}-D^2}R^\dagger & -V^\dagger DR^\dagger \end{array}\right) \\ &=\left(\begin{array}{cc} RD^2R^\dagger+R(\mathbb{I}-D^2)R^\dagger & 0 \\ 0 & RD^2R^\dagger+R(\mathbb{I}-D^2)R^\dagger \end{array}\right) \\ &=\mathbb{I}. \end{align*}
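As a concrete illustration, here is a minimal numpy sketch of this block construction; the helper name `dilate` and the random test matrix are just illustrative choices.

```python
import numpy as np

def dilate(M):
    """Embed M (all singular values <= 1) as the top-left block of a
    2n x 2n unitary, following the block construction above."""
    R, d, Vh = np.linalg.svd(M)                                  # M = R D V with D = diag(d)
    K = R @ np.diag(np.sqrt(np.clip(1 - d**2, 0, None))) @ Vh    # R sqrt(I - D^2) V
    return np.block([[M, K], [K, -M]])

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))
M = A / np.linalg.norm(A, 2)        # rescale so the largest singular value is 1
U = dilate(M)
print(np.allclose(U @ U.conj().T, np.eye(6)))    # True
```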
Necessity
Imagine I have a matrix $M$ with a singular value $\lambda>1$ and corresponding normalised right singular vector $|\lambda\rangle$. Assume I construct a unitary $$ U=\left(\begin{array}{cc} M & A \\ B & C \end{array}\right). $$ Let's apply $U$ to the state $\left(\begin{array}{c} |\lambda\rangle \\ 0 \end{array}\right)$. We get $$ U\left(\begin{array}{c} |\lambda\rangle \\ 0 \end{array}\right)=\left(\begin{array}{c} M|\lambda\rangle \\ B|\lambda\rangle \end{array}\right). $$ This output state has a norm at least as large as the norm of $M|\lambda\rangle$, which is $\lambda>1$. But if $U$ is a unitary, the norm must be 1. So such a construction must be impossible whenever there exists a singular value $\lambda>1$.
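A quick numerical illustration of this norm argument (the $2\times 2$ example matrix is an arbitrary choice):

```python
import numpy as np

# M has largest singular value 2, so no unitary extension can exist.
M = np.array([[2.0, 0.0],
              [0.0, 0.5]])
_, s, Vh = np.linalg.svd(M)
lam = Vh.conj().T[:, 0]             # |lambda>, the normalised top right singular vector
print(np.linalg.norm(M @ lam))      # 2.0: the first block of U(|lambda>, 0) already has norm > 1
```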
$\newcommand{\bs}[1]{\boldsymbol{#1}}$Here is a slightly different way to prove what the other excellent answer did.
Note that a matrix $U$ is unitary if and only if it sends orthonormal bases into orthonormal bases. This, in particular, means that if $U$ is unitary then $\|U\bs v\|=1$ for any $\bs v$ with $\|\bs v\|=1$.
Let us write the SVD of $M$ as $M\bs u_k=s_k\bs v_k$, where $s_k\ge0$ are the singular values of $M$.
Note that if $U$ is an extension of $M$, then $U\bs u_k=s_k \bs v_k+\bs w_k$ for some $\bs w_k$ orthogonal to $\bs v_k$ (and more generally to the whole range of $M$).
It follows that if, for any $k$, $s_k>1$, then $\|U\bs u_k\|>1$, and thus $U$ is not unitary.
On the other hand, if $s_k\le1$ for all $k$, let us show how one can always construct a unitary $U$ that contains $M$ as a submatrix. Let us denote with $\bs v\oplus \bs 0$ the vectors in the extended $2n$-dimensional space that are built by appending zeros to the $n$-dimensional vector $\bs v$, and with $\bs 0\oplus\bs v$ the vectors that are equal to $\bs v$ in the last $n$ dimensions and zero in the first $n$ ones. Since $\{\bs u_k\}_k$ is a basis for the original space, it follows that $\{\bs u_k\oplus \bs 0,\bs0\oplus\bs u_k\}_k$ is a basis for the extended space.
We will define $U$ through its action on the vectors $\bs u_k\oplus \bs 0$ and $\bs0\oplus \bs u_k$ as follows: \begin{align} U(\bs u_k\oplus \bs 0)&=s_k(\bs v_k\oplus\bs 0)+\sqrt{1-s_k^2}(\bs 0\oplus \bs v_k) \\ U(\bs0 \oplus \bs u_k)&=\sqrt{1-s_k^2}(\bs v_k\oplus\bs 0)-s_k(\bs 0\oplus \bs v_k). \end{align}
One can then check that all of these output vectors form an orthonormal system in the extended space, and thus $U$ is unitary. Moreover, the first $n$ components of $U(\bs u_k\oplus\bs 0)$ are $s_k\bs v_k=M\bs u_k$, so the top-left $n\times n$ block of $U$ is indeed $M$.
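For concreteness, here is a short numpy sketch of this construction: it builds $U$ by summing the outer products of each output vector with the corresponding input basis vector defined above. The function name and the test matrix are arbitrary choices.

```python
import numpy as np

def extend_by_action(M):
    """Build a 2n x 2n unitary with M as its top-left block by specifying its
    action on the basis vectors u_k ⊕ 0 and 0 ⊕ u_k, as in the equations above."""
    n = M.shape[0]
    R, s, Vh = np.linalg.svd(M)      # M u_k = s_k v_k, with u_k = columns of Vh†, v_k = columns of R
    U = np.zeros((2 * n, 2 * n), dtype=complex)
    zero = np.zeros(n)
    for k in range(n):
        u_k, v_k = Vh.conj().T[:, k], R[:, k]
        c_k = np.sqrt(max(0.0, 1.0 - s[k] ** 2))
        in1, in2 = np.concatenate([u_k, zero]), np.concatenate([zero, u_k])
        out1 = np.concatenate([s[k] * v_k, c_k * v_k])     # image of u_k ⊕ 0
        out2 = np.concatenate([c_k * v_k, -s[k] * v_k])    # image of 0 ⊕ u_k
        U += np.outer(out1, in1.conj()) + np.outer(out2, in2.conj())
    return U

M = np.diag([1.0, 0.5, 0.0])
U = extend_by_action(M)
print(np.allclose(U[:3, :3], M), np.allclose(U @ U.conj().T, np.eye(6)))   # True True
```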
Here's a solution that follows a slightly different idea, but provides the precise number of rows that need to be added to get a unitary matrix.
Let $M$ be an arbitrary $n\times m$ matrix. As a first observation, note that the task of adding rows/columns can be reduced to that of adding rows to get $m$ orthonormal columns. Indeed, once this is done, we get an isometry, which one can then turn into a unitary by completing the orthonormal columns to an orthonormal basis in the usual way (Gram-Schmidt, etc.).
We thus want to figure out when it is possible to add rows to $M$ to obtain orthonormal columns, and, when it is possible, exactly how many rows need to be added. Call $S$ the block of rows added to $M$. If we are adding $s$ rows, then $S$ is an $s\times m$ matrix. Note that the full matrix ($M$ with $S$ added) is an isometry iff $$M^\dagger M + S^\dagger S = I_{m\times m} \Longleftrightarrow S^\dagger S = I_{m\times m}-M^\dagger M.$$ Since $S^\dagger S$ is always positive semidefinite, and conversely any positive semidefinite matrix can be written in this form, such an $S$ exists iff $I_{m\times m}-M^\dagger M\ge0$, i.e. iff the singular values of $M$ are all in the range $[0,1]$. Furthermore, we can now also say that the singular values of $S$, call these $s_k$, must be $s_k=\sqrt{1-m_k^2}$, with $m_k$ the singular values of $M$.
In summary, we found that $S$ must be a matrix whose singular values are $\sqrt{1-m_k^2}$. More precisely, the multiplicities must also match: e.g. if $M$ has a two-fold degenerate singular value $0.5$, then $S$ needs to have a two-fold degenerate singular value $\sqrt{1-0.5^2}$.
This tells us that $S$ must have rank equal to that of $I-M^\dagger M$; in other words, the minimal number of rows of $S$ equals the number of eigenvalues of $M^\dagger M$ that are strictly smaller than $1$ (for a square $M$, the number of singular values strictly smaller than $1$). This is both necessary and sufficient, and once it holds, $S$ can be built by taking as its right singular vectors the eigenvectors of $I-M^\dagger M$ with nonzero eigenvalue, and as its left singular vectors an arbitrary orthonormal set in the relevant space.
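A minimal numpy sketch of this construction, assuming all singular values of $M$ are at most $1$ (the function name and the test matrix are arbitrary choices):

```python
import numpy as np

def minimal_row_completion(M, tol=1e-12):
    """Return S with the minimal number of rows such that stacking M on top
    of S gives orthonormal columns, i.e. S† S = I - M† M."""
    m = M.shape[1]
    G = np.eye(m) - M.conj().T @ M     # I - M† M, positive semidefinite
    w, V = np.linalg.eigh(G)           # eigenvalues 1 - m_k^2, eigenvectors = right singular vectors of S
    w = np.clip(w, 0.0, None)
    keep = w > tol                     # one row of S per nonzero eigenvalue
    return np.sqrt(w[keep])[:, None] * V[:, keep].conj().T

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))
M = 0.9 * A / np.linalg.norm(A, 2)     # all singular values < 1, so 3 rows are needed
S = minimal_row_completion(M)
W = np.vstack([M, S])
print(S.shape[0], np.allclose(W.conj().T @ W, np.eye(3)))   # 3 True
```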
As an easy sanity check, take an arbitrary set of orthonormal columns in whatever (sufficiently large) dimension you like, and remove $k$ rows from the corresponding matrix. Then $I-M^\dagger M$ generically has precisely $k$ nonvanishing eigenvalues, implying that precisely $k$ rows need to be added to $M$ to get back an isometry.
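The same sanity check in code, with arbitrarily chosen dimensions:

```python
import numpy as np

rng = np.random.default_rng(1)
N, m, k = 7, 4, 2
W, _ = np.linalg.qr(rng.normal(size=(N, m)))      # N x m matrix with orthonormal columns
M = W[:-k, :]                                     # remove the last k rows
w = np.linalg.eigvalsh(np.eye(m) - M.conj().T @ M)
print(np.sum(w > 1e-12))                          # 2: generically k nonvanishing eigenvalues,
                                                  # so k rows must be added back
```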