
Firstly, I am the lead author and developer of unitaryfund/qrack, the open-source quantum computer simulator. The library has had a method, QInterface::TryDecompose(), since about 2018, whereby Qrack can check whether a general pure quantum state $|\psi\rangle$ decomposes as $|\psi_A\rangle\otimes|\psi_B\rangle$. It does this by computing a quantitative "entanglement witness" metric and checking whether the "amount of entanglement" across a requested bipartite boundary is less than a tolerance parameter on a normalized $0$ to $1$ scale. The "amount of entanglement" is defined as the round-trip fidelity loss from applying QInterface::Decompose() followed by QInterface::Compose(), compared against a copy of the original $|\psi\rangle$. In other words, for a tolerance $\epsilon$, the substates are deemed separable if $\epsilon > 1 - \left|\langle\psi|\left(|\psi_A\rangle\otimes|\psi_B\rangle\right)\right|^2$. Compose() is simply a Kronecker product (or tensor product) on separable pure states, and fidelity is checked via the usual inner-product rules of Dirac notation and quantum mechanics in general.
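For concreteness, here is a minimal sketch of that acceptance criterion (not Qrack's actual implementation; the flat state-vector layout, helper names, and placement of subsystem B in the low-order index are all assumptions for illustration):

```cpp
#include <cmath>
#include <complex>
#include <iostream>
#include <vector>

using Amp = std::complex<double>;

// Kronecker (tensor) product of two pure-state vectors, with subsystem B
// as the low-order (fastest-varying) index. (Assumed layout.)
std::vector<Amp> kron(const std::vector<Amp>& a, const std::vector<Amp>& b) {
    std::vector<Amp> out(a.size() * b.size());
    for (size_t i = 0; i < a.size(); ++i)
        for (size_t j = 0; j < b.size(); ++j)
            out[i * b.size() + j] = a[i] * b[j];
    return out;
}

// Round-trip fidelity loss: 1 - |<psi|(psi_A (x) psi_B)>|^2
double entanglementWitness(const std::vector<Amp>& psi,
                           const std::vector<Amp>& psiA,
                           const std::vector<Amp>& psiB) {
    const std::vector<Amp> product = kron(psiA, psiB);
    Amp overlap = 0.0;
    for (size_t i = 0; i < psi.size(); ++i)
        overlap += std::conj(psi[i]) * product[i];
    return 1.0 - std::norm(overlap); // std::norm(z) is |z|^2
}

int main() {
    // |psi> = |+>|0>, exactly separable across the single-qubit boundary:
    const double r = 1.0 / std::sqrt(2.0);
    std::vector<Amp> psi{r, 0.0, r, 0.0};
    std::vector<Amp> psiA{r, r}, psiB{1.0, 0.0};
    const double epsilon = 1e-10;
    std::cout << (entanglementWitness(psi, psiA, psiB) < epsilon) << std::endl;
    return 0;
}
```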

"Decompose()" will fully break entanglement in general in a manner that appears to be unitary, as total probability is conserved when comparing the original state to each subsystem. The algorithm does this through a method of (apparently) tracing out subsytem probability and phase as they would be encoded on the reduced density matrix, observing (as in the Decompose() implementation) that the phases of all amplitudes contributing to the average of a single reduced subsystem phase dimension would be identical (up to arbitrary non-observable overall phase factor) under the assumption that the state is separable (i.e., the assumption that $|\psi\rangle = |\psi_A\rangle\otimes|\psi_B\rangle$ has an exact solution in the ideal).

I am well aware of Schmidt decomposition and other singular value decomposition (SVD) based techniques. I am less familiar with semidefinite programming and Doherty-Parrilo-Spedalieri (DPS), for example. The DPS paper I link quotes $\exp(\mathrm{poly}(n)) \cdot \mathrm{poly}\log(1/\epsilon)$ complexity for finding the overall maximal decomposition of a density matrix, rather than just checking a bipartite boundary of a pure state, but I do not have a perfectly clear, quantitative idea of how these separability tests in different domains relate.

On tests with nonseparable and separable boundaries of subsystems of GHZ states, TryDecompose() can operate at a tighter "entanglement witness" tolerance than Schmidt decomposition's quantitative tolerance for singular values in the C++ Eigen3 library (as in this example), while providing empirically better execution-time and memory scaling that might constitute a superpolynomial advantage. Analytically, since TryDecompose() minimizes redundant work, its cost can be bounded in general at less than 3 times the total memory of the original state vector and fewer than 5 full traversals of the original state vector's dimension count. The baseline I am comparing against is sketched below.
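For reference, the Schmidt-decomposition baseline looks roughly like this (a sketch using Eigen3's JacobiSVD, not the exact benchmark code; the function name and layout are assumptions): reshape the state vector into a $d_A \times d_B$ matrix and count singular values above the tolerance; the boundary is separable exactly when the Schmidt rank is 1.

```cpp
#include <Eigen/Dense>
#include <complex>

// Separability check across a bipartition via Schmidt rank: reshape the
// state vector into a (dimA x dimB) matrix, compute its singular values,
// and count those above tolerance. Separable iff the Schmidt rank is 1.
bool isSeparableSchmidt(const Eigen::VectorXcd& psi, Eigen::Index dimB,
                        double tolerance) {
    const Eigen::Index dimA = psi.size() / dimB;
    // Row-major reshape: entry (i, j) is the amplitude of |i>_A |j>_B.
    Eigen::MatrixXcd m(dimA, dimB);
    for (Eigen::Index i = 0; i < dimA; ++i)
        for (Eigen::Index j = 0; j < dimB; ++j)
            m(i, j) = psi(i * dimB + j);

    // Singular values only (no U or V factors requested).
    Eigen::JacobiSVD<Eigen::MatrixXcd> svd(m);
    int rank = 0;
    for (Eigen::Index k = 0; k < svd.singularValues().size(); ++k)
        if (svd.singularValues()(k) > tolerance) ++rank;
    return rank == 1;
}
```

Note that the SVD of a $d_A \times d_B$ matrix costs on the order of $d_A d_B \min(d_A, d_B)$ arithmetic operations, a factor of the smaller subsystem's dimension beyond a linear pass over the state vector, which is at least consistent with the empirical scaling gap described above.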

Sorry that I'm effectively self-promoting my own implementation as a research question on Quantum Computing Stack Exchange, but I genuinely appreciate the slightly-less-formal input of the peer community before (finally) publishing on this specific algorithm in Qrack. (We published on Qrack in general at IEEE QCE'23.) Is this algorithm in Qrack novel? Is it Pareto-optimal? (I'm honestly already eking past it with "quantum binary decision diagrams" for specific respective cases in which each is better, but that topic truly requires at least a separate question.)

Dan Strano
