I'm studying the Dirac monopole in *Anomalies in Quantum Field Theory* by R. A. Bertlmann. In the derivation of the Dirac quantization condition, the author argues that transporting the wavefunction around the monopole results in a phase: $$\psi\to\exp\left[-i\frac{2eg}{\hbar c}2\pi\right]\psi,$$ where $e$ is the electric charge and $g$ is the magnetic charge. The condition that the wavefunction be single-valued then leads to the quantization of $eg$.
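If I'm reading the argument correctly, single-valuedness demands that this phase factor equal unity, which gives (in Gaussian units) $$\exp\left[-i\frac{2eg}{\hbar c}2\pi\right]=1 \quad\Longrightarrow\quad \frac{2eg}{\hbar c}=n,\quad n\in\mathbb{Z},$$ i.e. the Dirac condition $eg = n\hbar c/2$.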
However, I'm puzzled: why must the wavefunction itself be single-valued? If we compute a physical quantity such as the probability density $|\psi|^2$, the global phase cancels out and gives the same result. So why is this single-valuedness condition necessary, especially when the overall phase seems to have no observable consequence?
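Concretely, for any constant phase $\alpha$, $$\left|e^{i\alpha}\psi\right|^2 = e^{-i\alpha}e^{i\alpha}\,|\psi|^2 = |\psi|^2,$$ so a global phase drops out of the probability density entirely, which is what makes the single-valuedness requirement look redundant to me.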