The exponential growth in the dimension of a many-body Hilbert space with increasing particle number is often presented as something that makes quantum physics very different from classical physics. But is this really true? Doesn't a classical stochastic system also have an exponentially large state space?
Consider $n$ (not necessarily statistically independent) copies of a system that can take on $d$ different values. If the systems are quantum mechanical, then the state space of the full system is the complex projective space $\mathrm{CP}^{d^n-1}$. This is the famous "exponentially large (in $n$) Hilbert space" of quantum physics. If we ignore the complex structure, then this is a manifold (without boundary) with $2(d^n-1)$ real dimensions.
But if the systems are classically stochastic (e.g. described by a local-hidden-variable theory), then the state space of the full system is a $(d^n-1)$-simplex. This is a manifold (with boundary) with $d^n-1$ real dimensions. This is also exponentially large in $n$; in fact, it's exactly half the dimension of the quantum-mechanical case.
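To make the dimension counting concrete, here is a minimal sketch that just tabulates the two formulas above (the choice $d=2$, i.e. qubits, is only illustrative). Both dimensions grow like $d^n$; they differ only by the factor of two.

```python
# Sketch: compare the real dimension of the quantum pure-state space
# CP^(d^n - 1), which is 2*(d^n - 1), with that of the classical
# probability simplex on d^n outcomes, which is d^n - 1.
# (Formulas taken from the text above; d = 2 is just an example.)

def quantum_dim(d: int, n: int) -> int:
    """Real dimension of CP^(d^n - 1)."""
    return 2 * (d**n - 1)

def classical_dim(d: int, n: int) -> int:
    """Real dimension of the (d^n - 1)-simplex."""
    return d**n - 1

d = 2  # qubits
for n in range(1, 11):
    print(f"n={n:2d}  quantum={quantum_dim(d, n):5d}  classical={classical_dim(d, n):5d}")
```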
Now, I'm well aware of the many ways that quantum systems are qualitatively different from classical stochastic systems: Bell's theorem, the Kochen-Specker theorem, the fact that BQP is probably strictly larger than BPP, and so on. But it seems to me that the fundamental difference between quantum and classical systems is not the exponentially large Hilbert space of quantum mechanics, but rather the fact that in quantum mechanics, amplitudes are complex rather than nonnegative real (and are converted into "real" probabilities via the Born rule), which allows for the possibility of destructive interference between amplitudes.
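As a toy illustration of that last point (the particular amplitude values are just chosen for illustration; nothing here is specific to any quantum algorithm): nonnegative probabilities for two ways of reaching an outcome can only add, whereas complex amplitudes add *before* the Born rule is applied and so can cancel exactly.

```python
from math import sqrt

# Two "paths" lead to the same outcome.  Classically, their nonnegative
# probabilities add; quantum mechanically, their complex amplitudes add
# and can interfere destructively.

p1, p2 = 0.5, 0.5                         # classical path probabilities
a1, a2 = 1 / sqrt(2), -1 / sqrt(2)        # quantum path amplitudes

classical_prob = p1 + p2                  # = 1.0: probabilities cannot cancel
quantum_prob = abs(a1 + a2) ** 2          # = 0.0: amplitudes cancel exactly

print(f"classical: {classical_prob}")
print(f"quantum:   {quantum_prob:.2f}")
```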
Here's one concrete example to make the point sharper: people often say that the power of quantum computers is that they can "use" the exponentially large Hilbert space to perform computations. But I would argue that a (classical) probabilistic Turing machine can also "use" an exponentially large state space to compute (although I know that some people will object that this is a very different sense of the word "use" than in the case of a quantum computer). And yet the class of problems BPP that can be efficiently solved by a probabilistic Turing machine is suspected to be no larger than the class of problems P that can be efficiently solved by a deterministic Turing machine. So it seems to me that, in the case of a probabilistic Turing machine, an exponentially large state space does not qualitatively improve the computer's performance.
Am I missing some mathematical distinction that makes the "exponentially large state space" unique to quantum physics? Or is it just that, for philosophical reasons, people consider the exponentially large Hilbert space of quantum physics to be "real" (in the colloquial rather than the mathematical sense of the word) but not the exponentially large simplex of stochastic classical physics? (I don't mean that question dismissively; that's certainly a reasonable philosophical position to hold.)