It is sometimes stated that any classical underpinnings (necessarily non-local) of a general quantum system are unrealistic or unphysical, because they require exponentially more information to store than the quantum system itself appears to. For example, to simulate $N$ qubits in general we need $2^N$ classical variables. In other words, the classical system doesn't scale well. Why is this considered a shortcoming of the classical description (beyond the obvious practical objection)?
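To make the scaling explicit (this is just the standard amplitude count, ignoring any cleverer encodings): a general pure state of $N$ qubits is
$$|\psi\rangle \;=\; \sum_{x\in\{0,1\}^N} c_x\,|x\rangle, \qquad c_x\in\mathbb{C},$$
which carries $2^N$ complex amplitudes, i.e. $2\cdot 2^N - 2$ real parameters once normalization and the global phase are removed.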
Can we not turn this question around as follows:
On the whole, a quantum many-body system (e.g. a Fock state, a quantum field, etc.) has a very large state space, say of dimension $N'$. Yet from its dynamics, only ${\mathrm O}\left(\log N' \right)$ classical units of information emerge. (Caveat: multiple classical configurations can exist in superposition; however, not all of these configurations give rise to coherent classical systems.) Is a quantum system therefore a very inefficient way to do classical mechanics?
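To spell out how I am counting here, taking a qubit register as the simplest example: if the system is built from $N$ qubits, then $N' = 2^N$, but a readout in any fixed basis returns only
$$N \;=\; \log_2 N'$$
classical bits, so the classical record that emerges is exponentially smaller than the state space that produced it.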
When we quantize a 3-bit system, say, we pretend that the system has 3 degrees of freedom. However, the quantum system lives in a $2^3$-dimensional Hilbert space, so to me it seems that the quantum system really has $2^3-1$ degrees of freedom. Why should we demand that a classical simulator require fewer variables?
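(My counting, in case I have it wrong: the pure states of the $2^3$-dimensional Hilbert space form the projective space $\mathbb{CP}^{2^3-1}$, which has $2^3-1 = 7$ complex, i.e. $14$ real, dimensions, compared with the 3 bits of the classical register.)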
Here is an example of this type of criticism, in an answer by Peter Shor. (My apologies to him if I am misrepresenting him by framing his comments in this way.)
It seems like a cheat to say that quantum computers use $N$ qubits (because that's what we see) when, as dynamical systems, they have $\mathrm{O}(2^N)$ degrees of freedom. Because classical physics was discovered first, it seems that we have gotten used to measuring the size of systems in terms of things like lengths and energies (eigenvalues of operators) rather than the dimensions of the vector spaces on which those operators act. Is this just our classical bias showing?
To be specific, I am not asking about complexity classes here. But if tomorrow somebody posted a paper on the arXiv with a simple, classical, deterministic model underlying ordinary quantum mechanics, one that scales exponentially in the size of the system it models, would it automatically be considered uninteresting?
tl;dr: Why are deterministic classical underpinnings of quantum systems considered flawed merely because they require exponential resources to reproduce what are, after all, exponentially many degrees of freedom?