
Are there any serious reformulations of a quantum field theory of general relativity within a nonstandard analysis framework? Is it possible that such reformulations (which are possible, in principle) would yield any predictive models related to some major problems in physics (like the black hole singularity problem, or quantum gravity)?

I emphasize that this is a question of principle; an elaborate answer cannot be given here. Nonstandard analysis is non-constructive, and model-theoretic nonstandard analysis is based on superstructures (pushing the level of abstraction one step up). A similar phenomenon can be observed in the emergence of the mathematical framework of quantum mechanics, with the transition to operator calculus, which was neither common nor necessary in classical mechanics. So, basically: could going one level of abstraction higher solve major problems in physics?

Related questions found on StackExchange:

Q1 - Non-Standard Analysis Solution to Differential Equations

Q2 - Are there (introductory) differential equations books from the infinitesimal perspective?

Q3 - Renormalization and Conway/Surreal Numbers

Q4 - Can the problem of unification of Quantum Mechanics and General Relativity be converted into a pure Mathematical problem?


1 Answer


Non-standard analysis (NSA) is more a way to reformulate the foundations of mathematics than a way to reformulate the applied-mathematical theories constructed on those foundations, which is what (the theoretical canon of) physics is. It deals with formulating tools like the derivative, the integral, and limits in an alternative fashion to the usual $\epsilon$-$\delta$ formalism often seen in books, on the idea that the latter is not such an intuitive formalism, and that it would be better to use an explicit notion of an infinitesimal number, so that, say, $dx$ becomes an actual "infinitely small increment" (though even here, it doesn't quite work out perfectly). It doesn't change the mathematical properties of these tools in any way; it just gives a different way of defining them.
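For concreteness, here is how the derivative looks in the usual NSA formulation (a standard Robinson/Keisler-style definition, not something specific to this answer), using the standard-part function $\operatorname{st}$, which rounds a finite hyperreal to the unique real number infinitely close to it:

$$f'(x) \;=\; \operatorname{st}\!\left(\frac{f(x+\varepsilon)-f(x)}{\varepsilon}\right), \qquad \varepsilon \neq 0 \text{ infinitesimal},$$

provided the right-hand side exists and takes the same value for every nonzero infinitesimal $\varepsilon$. This is also where the parenthetical caveat above lives: $dx$ alone doesn't suffice; one still needs $\operatorname{st}$ to discard the leftover infinitesimal part.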

The chief purpose of these formalisms is in making mathematical proofs. Theories of physics are applications of differentials, integrals, limits, and other such things, and physicists are not typically concerned with proofs. The extent to which you will find $\epsilon$-$\delta$ proofs in a physics paper is thus, also, roughly (with due mindfulness of the possibility of being surprised!) the extent to which you could expect NSA to be useful in physics, i.e. outside of studying it from a mathematician's point of view as a mathematical construct. Most physicists care so little about what $dx$ "really" is (perhaps too little, in my opinion) that they use it in ways that are "said to make a mathematician cringe". For my own part, having a fair interest in and having studied both perspectives, I find each to have advantages and disadvantages, and I have no problem using either when suitable.
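As a familiar instance of that cringe-inducing usage (a standard calculation, included here purely as an illustration), consider solving $\frac{dy}{dx} = ky$ by treating $dx$ as a free-standing quantity and "multiplying through":

$$\frac{dy}{y} = k\,dx \quad\Longrightarrow\quad \int\frac{dy}{y} = \int k\,dx \quad\Longrightarrow\quad \ln|y| = kx + C \quad\Longrightarrow\quad y = A e^{kx}.$$

It gets the right answer, and both $\epsilon$-$\delta$ analysis and NSA can justify the manipulation after the fact; they just do so with different machinery.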

That said, and referring to my Math.SE answer here:

https://math.stackexchange.com/questions/3241309/what-is-the-intuition-behind-uniform-continuity/3241346#3241346

I'd say that actually the "standard" $\epsilon$-$\delta$ limit formalism is not bad at all, nor unintuitive. It's just that, as with many things in these areas, the way it's commonly presented doesn't do it the justice it deserves and serves to obscure it. $\epsilon$ and $\delta$ are approximation tolerances, the former on the dependent variable and the latter on the independent variable, just as in empirical measurements and in the use of calculators with finite numbers of digits. With proper exploitation of those intuitions, the $\epsilon$-$\delta$ formalism should be easily understandable to anyone who has mastered such tools; in fact, I'd even say it is perhaps closer in spirit to scientific praxis than NSA is (a view that, again, may or may not be an advantage when dealing with any given formalism).
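To make the tolerance reading concrete, take the standard exercise of showing $\lim_{x\to 2} x^2 = 4$. Demanding an output accuracy $\epsilon$ on $x^2$, an input accuracy $\delta$ on $x$ suffices:

$$|x - 2| < \delta = \min\!\left(1, \frac{\epsilon}{5}\right) \quad\Longrightarrow\quad |x^2 - 4| = |x - 2|\,|x + 2| < 5\,|x - 2| < \epsilon,$$

since $|x - 2| < 1$ forces $|x + 2| < 5$. Read as a measurement statement: if you need $x^2$ to within $\epsilon$, it is enough (near $x = 2$) to know $x$ to within $\epsilon/5$.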

ADD: To heed the comments, I would say that mathematical proofs aren't irrelevant to physics, just perhaps to most practitioners. Being able to prove theorems more easily, or to gain insight into how to prove a long-standing conjecture by viewing your mathematical tools differently, can certainly help in practice on the purely theoretical side of things, and may then lead to conceptual breakthroughs and reformulations. My answer, though, is really trying to point out that using NSA does not by itself result, as the titular question suggests, in a "reformulation of GR and QFT", any more than recompiling a computer program for a different processor architecture rewrites its source code or improves the algorithms used. The lower-level meaning of the terms changes, perhaps dramatically, but the high-level construct does not see a thing.