39

I'm wondering about current efforts to provide mathematical foundations and a more solid definition for quantum field theories. I am aware of such efforts in the context of the simpler topological or conformal field theories, and of older approaches such as algebraic QFT and the classic works of Wightman, Streater, etc. I am more interested in current approaches, in particular approaches that incorporate the modern understanding of the subject, based on the renormalization group. I know such approaches exist and have had occasion to hear interesting things about them; I'd be interested in a brief overview of what's out there, and perhaps some references.

Edit: Thanks for all the references and the answers, lots of food for thought! As a follow-up: it seems to me that much of this is concerned with formalizing perturbative QFT, which inherits its structure from the free theory, and with looking at various interesting patterns and structures that appear in perturbation theory. All of which is interesting, but in addition I am wondering about attempts to define QFT non-perturbatively, by formalizing the way physicists think about QFT (in which the RG is the basic object, rather than a technical tool). I appreciate this is a vague question; thanks everyone for the help.

Qmechanic
  • 220,844

6 Answers

15

There are a number of high-level mathematicians who are working on giving a more mathematically precise description of perturbative QFT and the renormalization procedure. For example, there is a recent paper by Borcherds http://arxiv.org/pdf/1008.0129, work of Connes and Kreimer on Hopf algebras, and the work of Bloch and Kreimer on mixed Hodge structures and renormalization http://www.math.uchicago.edu/~bloch/monodromy.pdf, just to name a few. To be honest, I am not mathematically sophisticated enough to judge what has been accomplished in these papers, but I think there are some problems in QFT which will probably involve some rather high-powered mathematics of the type being developed in them. For example, the current attempt to reformulate N=4 SYM in terms of Grassmannians apparently has some connection to rather deep mathematical objects called motives. Results on the degree of transcendentality which show up in perturbative N=4 SYM amplitudes also seem beyond what physicists really understand, and I believe the presence of transcendental objects (like $\zeta(3)$) in QFT amplitudes provides some of the motivation for the work of Bloch and Kreimer. I'm not an expert on this stuff, so perhaps someone else will chime in with a more complete explanation and additional references.

Edit: One more reference which is closer to the spirit of the original question is a book in progress by Costello on perturbative quantum field theory treated from the Wilsonian, effective field theory point of view. Notes are available online at http://www.math.northwestern.edu/~costello/renormalization

David Z
  • 77,804
pho
  • 4,690
9

The question here is how to organize a response. Names can be given, but I hope I can give some conceptual order. Hopefully other answers will find other ways to organize their response.

The Wightman axioms are classic. I take the approach here of organizing how other approaches sit relative to the Wightman axioms, even though they may not be axiomatic. A useful critique of the Wightman axioms can be found in Streater R F, Rep. Prog. Phys. 1975 38 771-846. More recent is the assessment of Fredenhagen, Rehren, and Seiler, “Quantum Field Theory: Where We Are”, http://arxiv.org/abs/hep-th/0603155, which I recommend. In my scheme here, however, the approach you particularly ask about, formalizations of the renormalization group, does not figure, because, as you say, it has a completely different starting point. I’d say that the starting point is perhaps the concept of Feynman integrals rather than the renormalization group in itself, but I’d also say that that’s a quibble.

There is a large question of what we hope to achieve by axiomatizing. (1) We can loosen the axioms, so that we have more models, some of which might be useful in Physics, but we have to figure out which on a case-by-case basis. This makes Engineering somewhat quixotic. (2) We can tighten the axioms, with the ambition that all the models are useful in Physics, but some Physically useful models might be ruled out. Mathematicians are often happy to work with axioms that a Physicist would consider too tight.

So, the Wightman axioms, more-or-less in Haag’s presentation in “Local Quantum Physics”:

  1. The state space
     (a) is a separable Hilbert space. There are people trying to use non-associative algebras, amongst other things.
     (b) supports a representation of the Poincaré group. There are people doing QG, QFT on CST, and many ways of breaking Lorentz symmetry at small scales.
     (c) contains a unique Poincaré-invariant state. Thermal sectors don’t satisfy this. Non-unique vacuums are an old favorite, but the vacuum state is pervasive in Particle Physics.
     (d) supports a generator of translations whose spectrum is confined to the closed forward light-cone. This is an elephant, IMO. The underlying reason for this is “stability”, which has no axiomatic formulation. The belief that the spectrum condition is necessary for stability may rely on classical thinking, particularly on the primacy of the Hamiltonian or Lagrangian. Feynman integrals for loops introduce negative frequencies, however, so there’s something of a case against it.

  2. The observables (which, implicitly, correspond in some way to statistics of experimental data)
     (a) are operator-valued distributions. People have introduced other generalized function spaces. Haag-Kastler tightens this to bounded operators, but the mapping from space-time regions to operators is looser. In Particle Physics, the S-matrix, which describes transitions between free-field states on time-like hyperplanes at t = ±infinity, has been the supreme observable for decades: trying to reconcile this with the Lorentz-invariant operator-valued distributions of the Wightman axioms pretty much killed the latter. Condensed matter physics, optics, etc., take correlation functions quite seriously, which seems to me to be at the heart of the split between Particle and other Physicists. Another elephant.
     (b) are Hermitian. There’s a complex structure. People have also introduced quaternions in various ways.
     (c) transform under the Poincaré group. This goes with 1(b).
     (d) are jointly measurable at space-like separation, but in general are not jointly measurable at time-like separation. Stepping away from the Poincaré group almost always results in violation of this axiom. Random fields, which are always jointly measurable operator-valued distributions, and the differences between them and QFT, are something I have published on.
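For reference, two of these conditions have compact standard formulations; the following notation is the usual Wightman-axiom one, not taken from the answer itself:

```latex
% Spectrum condition (1d): the joint spectrum of the translation
% generators P^\mu lies in the closed forward light-cone
\operatorname{spec}(P) \subseteq \overline{V}_{+}
  = \{\, p \in \mathbb{R}^{4} : p^{0} \ge 0,\; p^{\mu} p_{\mu} \ge 0 \,\}

% Microcausality (2d): smeared fields commute at space-like separation
[\phi(f), \phi(g)] = 0
  \quad \text{whenever } \operatorname{supp} f \text{ and } \operatorname{supp} g
  \text{ are space-like separated}
```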

The Haag-Kastler approach to some extent brings the states and observables into the single structure of von Neumann algebras, but essentially the distinction of linear operators and their duals remains. Refusing to split the world into states and observables, which we might call “holism”, makes Physics almost impossible. There’s always the question of exactly where the Heisenberg cut should be put, but pragmatically we just put it somewhere. Bell tries to square that circle while still doing Physics in his ‘Against “Measurement” ’, and Bohm as much as left Physics behind. There are people trying to do that kind of thing, but I find very little of it useful.

Returning to earth, there’s also the question of how we deform a system that we have managed to construct so that it’s significantly different in interesting ways. This isn’t in the axioms, but the standard has been to deform the Hamiltonian or the Lagrangian. Both methods, however, require a choice of one or two space-like hypersurfaces, which goes against the spirit of the Poincaré group. Algebraic deformations, the other known alternative (are there others?), have hardly left the ground because the constraints of positive energy, microcausality, and the primacy of the S-matrix have hitherto ruled them out (I have also published on this, based on Lie fields from the 1960s). If we deform the algebra of observables instead of the dynamics, the question arises of what "stability" might mean.

There is of course a question whether one ought to start from the Wightman axioms at all, but one has to choose somewhere. Then, with Lee Smolin, one has to set off into the valleys, hoping to find a bigger hill. Best wishes.

Peter Morgan
  • 10,016
8

You'll find a lot of information on the nLab, the open online wiki of a group of people working on n-categories. You should really click around and see what is there; there is a page about the "functorial" point of view on QFT, the formalization of the Schrödinger picture of QFT, including TQFTs.

There is also a page about the Heisenberg picture, a.k.a. axiomatic quantum field theory.

The wiki software, written by Jacques Distler, has a nice search function, so use it! You'll find there is a lot about (formalizing) string theory too, also about the work of Jacob Lurie et al. on TQFTs, a long page about CQFT, and references about recent work on the picture of perturbative QFT and renormalization groups from the AQFT point of view.

Well, since you asked specifically for the latter: the nLab has a direct page on it, together with a whole plethora of other resources.

There is also information about Connes' work on formalizing the standard model and unifying it with gravity by using noncommutative spaces.

Besides, if anyone does not find something that should be there, go to the nForum and tell the folks over there about it!

Edit: Explanation of "noncommutative spaces": When you take a real smooth manifold, such as a spacetime, the manifold is completely described by its algebra of smooth functions; in fact, the very definition of a manifold can be given this way. Every property of the manifold corresponds to a property of this function algebra, which is of course commutative. Connes' great idea of "noncommutative geometry" is to replace the commutative function algebra with a noncommutative operator algebra and see which geometric concepts can be transferred from the commutative to the noncommutative setting. Operator algebras ($C^*$-algebras, to be more precise) are then regarded as the noncommutative analog of the function algebra of a "noncommutative" space. Connes did a lot of work on the standard model and perturbative QFT using this idea, but unfortunately it is mathematically quite sophisticated. For a good introduction for physicists, see for example:

  • J. Madore: "An Introduction to Noncommutative Differential Geometry and its Physical Applications", 2nd edition, Cambridge University Press

This book also explains ideas of extending classical spacetime with noncommutative aspects.
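The commutative side of this correspondence is made precise by the Gelfand–Naimark theorem, a standard result not spelled out in the answer:

```latex
% Gelfand–Naimark duality: every commutative C*-algebra A is
% isometrically *-isomorphic to the continuous functions vanishing at
% infinity on its character space \widehat{A}, a locally compact
% Hausdorff space
A \;\cong\; C_{0}(\widehat{A}),
\qquad
\widehat{A} = \{\,\text{nonzero *-homomorphisms } A \to \mathbb{C}\,\}
```

Noncommutative geometry then treats an arbitrary (noncommutative) $C^*$-algebra as if it were $C_0$ of some "space", even though no underlying point set exists.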

Tim van Beek
  • 3,755
5

Here is my answer from a condensed matter physics point of view:

Quantum field theory is a theory that describes the critical point, and the neighborhood of the critical point, of a lattice model. (Lattice models do have a rigorous definition.)

So to rigorously define/classify quantum field theories is to classify all the possible critical points of lattice models, which is a very important and very hard project.

(One may replace "lattice model" in the above by "non-perturbatively regulated model")
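A concrete instance of this dictionary (a standard textbook example, not part of the answer): the 2d square-lattice Ising model has an exactly known critical point, and its continuum limit there is the free Majorana fermion CFT of central charge $c = 1/2$.

```latex
% 2d square-lattice Ising critical coupling, from Kramers–Wannier
% duality / Onsager's exact solution
\sinh\!\left(\frac{2J}{k_{B} T_{c}}\right) = 1
\quad\Longrightarrow\quad
k_{B} T_{c} = \frac{2J}{\ln\!\left(1+\sqrt{2}\right)}
```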

Xiao-Gang Wen
  • 13,725
4

I think if you want a rigorous approach to QFT that incorporates the ideas of renormalization and effective field theory, you might want to take a long look at the work of the 'constructive QFT' school. Their work on rigorous Euclidean functional integration is very much in the spirit of Wilson.

Two examples, chosen randomly among many:

1) Euclidean functional measures are usually constructed as cylinder measures on spaces of distributions. (These spaces of distributions arise as duals of nuclear spaces, which are vector spaces of functions topologized by families of norms quantifying how sharply concentrated a function is.) These measures are constructed as limits of cutoff measures, as the cutoff scale goes to zero, and the cutoff measures are constructed in such a way that they approximate a renormalization group flow. For example, the construction of continuum Yang-Mills theory on a torus is done by rather explicitly block-spinning lattice Yang-Mills theory. The same goes for 2d Maxwell-Higgs.

2) In order to be an honest measure, a cylinder measure must satisfy one additional property: it must be countably additive. In Glimm & Jaffe's work, this property is obtained as a consequence of a property called 'vanishing at infinity', which states, basically, that the measure is insensitive to the region of field space probed by test functions that are large and very sharply localized. This is almost exactly what one means by an 'effective' field theory.
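The simplest example of such a measure, which the constructive program perturbs around, is the free-field case; this is standard material (e.g. Glimm & Jaffe) rather than part of the answer:

```latex
% Free scalar field of mass m: Gaussian cylinder measure \mu on the
% distribution space \mathcal{S}'(\mathbb{R}^{d}), fixed by its
% characteristic functional; Minlos' theorem guarantees that it is
% countably additive
\int_{\mathcal{S}'(\mathbb{R}^{d})} e^{\,i\,\phi(f)}\, d\mu(\phi)
  = \exp\!\Big( -\tfrac{1}{2}\,\big\langle f, (-\Delta + m^{2})^{-1} f \big\rangle \Big),
\qquad f \in \mathcal{S}(\mathbb{R}^{d})
```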

user1504
  • 16,959
-1

I think the others have already given you the necessary references, so I return to your phrase: "... in addition I am wondering about attempts to define QFT non-perturbatively ...". I think the most we may hope for is partially taking some interactions into account; in that sense this part is non-perturbative. This may give us a better initial approximation (in/out states); the rest of the interactions can be taken into account perturbatively. I started such an activity, see here, but no renormalization group appears in my approach, sorry. Besides, it is not a formalization but a reformulation of QFT based on physical phenomena.

Please, don't kill me for that!