12

Physicist Sabine Hossenfelder recently posted a video on YouTube about vacuum energy, the cosmological constant, and the "worst prediction" in physics. The "worst prediction" here refers to the enormous discrepancy, a factor of $10^{120}$, between the measured value of the cosmological constant and the vacuum energy density calculated from quantum field theory. This is known as the cosmological constant problem.

What's confusing is that she claims this isn't a problem at all, because "the cosmological constant has nothing to do with the vacuum energy from QFT". As I understand it, she says that the cosmological constant is simply a natural constant related to spacetime itself, just like, say, the gravitational constant. This seems to go against everything I've heard about this problem from other physicists.

To my understanding, it isn't wrong to interpret the cosmological constant as just a constant in the Einstein equations, but the hope was to explain the measured value using QFT. Moreover, if space really has such a huge vacuum energy density, then it should have been completely "ripped" apart by now. Not to mention that there are other interpretations of dark energy apart from a pure cosmological constant. What is really the deal with all of this?

Note: related videos are Are Dark Matter and Dark Energy Scientific?, Dark Energy might not exist after all. The video in the first link is Physicist Despairs over Vacuum Energy.

Quillo
  • 5,541
  • 1
  • 19
  • 46
User3141
  • 954

8 Answers

11

What's confusing is that she claims this isn't a problem at all, because the cosmological constant has nothing to do with the vacuum energy from QFT. She says the cosmological constant is simply a natural constant related to spacetime itself, just like, say, the gravitational constant.

Step back a bit and remember that we have no way to predict the values of fundamental constants. We don't even have a real basis for assuming they can be predicted or linked together in some way, rather than being just random values.

Now think about the cosmological constant and gravitational constant in that context.

So what Sabine Hossenfelder seems to be saying is simply that we don't know why the fundamental constants that drop out of the GR equations have the values they have, and it's not safe physics to try to link any of them to a vague prediction of QFT.

Remember that QFT and GR are not compatible as QFT uses SR.

The safest result to take from the QFT prediction is to say "Hey, those two things are not linked", because the values are so different. Maybe a better theory (a GRQFT, if such a thing exists) could make a better prediction that matches what we see, but we don't know that it will, and it may be that the cosmological constant is simply another fundamental constant whose value we cannot predict.

Her default position is that until you can prove they are linked, it's safer to assume they are not.

6

There is a nice explanation from John Baez on his page where he comes up with 5 possible values for the vacuum energy density:

  1. Very close to zero.
  2. Infinite.
  3. Enormous but finite.
  4. Zero.
  5. Not determined.

where 1. is based on experiment / observation and conservative assumptions about General Relativity; 2., 3. and 4. are based on naive theoretical calculations and 5. is the best that Quantum Field Theory can do right now$^*$.

The "based on naive calculations" of 3. is why S. Hossenfelder says that the factor of $10^{120}$ from QFT calculations is not a prediction.


$^*$Where "now" is 2011, when Baez wrote that explanation, but the situation is still the same as of now (2024).

2

Although we don't know the full details of quantum gravity yet, it will result in $4$-momentum-space integrals that ought to look like $\int_0^\Lambda f(k)d^4k+\int_{\Lambda}^\infty f(k)d^4k$, where we're only confident of the behaviour of $f(k)$ in the first term. In this context, $\Lambda$ is an energy cutoff scale, not the cosmological constant, which unfortunately has the same symbol. In natural units $c=\hbar=1$, each quantity just has a mass dimension, e.g. $1$ for $\Lambda$ as well as $k$. We get the cosmological constant's density contribution in the Friedmann equations from the case $f\propto1$.

In terms of the Planck mass $m_\text{Pl}$, the first term is$$\int_0^{\alpha m_\text{Pl}}d^4k\sim\int_0^{\alpha m_\text{Pl}}k^3dk\sim m_\text{Pl}^4=\rho_\text{Pl},$$where we don't know the dimensionless $\alpha$ yet, $\sim$ means "up to multiplicative factors I won't try to get right given our current limited understanding", and $=$ uses natural units (in SI, the Planck density $\rho_\text{Pl}$ is $\frac{m_\text{Pl}^4c^3}{\hbar^3}$).

The $\propto$, $\alpha$ and $\sim$s mean we don't have a firm "the Planck density results" conclusion, but it's not obvious that the factors they introduce should make more than a logarithmic difference. So at our current level of understanding, the result is morally on the Planck scale. Empirically it's nowhere near that, hence the current problem: why isn't it close to the Planck-scale answer? (Some have asked a related question: why is it close to a Hubble-scale answer, viz. $\rho_\Lambda\sim H_0^2/G$? Both are examples of hierarchy problems, in which the least satisfying answer is that the second integral is $-1+\epsilon$ times the first, with an empirically convenient but theoretically unexplained $\epsilon\ll1$.)
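As a numerical sanity check on the mismatch described above (my own rough sketch, not part of the answer): the snippet below compares the Planck density with the observed dark-energy density, using standard SI values for the constants and assuming a dark-energy fraction of $\sim0.7$ of the critical density.

```python
import math

# Physical constants in SI units (standard CODATA-level figures)
G    = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34        # reduced Planck constant, J s
c    = 2.998e8          # speed of light, m/s
H0   = 70e3 / 3.086e22  # Hubble constant ~70 km/s/Mpc, converted to 1/s

# Planck mass density: rho_Pl = m_Pl / l_Pl^3 = c^5 / (hbar * G^2)
rho_planck = c**5 / (hbar * G**2)

# Observed dark-energy mass density: ~0.7 of the critical density 3 H0^2 / (8 pi G)
rho_lambda = 0.7 * 3 * H0**2 / (8 * math.pi * G)

print(f"rho_Planck ~ {rho_planck:.1e} kg/m^3")
print(f"rho_Lambda ~ {rho_lambda:.1e} kg/m^3")
print(f"ratio ~ 10^{math.log10(rho_planck / rho_lambda):.0f}")
```

Running this gives a ratio of roughly $10^{122}$, the discrepancy that gets loosely rounded to "120 orders of magnitude"; the exact exponent depends on the cutoff and the multiplicative factors the $\sim$s above deliberately leave unspecified.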

J.G.
  • 25,615
0

Most of the article by Dr B is an evisceration of an editor's answer invoking 'negative pressure' (technically correct, hugely confusing to laypeople) to explain the acceleration of the universe.

This would seem to be a bugbear of Sean Carroll's as well: the acceleration can equally be explained by noting that the energy density of the vacuum is constant, so the universe expands at a fixed rate ($H$). This fixed expansion rate is not a speed; it sets a timescale. A constant expansion rate means accelerated motion away from us for individual objects.

By pointing out that we can't calculate the vacuum energy density in General Relativity, Dr B is basically saying, in a slightly click-bait fashion, that it's like trying to predict $G$.

That said, the 'worst prediction' (the CCP) is generally understood as saying that the number of degrees of freedom of dark energy in QFT is much too large to explain the observational data, which is because QFT isn't a finished theory of quantum gravity.

Mr Anderson
  • 1,494
0

Brane cosmology could solve this issue. Here, an effective de Sitter brane is included in an Anti-de Sitter space-time.

With an effective Friedmann equation from brane cosmology (see below) one can realize

  • a small effective cosmological constant within the effective Friedmann equation
  • a large cosmological constant as a solution of the covariant energy conservation

Standard Cosmology on the Anti-de Sitter boundary, Class. Quantum Grav., 2021

https://doi.org/10.1088/1361-6382/ac27ee

https://arxiv.org/pdf/2010.03391.pdf

0

The status is that the prediction must be wrong and that this is an unsolved problem of QFT.

ACB
  • 2,588
my2cts
  • 27,443
-1

I'm not sure exactly what point Sabine was trying to make or what her understanding of the cosmological constant is (I didn't watch the video), but one alternative point of view to the "cosmological constant as vacuum energy" picture is that the cosmological constant is a boundary condition. In other words, it's part of the definition of the model rather than an output of the model. (I first encountered this idea in a paper by Banks and Fischler called, appropriately, Why The Cosmological Constant is a Boundary Condition, but I think it's an older idea.) This seems to agree at least in spirit with what you present as Sabine's view, in that the CC is a parameter specifying different models of quantum gravity rather than something we calculate within the model.

The picture of "quantum fields on curved spacetime" is only an effective, semiclassical description of the underlying microscopic degrees of freedom. In quantum gravity, unlike in quantum field theory, long-distance physics and microscopic physics are not independent. From this point of view it's not necessarily surprising that trying to derive the long-distance physics of the cosmological constant from an effective, semiclassical picture goes badly wrong.

d_b
  • 8,818
-1

The 120 orders of magnitude discrepancy between the enormous predicted QFT value and the actual tiny value measured via the cosmological constant is due to the fact that the dark energy of free space is hidden from our time-like spacetime.

We see only the tip of the iceberg, in the form of dark energy's noise: the ZPF (zero-point field) energy. Beyond the ZPF vacuum energy, we have normal matter and light in our Universe.

Markoul11
  • 4,572