
When I read on Wikipedia about the Big Bang (Inflation and Baryogenesis), I got the impression that in the beginning of the universe all matter and antimatter had been created exclusively as pairs:

Temperatures were so high that the random motions of particles were at relativistic speeds, and particle–antiparticle pairs of all kinds were being continuously created and destroyed in collisions.

Hence, the matter-antimatter asymmetry seems to be a logical problem, as the process of pair production always produces equal amounts of matter and antimatter. Baryogenesis then tries to explain why matter was favored over antimatter.

However, when I read about it on this CERN webpage, I got the impression that the matter-antimatter asymmetry is rather a statistical problem, as in the later paragraphs the article speaks about "particles decaying either as matter or antimatter":

Some unknown entity intervening in this process in the early universe could have caused these "oscillating" particles to decay as matter more often than they decayed as antimatter.

The article also makes the analogy with spinning a coin which has a 50-50 chance of landing on its head or its tail.

But if that's really the process that caused matter and antimatter to emerge from the early high-energy universe, i.e., a statistical process of decaying either as matter or antimatter particles, then I don't see the actual problem. Physics could treat matter and antimatter exactly equally, i.e., the analogy of a perfectly fair 50-50 coin, but since there's only a finite number of particles, the chance of ending up with exactly 50% matter and 50% antimatter is actually not that big. Consider spinning that 50-50 coin $2N$ times ($N\gg 1$); what's the probability that the coin will land exactly $N$ times on its head? This is given by the Binomial distribution $$ \binom{2N}{N}\frac{1}{2^{2N}} $$ which tends to zero for $N\rightarrow\infty$. So, if $N\gg 1$, it wouldn't be at all surprising that such a statistical process yields an outcome different from 50%-50%.
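A quick numerical check makes this explicit (the snippet below just evaluates the formula above, alongside the standard Stirling approximation $\binom{2N}{N}2^{-2N}\approx 1/\sqrt{\pi N}$):

```python
from math import comb, pi, sqrt

# P(exactly N heads in 2N fair flips) = C(2N, N) / 2^(2N) ~ 1/sqrt(pi*N)
for N in (10, 100, 1000, 10000):
    exact = comb(2 * N, N) / 4 ** N
    stirling = 1 / sqrt(pi * N)
    print(f"N={N:>5}: exact={exact:.4e}, ~1/sqrt(pi N)={stirling:.4e}")
```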

If that is how matter and antimatter were produced, it seems the anthropic principle can be applied, and the matter-antimatter asymmetry no longer looks like a real issue.

a_guest

1 Answer


The basics

This answer first addresses the title question:

Is the matter-antimatter asymmetry a logical or a statistical problem?

Then, it puts that question into a larger context that gives the answer fuller meaning and a deeper basis for seeing how scientists have come to these conclusions.

This is a logical issue up to extremely high energies

The Standard Model of Particle Physics conserves baryon number and lepton number on an interaction-by-interaction basis. Baryon number is the number of quarks minus the number of antiquarks, divided by three. Lepton number is the number of leptons minus the number of antileptons. Quarks and leptons are the only kinds of particles that are either matter or antimatter. The experimental evidence supporting these Standard Model conservation laws is explored in a footnote at the bottom of this answer.
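Written out, with $n_q$ the number of quarks, $n_{\bar q}$ the number of antiquarks, and likewise for leptons:

$$ B = \frac{n_q - n_{\bar{q}}}{3}, \qquad L = n_\ell - n_{\bar{\ell}} $$

So, for example, a proton ($uud$) has $B = 3/3 = 1$ and $L = 0$, while a positron has $B = 0$ and $L = -1$.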

Separate conservation of baryon number and lepton number locks in the matter-antimatter symmetry in a logical sense, and is subject to only one theoretical exception at extremely high energies, discussed below.

Conservation of baryon number and lepton number is a looser condition than pair production.

For example, a $W^-$ boson (which is neither matter nor antimatter and has B=0 and L=0) can decay to an electron (L=1) and an antineutrino (L=-1), which conserves lepton number but doesn't involve an identical matter-antimatter pair. You don't have to have electron-positron pair creation and destruction to conserve lepton number.

The exception is for sphaleron interactions, which don't separately conserve baryon number or lepton number, but do conserve the quantity B-L.

It is estimated that energies about 100 times those of the Large Hadron Collider (LHC) are necessary for sphaleron interactions to occur. The roughly 9 TeV energies (or less) necessary for a sphaleron interaction to take place may seem lower than, or similar to, those of the LHC. But the requirement that the energy be concentrated in a more compact space than the LHC can manage makes the effective energy requirement higher. A sphaleron requires roughly 9 TeV of energy to be concentrated at a density of about 1000 times the mass-energy density of a proton (i.e., within a radius of about $8.4 \times 10^{-17}$ meters). (Incidentally, this wouldn't produce a black hole, because the Schwarzschild radius of a 9 TeV sphaleron is about $2.4 \times 10^{-50}$ meters.) Creating a sphaleron would require a mass-energy density more than nine million times greater than that of a neutron star (roughly $10^{17}$ kg/m³). See, e.g., Papaefstathiou (2019) and the papers cited therein. See also, e.g., Koichi Funakubo, "Status of the Electroweak Baryogenesis" ("[W]e find that the sphaleron process is in chemical equilibrium at T between 100 GeV and $10^{12}$ GeV.")
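As a sanity check, a few lines of Python (constants rounded; this merely reproduces the arithmetic behind the figures quoted above, not the underlying physics) recover both the Schwarzschild radius and the density comparison:

```python
from math import pi

# Rough check of the quoted sphaleron figures (all constants rounded).
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8     # speed of light, m/s
eV = 1.602e-19  # joules per electron-volt

M = 9e12 * eV / c**2             # mass equivalent of 9 TeV, kg
r_s = 2 * G * M / c**2           # Schwarzschild radius
print(f"Schwarzschild radius of 9 TeV: {r_s:.1e} m")  # ~2.4e-50 m

r = 8.4e-17                      # quoted concentration radius, m
rho = M / (4 / 3 * pi * r**3)    # implied mass-energy density
print(f"implied density: {rho:.1e} kg/m^3")           # ~6e24 kg/m^3
print(f"vs. ~1e17 kg/m^3 neutron star: {rho / 1e17:.1e} times denser")
```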

It is even possible that some as yet unknown law of physics, which would be consistent with all scientific observations conducted to date, imposes a maximum mass-energy density below this threshold, in which case sphalerons would be impossible. See, e.g., Hujeirat (2021) (proposing the hypothesis that the local energy density in our universe at any given place has an upper limit of 2 to 3 times the nuclear density). If so, the logical problem would persist all of the way up to the Big Bang, which, in that case, could never be a true singularity.

So, it is a logical problem and not a statistical one until you reach the energies necessary for sphaleron interactions, which are estimated to be about 100 times those of the Large Hadron Collider (if sphaleron interactions are possible at all).

This is a statistical problem at extremely high energies

The conditions necessary to get baryon number and lepton number to zero at t=0 are called the Sakharov conditions after Andrei Sakharov who devised them in a 1967 paper (Yoshimura is also sometimes given credit for them). The three necessary "Sakharov conditions" are:

  1. Baryon number violation.

  2. C-symmetry and CP-symmetry violation.

  3. Interactions out of thermal equilibrium.

It is widely acknowledged (see also Babu (2013)) that those conditions are not present in sufficient magnitude to get baryon number to zero by t=0 with Standard Model physics.

the CP violation in the standard model is a small effect. In particular it is too small to create the observed matter-antimatter asymmetry of the universe, for which CP violation is an indispensable ingredient.

From Thomas Mannel "Theory and Phenomenology of CP Violation" (2006) (also noting that there are no experimentally measured deviations from Charge-Parity-Time (CPT) symmetry in non-gravitational physics).

Charge-Parity (CP) violation arises only from the CKM matrix (the sole source of CP violation for quarks in the Standard Model) and the PMNS matrix (the sole source of CP violation for leptons in the Standard Model), and can tweak the matter-antimatter balance in sphaleron interactions.

The Standard Model prediction regarding CP violation has been confirmed experimentally. As of 2024, all experimentally measured CP violation observed in Nature (apart from neutrino oscillation data, where the experimental uncertainties are too great to say anything more than that significant CP violation occurs in these oscillations, which may be possible to characterize with a single parameter of the PMNS matrix) is described by a single parameter, out of four in all, in the CKM matrix. These sources of CP violation are insufficient in magnitude to explain the baryon asymmetry of the universe, or any possible lepton matter-antimatter asymmetry in the universe.

CP violation in the Standard Model is a matter of probability that is a function of the relevant CP-violating phase (from either the CKM matrix, in the case of quarks, or the PMNS matrix, in the case of leptons), rather than a strict conservation law that is observed in every single interaction. The probability of one particle converting to another in the Standard Model is basically equal to the squared magnitude of the relevant CKM or PMNS matrix entry, some of which include a significant CP-violating component.
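To illustrate the squared-magnitude rule concretely, here is a minimal sketch using rounded, PDG-style magnitudes for the CKM entries (the complex CP-violating phases, which are the point of the discussion above, are omitted, so this shows only how transition probabilities arise):

```python
# Rounded magnitudes |V_ij| of CKM entries (illustrative values only;
# complex CP-violating phases are omitted in this sketch).
V = {
    ("u", "d"): 0.974, ("u", "s"): 0.225, ("u", "b"): 0.0037,
    ("c", "d"): 0.221, ("c", "s"): 0.975, ("c", "b"): 0.041,
    ("t", "d"): 0.0086, ("t", "s"): 0.040, ("t", "b"): 0.999,
}
# Transition probability between quark flavors ~ |V_ij|^2.
for (up, down), v in V.items():
    print(f"P({up} <-> {down}) ~ |V_{up}{down}|^2 = {v ** 2:.2e}")
```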

But the time between the Big Bang and the moment when the universe has cooled too much for Nature to produce sphaleron interactions naturally is short: on the order of one to ten seconds in a standard cosmological chronology of the universe. This is less time than it takes to read this answer. And this timing is generous.

Nothing in the predictions of Big Bang Nucleosynthesis implies that the baryon asymmetry was absent from the initial conditions of the universe. Indeed, J.-M. Frère, "Introduction to Baryo- and Leptogenesis" (2005) notes that:

based on nucleosynthesis (which occurs late in the history of the Universe and is therefore not too sensitive to the various scenarios – even if it can be affected by the number of neutrino species and the neutrino background) indicate a stricter, but compatible bound: $4 \times 10^{-10} < n_B/n_\gamma < 7 \times 10^{-10}$.

Any baryon number violating process must take place at T > 200 MeV (the QCD phase transition temperature); otherwise the success of nucleosynthesis will be spoiled. This temperature is about 400,000,000 times the Sun's surface temperature and is believed to correspond to a time one microsecond after the Big Bang in the conventional chronology of the universe. One microsecond is roughly the time it takes a muon to decay. BBN itself is assumed to take place 10 to 1000 seconds after the Big Bang. This temperature is in the ballpark of the highest temperatures arising at the Large Hadron Collider (a temperature scale at which the Standard Model continues to perform as expected in myriad experimental tests).
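For the temperature comparison, a back-of-the-envelope conversion (assuming the comparison is to the Sun's surface temperature of roughly 5,800 K) reproduces the factor quoted above:

```python
# Convert the 200 MeV QCD phase-transition scale to a temperature.
k_B = 8.617e-5            # Boltzmann constant, eV/K
T_qcd = 200e6 / k_B       # 200 MeV as a temperature, K
T_sun_surface = 5800      # assumed solar surface temperature, K
print(f"T(QCD transition) ~ {T_qcd:.1e} K")                    # ~2.3e12 K
print(f"ratio to solar surface: {T_qcd / T_sun_surface:.0e}")  # ~4e8
```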

Put another way, even advocates of a zero baryon number initial condition (a majority of theoretical physicists and cosmologists, notwithstanding the lack of empirical or observational evidence for it) pretty much agree, based upon observation, empirical evidence, well-established SM equations, and reasonable extrapolations beyond the Standard Model, that the baryon asymmetry of the universe had to be in place around one microsecond after the Big Bang.

The main reason we can't rule out baryon number violation prior to one microsecond after the Big Bang is that we have no way to observe it.

All known CP violating processes are stochastic. But they are not enough to get you to zero baryon number and lepton number from t = 10 seconds (or less, possibly as little as $10^{-6}$ seconds) back to t = 0 as you go backwards in time. So, at very high energies, the baryon asymmetry of the universe and the matter-antimatter asymmetry of the universe are statistical problems.

Is this even a true "problem"?

But for matter-antimatter asymmetry to truly be a "problem", you have to assume that the aggregate baryon number of the universe and the aggregate lepton number of the universe should each be zero at t=0 (in a Big Bang chronology).

This is, however, a purely speculative assumption.

All the observational evidence and the Standard Model equations point to a non-zero baryon number and a non-zero lepton number at t=0. You can only get to a zero baryon number and lepton number at t=0 with new physics at extremely high energies, for which there is no observational support except that it would seem "pretty" if they were zero at time zero.

Thus, calling matter-antimatter asymmetry a "problem" is honestly a bit presumptuous, because the status quo doesn't involve any violation of the laws of physics or any contradictory observations. It is only a "problem" to the extent that you have preconceptions about what initial conditions should be present at t=0, which could be contrary to what Nature actually does.

The "problem" simply may be that we have misguided assumptions about the initial conditions at the time of the Big Bang, rather than that there is actually any new physics that is needed to resolve the asymmetry. After all, we have no problem with another conserved quantities, the aggregate mass-energy of the universe, being non-zero at t=0.

The general issue of making unwarranted assumptions about how Nature should be and the extent to which doing so has not advanced physics over the last fifty years is explored by Sabine Hossenfelder in her 2018 book "Lost in Math: How Beauty Leads Physics Astray" (the German title, literally translated, is "The Ugly Universe.").

If you crave the beauty of B=0 and L=0 at t=0, however, one of the more elegant ways to achieve it, while creating non-zero B and L shortly after t=0, is to assume that the matter excess in our universe is matched by an antimatter excess in a mirror universe before t=0, one that is antimatter dominated and in which, due to entropy, time flows in the opposite direction, as suggested in Licata (2020) and Boyle (2018). In these models, matter-antimatter pairs are created that basically cross the t=0 boundary, in a manner somewhat analogous to Hawking radiation near the event horizon of a black hole. This, conveniently, puts all the "new physics" required by the proposal (i.e., the reversal of the arrow of entropic time in the antimatter mirror universe) just barely outside the observable universe.

Aggregate B-L considered

We also don't know the value of aggregate B-L right now.

We know that there is a baryon matter-antimatter asymmetry and an equal charged lepton matter-antimatter asymmetry. But the number of baryons and charged leptons in the universe is vastly outnumbered by the number of neutrinos in the universe, and we don't know the exact ratio of neutrinos to antineutrinos in the universe with any great precision, because that is very hard to measure.

The number of baryons in the universe is about $4 \times 10^{79}$, and the number of neutrinos in the universe is about $1.2 \times 10^{89}$. We know that the ratio of baryon antimatter to baryon matter (and the ratio of charged antileptons to charged leptons) is on the order of $10^{-11}$. And we know to considerable precision that there are 2 neutrons for every 14 protons in the universe (a confirmed prediction of Big Bang Nucleosynthesis), and that the number of charged leptons is almost identical to the number of protons in the universe. (Source). The number of exotic mesons and baryons other than neutrons and protons present in nature at any given time is negligible, since they are so short-lived and are generated only at high energies.

Antimatter also isn't hiding in some corner of the observable universe. The possibility that there is antimatter in the observable universe and we just haven't found it yet is strongly disfavored. See, e.g., Paolo S. Coppi, "How Do We Know Antimatter Is Absent?" (2004) (reviewing the evidence against spatial antimatter domains). So, baryonic matter-antimatter asymmetry is real.

Neutrinos are so much more common than quarks and charged leptons that even if dark matter particles at a very wide range of proposed masses (keV and up) were all antimatter, or all matter, for example, the matter-antimatter balance of neutrinos would still dwarf any matter-antimatter imbalances from other sources contributing to aggregate B-L. If B-L is zero, the numbers of neutrinos and antineutrinos should differ ever so slightly, deviating from a 50-50 balance by about one part per $10^{10}$.
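The order of magnitude of that deviation follows directly from the counts quoted above; a minimal sketch of the arithmetic:

```python
# Order-of-magnitude estimate using the counts quoted above.
n_baryons = 4e79       # net baryons in the observable universe
n_neutrinos = 1.2e89   # neutrinos plus antineutrinos

# If aggregate B - L = 0, the net lepton number carried by neutrinos
# must be of order the baryon number, so the neutrino/antineutrino
# imbalance relative to the total is roughly:
asymmetry = n_baryons / n_neutrinos
print(f"required relative asymmetry ~ {asymmetry:.1e}")  # a few parts in 1e10
```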

Nonetheless, matter-antimatter asymmetry is a major motivation for physicists to look for new physics beyond the Standard Model that violates baryon number and lepton number in a CP-violating way.

The impetus for new physics to "solve" this "problem" at high energies has diminished somewhat, however, since the discovery of the Higgs boson. Before then, many physicists expected that the Standard Model would break down at high energies, requiring new physics beyond the Standard Model anyway.

But the Higgs boson mass, and the associated beta function for it, imply that the SM maintains unitarity up to Big Bang energies. Nothing would cause the SM to break down mathematically if there were no new physics at all at any scale above what has been measured, and the universe is at least metastable (the Higgs boson and top quark masses haven't been measured precisely enough to determine whether the universe is stable or merely metastable if there are no laws of physics other than the Standard Model). See also, e.g., Koichi Funakubo, "Status of the Electroweak Baryogenesis" (noting that Higgs boson masses above 120 GeV are problematic for models creating the Baryon Asymmetry of the Universe from a starting point of zero, when the global average measured value as of 2019 was 125.10 ± 0.14 GeV).

Footnote on experimental evidence for B and L conservation

There has never been an observation of non-conservation of baryon number.

This has been tested in multiple processes, e.g. proton decay, flavor changing neutral currents, etc. The experimental bounds on proton decay and neutron oscillation are both very strict. "No baryon number violating processes have yet been observed." Lafferty (2006) citing S. Eidelman et al. (Particle Data Group), Phys. Lett. B592 (2004).

"Despite significant experimental effort, proton decay has never been observed. If it does decay via a positron, the proton's half-life is constrained to be at least 1.67 * 1034 years." Yet, the universe is roughly 1.4 * 109 years old. This experimental result has been a leading means by which GUT theories are ruled out.

Similarly, neutron-antineutron oscillation is not observed, but if baryon asymmetry involves this process, there "is an absolute upper limit on the $n\bar{n}$ oscillation time $\tau_{n\bar{n}}$ of $5 \times 10^{10}$ sec. irrespective of the B − L breaking scale, which follows from the fact that we must generate enough baryon asymmetry via this mechanism" (according to the linked 2013 paper). The limit on neutron-antineutron oscillation as of 2009 was $\tau_{n\bar{n}} \geq 10^{8}$ sec. See also the confirming experimental result here. In other words, neutron to antineutron oscillation happens less often than once every 3.17 years for any given neutron, even though the mean lifetime of a free neutron before it decays is only about 14 minutes and 37.75 seconds. See Hoogerheide (2021). Bound neutrons in stable nuclei, however, are stable.
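Converting those bounds into everyday units (a quick check of the figures just quoted):

```python
# Convert the oscillation-time bounds and neutron lifetime to familiar units.
SECONDS_PER_YEAR = 3.156e7
tau_lower = 1e8      # s, experimental lower bound on n-nbar oscillation time
tau_upper = 5e10     # s, theoretical upper bound from baryogenesis
print(f"lower bound: {tau_lower / SECONDS_PER_YEAR:.2f} years")  # ~3.17
print(f"upper bound: {tau_upper / SECONDS_PER_YEAR:.0f} years")  # ~1584
tau_neutron = 877.75  # s, mean lifetime of a free neutron
print(f"free neutron mean lifetime: {tau_neutron / 60:.2f} minutes")
```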

Flavor-changing neutral currents at the tree level have likewise not been observed, although the exclusions are less precise:

In the SM, flavor-changing neutral currents (FCNC) are forbidden at tree level and are strongly suppressed in loop corrections by the Glashow–Iliopoulos–Maiani (GIM) mechanism, with the SM branching fraction of t → qH predicted to be $O(10^{-15})$. Several extensions of the SM incorporate significantly enhanced FCNC behavior that can be directly probed at the CERN LHC.

In top quark decays, flavor changing neutral currents are excluded to a branching fraction of not more than about 0.47% (per the link above).

Likewise, there are no processes which have ever been observed which do not conserve lepton number (e.g. there is no observational evidence of neutrinoless double beta decay).

These bounds on lepton number violation are already very strict. The universe is roughly $1.4 \times 10^{10}$ years old, and the limit from GERDA (from 2015) means that no more than one in $3.8 \times 10^{16}$ of the nuclei that could have experienced neutrinoless double beta decay have actually done so since the formation of the universe.

No sphaleron interactions have ever been observed, although this isn't all that profound, because the Standard Model's own prediction is that, at the energies experiments have been able to reach, we shouldn't expect to have observed them yet.

ohwilleke