
Can anyone summarize calculations that have been done about the theoretical probability of a detectable black hole collision happening in the observable universe within the time that LIGO has been operating?

I mean, given what we know about processes and parameters in the universe - about densities of black holes, rates of stellar explosions, etc. - how often would we expect an observable event?

This question is looking for an analysis independent of observations of actual GW events. It's the scientific "what does theory predict" question.

If we just think for a moment about black hole mergers (as opposed to other GW events), the sorts of things that obviously need to be taken into account are "what is the density of black holes throughout the lifetime of the universe?", to determine the chance that they merge. This is a complicated question in itself, because the relevant density comes from earlier and earlier in the universe's life the further away the event is (obviously?).

But interestingly, there's also the problem of the planar nature of GW wave fronts. What is the chance that Earth happens to be in the plane of the GW event at the time? It seems to me that this aspect alone must dramatically reduce the number of probable observations...

It appears to be an obvious question, but googling the exact question, and creative variants I could think of, does not uncover anyone else asking it or addressing it. Similarly, ligo.org does not appear to have any material talking about this topic.

The consideration is alluded to in this question and answer, but not given any direct elaboration that I could find.

Interestingly, the answers to this similar question are massively different. One says "possibly daily" and the other says "should see one by 2020". Neither offers any justification.

GreenAsJade

3 Answers


The simple reasoning in Mikael's answer makes sense. LIGO detected 1 strong and 1 weak event in 16 days of coincident data (1 calendar month of taking data at 50% duty cycle), so 1 event/month should be the right order of magnitude.

Note that this is not the rate of all GW events in the universe, but only the observable ones with the current sensitivity of LIGO. The observable distance scales with the sensitivity, and the observable rate with the observable volume, which goes with the cube of the distance. This means improving the sensitivity by a factor of 2 will increase the detectable rate by a factor of 8. Current LIGO should hopefully increase its sensitivity by a factor of 3 through fine-tuning of the instrument over the next few years, so this should become several events per week. Future new facilities might increase this by another significant factor, which is exciting!
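As a rough numerical sketch of that scaling (the 1 event/month baseline and the factor of 3 are just the figures quoted above, not official projections):

```python
# Sketch: the detection rate scales with the accessible volume,
# i.e. with the cube of the detector's range (sensitivity).
base_rate_per_month = 1.0   # ~1 event/month from the first data, order of magnitude
sensitivity_factor = 3.0    # hoped-for improvement from fine-tuning the instrument

scaled_rate_per_month = base_rate_per_month * sensitivity_factor**3
print(f"~{scaled_rate_per_month:.0f} events/month, "
      f"i.e. ~{scaled_rate_per_month / 4.3:.0f} per week")
# -> ~27 events/month, several per week
```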

You can find an old (2010) estimate of the detectable rate based on astrophysical models here. Due to a large uncertainty in the models, there is about 2 orders of magnitude between pessimistic and optimistic rates. An updated rate given the detection of the first 1 or 2 events is here. I believe this is slightly better than the predicted rate for binary black holes, but we still have to observe the first binary neutron stars. Doing statistics with only 1 or 2 events obviously gives large uncertainties, but should get you into the right order of magnitude. This should improve a lot after the first ~10 events have been detected.
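To get a feeling for how large those uncertainties are, here is a small illustration of my own (not taken from the linked papers): the exact Poisson confidence interval on a rate inferred from only 1 or 2 events in 16 days of coincident data.

```python
# Exact (Garwood) Poisson confidence interval on an event rate,
# given n observed events in an exposure of t days.
from scipy.stats import chi2

def poisson_rate_ci(n, t, conf=0.90):
    alpha = 1.0 - conf
    lo = 0.0 if n == 0 else chi2.ppf(alpha / 2, 2 * n) / 2
    hi = chi2.ppf(1 - alpha / 2, 2 * (n + 1)) / 2
    return lo / t, hi / t   # events per day

for n in (1, 2):
    lo, hi = poisson_rate_ci(n, t=16.0)
    print(f"n={n}: 90% CI ~ {lo*30:.1f} to {hi*30:.1f} events/month")
```

Even at 90% confidence the inferred rate spans roughly two orders of magnitude, which is why the first ~10 detections tighten things up so much.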


This will be a rule of thumb answer so take it with a grain of salt.

If it took LIGO a time $T$ to find its first signal from a black hole merger, then the next signal will most likely come roughly a time $T$ after the first.

This calculation assumes that the signals are uncorrelated (likely) and Poisson distributed, as they are discrete events with no memory of other events.
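A minimal numerical check of that reasoning (purely illustrative, with $T$ in arbitrary units): for a Poisson process the waiting times between events are exponentially distributed with mean $T$, so the expected wait to the next detection is indeed of order $T$, albeit with a large spread.

```python
import numpy as np

# Waiting times of a Poisson process are exponential with mean T.
T = 1.0                                   # time to the first detection (arbitrary units)
rng = np.random.default_rng(0)
waits = rng.exponential(scale=T, size=100_000)

print(f"mean wait   = {waits.mean():.2f} T")      # ~1.0 T
print(f"median wait = {np.median(waits):.2f} T")  # ~0.69 T (ln 2)
print(f"P(wait < T) = {(waits < T).mean():.2f}")  # ~0.63
```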


There are a huge pile of factors here. You could split them up into the intrinsic properties of the sources, the geometry of the situation and the sensitivity of the detector.

The first of these involves making an estimate of the density of potential GW sources as a function of their masses, separation and distance from us. This requires models for the formation of black holes and black hole binaries as a function of mass. This in turn involves assumptions and models about the birth rates of massive stars, their binarity and their mass-loss rates. You also need to assume something about the density of star forming galaxies.

Prior to the GW announcement, the rate of massive BH mergers was expected to be from 0.1 to around 1000 Gpc$^{-3}$ yr$^{-1}$ (Abadie et al. 2010).

This rate can be used to estimate a detection rate. This is extensively discussed in the LIGO discovery papers (e.g. Abbott et al. 2016a) and takes into account the assumed random direction and orbital inclination of the events, the intrinsic strain sensitivity of the instrument and a cosmological model to relate distance to co-moving volume. This results in an effective detection volume that can be multiplied by a theoretical rate to get an estimated detection rate. Looking solely at the geometry issues you mentioned: it appears that the random direction and orientation of a binary mean that the effective volume in which a detection might occur is reduced by about an order of magnitude compared with the volume implied by the limiting distance to which LIGO is sensitive when the geometry is optimal (i.e. face-on and overhead). Indeed, Abadie et al. (2010) suggest the geometric factor is $2.26^{3} = 11.5$.
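A back-of-the-envelope sketch of that geometric dilution (only the 2.26 factor comes from Abadie et al. 2010; the horizon distance below is a made-up placeholder):

```python
import math

# Averaging over sky position and binary orientation reduces the effective
# range by a factor ~2.26 relative to an optimally located, face-on source.
geometric_factor = 2.26
volume_reduction = geometric_factor**3
print(f"volume reduction ~ {volume_reduction:.1f}")     # ~11.5

# Hypothetical horizon distance of 0.6 Gpc for an optimally oriented source
d_horizon_gpc = 0.6
v_effective = (4.0 / 3.0) * math.pi * d_horizon_gpc**3 / volume_reduction
print(f"effective volume ~ {v_effective:.2f} Gpc^3")
```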

The effective volume depends on the black hole masses, because the GW strain depends strongly on mass for a given distance. It is around 0.1 Gpc$^3$ for merging 10 solar mass black holes and about 1.5 Gpc$^3$ for 30 solar mass black holes (see Fig.4, right panel of Abbott et al. 2016a).

If we assume the 0.1-1000 events Gpc$^{-3}$ yr$^{-1}$ figure referred to 10 solar mass BH mergers (larger BHs should be much rarer), we arrive at an anticipated annual rate of $10^{-2}$ to 100 detected events. The single LIGO detection in 16 days of data makes the high and low ends of this range extremely unlikely - Abbott et al. (2016b) estimate a rate of 2-400 Gpc$^{-3}$ yr$^{-1}$.
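Putting those numbers together (a sketch using only the figures quoted in this answer):

```python
# Expected detection rate = astrophysical rate density x effective volume.
rate_lo, rate_hi = 0.1, 1000.0   # mergers Gpc^-3 yr^-1 (Abadie et al. 2010 range)
v_eff_10 = 0.1                   # Gpc^3, effective volume for ~10 solar mass mergers
v_eff_30 = 1.5                   # Gpc^3, for ~30 solar mass mergers (these should be
                                 # rarer, so the same rate density overstates their rate)

print(f"10 Msun: {rate_lo * v_eff_10:.2f} to {rate_hi * v_eff_10:.0f} detections per year")
print(f"30 Msun: {rate_lo * v_eff_30:.2f} to {rate_hi * v_eff_30:.0f} detections per year")
# -> 0.01 to 100 per year for 10 solar mass mergers, as quoted above
```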

It is also worth noting that aLIGO will become about two times more sensitive over the coming years, which means it will sample about 8 times the volume.

ProfRob