
To measure the lifetime of a specific kind of particle, one needs to look at very many such particles in order to calculate the average. It cannot matter when the experimentalist actually starts his stopwatch to measure how long the particles take to decay. Whether he measures now or in 5 minutes makes no difference, since he still takes an average. If he measures later, some particles are already out of the picture (those that decayed in the last 5 minutes), so they won't contribute, and the ones he is measuring now behave statistically just the same, of course.
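To make this concrete, here is a small simulation sketch (assuming the standard exponential-decay model, with an illustrative muon-like mean lifetime of 2.2 microseconds) showing that starting the stopwatch late does not change the measured average:

```python
import random

random.seed(0)
TAU = 2.2  # illustrative muon-like mean lifetime, in microseconds

# Draw exponentially distributed lifetimes for many muons.
lifetimes = [random.expovariate(1 / TAU) for _ in range(100_000)]
mean_all = sum(lifetimes) / len(lifetimes)

# Start the stopwatch 3 microseconds late: keep only the muons still
# alive at t0 and measure how much *longer* each of them lives.
t0 = 3.0
remaining = [t - t0 for t in lifetimes if t > t0]
mean_remaining = sum(remaining) / len(remaining)

print(f"mean lifetime from creation: {mean_all:.2f}")
print(f"mean remaining life at t0:   {mean_remaining:.2f}")
```

Both averages come out to about 2.2, which is exactly the "it doesn't matter when you start measuring" intuition.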

I have just read the following in Introduction to Elementary Particles by Griffiths:

Now, elementary particles have no memories, so the probability of a given muon decaying in the next microsecond is independent of how long ago that muon was created. (It's quite different in biological systems: an 80-year-old man is much more likely to die in the next year than is a 20-year-old, and his body shows the signs of eight decades of wear and tear. But all muons are identical, regardless of when they were produced; from an actuarial point of view they're all on an equal footing.)

But this is not really the view I had. I was imagining that a particle that has existed for a while is analogous to the 80-year-old man, since it will probably die (decay) soon. It just doesn't matter, because we are looking at a myriad of particles, so statistically there will be about as many old ones as babies. On the other hand, it is true that I cannot see whether a specific particle has already lived long or not; they are all indistinguishable. Still, I am imagining particles as if they had an inner age, one that just can't be told from their looks. So is the view presented in Griffiths truer than mine, or are maybe both valid?

How can one argue why my view is wrong?

Jack

7 Answers

72

It's impossible to say whether you are correct or Griffiths is correct a priori -- that is, before having any experience of how the world works. You need to do experiments, and Griffiths' version agrees with experiments better than yours.

The basic experiment involves detecting the products of decayed particles. Suppose we have some process that happens pretty quickly and produces a certain amount of some unstable particle. The particles were all produced at basically the same time, so they all have basically the same "age".

Now, if we just measure how many decays happen per second, Griffiths' version predicts that you'll see an exponential falloff in that number -- as Griffiths does a fine job of explaining. And that's what we actually see when we do experiments like this.

In your version, you would expect to see very few decays until some fixed time after the production, then a sudden rush of decays, and then the decays would mostly stop because the particles would basically all be gone. But that's just not what we see.
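A toy simulation (with made-up numbers, assuming nothing but a constant per-step decay probability) reproduces Griffiths' prediction: the number of decays per time step falls off geometrically, the discrete analogue of exponential decay:

```python
import random

random.seed(1)
N = 50_000      # particles, all created at t = 0, so all the same "age"
P_DECAY = 0.05  # chance of decaying in any one time step (memoryless)

alive = N
decays_per_step = []
for step in range(40):
    decayed = sum(1 for _ in range(alive) if random.random() < P_DECAY)
    alive -= decayed
    decays_per_step.append(decayed)

# Memorylessness predicts each step's count is a constant fraction of
# the previous one: decays[n+1] / decays[n] ≈ 1 - P_DECAY = 0.95.
ratios = [decays_per_step[i + 1] / decays_per_step[i] for i in range(10)]
print([round(r, 2) for r in ratios])
```

The ratios hover around 0.95 from the very first step: there is no quiet period followed by a rush of decays, just a steady exponential falloff.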

Again, there's no reason the laws of the universe have to work the way Griffiths says, and not work the way you say. It's just that the predictions of the two versions are testable, and only Griffiths' version agrees with experiments. That's science!

Mike
11

Your thinking is analogous to the Gambler's Fallacy.

That is, the false belief that because HEADS hasn't come up in the last six coin tosses, it is somehow more likely to come up on the next toss.

The truth is that the event is independent of any prior events.

However, that doesn't mean that there aren't any old particles out there.

With numbers of astronomical magnitude, there is going to be one coin that has come up heads for the entire 13 billion years of its existence.
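This independence is easy to check numerically. The sketch below (a fair-coin simulation) estimates the probability of heads immediately after a run of six tails:

```python
import random

random.seed(2)
flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

# Among all flips preceded by a run of at least six tails, how often
# does heads come up?  The gambler's fallacy says "more than 50%".
hits = total = 0
run = 0  # length of the current run of tails
for f in flips:
    if run >= 6:
        total += 1
        hits += f
    run = 0 if f else run + 1

print(f"P(heads | six tails in a row) ≈ {hits / total:.3f}")
```

The estimate comes out at about 0.5: the coin has no memory of its previous six tosses.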

8

It all comes down to indistinguishability. Even in principle we cannot differentiate one identical particle from another. If particles had some sort of inner workings for measuring time, then this piece of information would have to be encoded in them somehow. That would then be a differentiating factor, which contradicts indistinguishability.

On a deeper level, all particles are really excitations of all-permeating quantum fields. So an appropriate analogy is a sheet in a draft: the draft produces a bump propagating along the sheet. There is no sense in talking about the age of the bump, because bumps at different places are distinct, featureless entities; particles are just manifestations of the more fundamental underlying fields. In a further analogy to this video, you could say that determining the age of a particle is the same as determining the age of the number $3$ (and not in a way that relates to human culture).

mgphys
7

I'd like to add (1) something I'd like the OP to take away from this discussion, and (2) a "counterexample" whose contemplation will serve to strengthen the other answers, which I must say are all excellent.

  1. Firstly, it should be emphasised that the exponential distribution is the $\mathbf{unique}$ distribution that is memoryless in the sense spoken about in Mike's, Chris's and Manishearth's answers. In other words, to experimentally verify that a particle doesn't "heed its age", you look for the exponential distribution. If you see it, then you are observing memorylessness, as stated in the other answers. But it's stronger than this: if you don't see an exponential distribution, you $\mathbf{know}$ there is some memory of age present.

     To understand this uniqueness, we encode the memorylessness condition into the basic probability law $p(A,B) = p(A) \, p(B|A)$. Suppose that after time $\delta$ you observe that your particle has not decayed (event $A$). If $f(t)$ is the probability distribution of lifetimes, then the probability that the particle has lasted at least this long is $1-\int_0^\delta f(u)\,du$. The a priori probability that the particle lasts until time $t+\delta$ and then decays in the time interval $dt$ is $f(t+\delta)\, dt$. (This is events $A$ and $B$ observed together, which is the same as plain old $p(B)$, since the particle cannot last until time $t + \delta$ without living to $\delta$ first!) Hence the conditional probability density function is $p(B|A) = \frac{f(t+\delta)\,dt}{1-\int_0^\delta f(u)du}$.

     By the assumption of memorylessness, this must be the same as the unconditional probability density that the particle lasts a further time $t$ measured from any starting moment. Thus we must have $\left(1 - \int_0^\delta f(u)du\right)\,f(t) = f(t+\delta)$ for all values of $\delta$. Letting $\delta\rightarrow 0$, we get the differential equation $f^\prime(t) = - f(0) f(t)$, whose unique normalised solution is $f(t) = \frac{1}{\tau}\exp\left(-\frac{t}{\tau}\right)$ with $\tau = 1/f(0)$. You can readily check that this function fulfills the general functional equation $\left(1 - \int_0^\delta f(u)du\right)\,f(t) = f(t+\delta)$ for any $\delta > 0$ as well.
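     That final check can also be done numerically. This short sketch (with an arbitrary $\tau = 2$) verifies $\left(1 - \int_0^\delta f(u)du\right)\,f(t) = f(t+\delta)$ at a few sample points:

```python
import math

TAU = 2.0

def f(t):
    """Exponential lifetime pdf with mean TAU."""
    return math.exp(-t / TAU) / TAU

def F(t):
    """Its CDF, i.e. the integral of f from 0 to t."""
    return 1 - math.exp(-t / TAU)

# Check the memorylessness functional equation
#   (1 - ∫_0^δ f(u) du) · f(t) = f(t + δ)   at several (t, δ) pairs.
for t in (0.3, 1.0, 4.7):
    for delta in (0.1, 1.5, 3.0):
        lhs = (1 - F(delta)) * f(t)
        rhs = f(t + delta)
        assert abs(lhs - rhs) < 1e-12, (t, delta)
print("exponential pdf satisfies the memorylessness equation")
```

The equality is exact for the exponential, and (by the uniqueness argument above) fails for every other distribution.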

  2. There are "particles" that do remember their age, although they're not fundamental particles and almost certainly don't qualify for what the OP is thinking of. But they illustrate the other answers by showing what fundamental particles would need if they were to remember their age. Consider an excited fluorophore (instead of a quantum field in a raised state): fluorophores generally undergo one or more changes of state in their fluorescence process. We can think of the excitation as a pseudoparticle - a quantum superposition of free photons and raised matter states - in the same way as a polariton is thought of as a pseudoparticle. Real fluorophores are more complicated: the quantum superposition involves states other than simply the primary excited state and photon, so there is an internal state to record the "particle's" "age". I have drawn below a schematic diagram of the energy levels for something like fluorescein. The fluorophore generally gets raised to a higher level than the one it will fluoresce from, and thus undergoes a series of decays between these higher states before dropping back to the ground state (or, more often, to something in a band just above the ground state). So the total fluorescence lifetime is the sum of several memoryless waiting times, and its pdf - the pdf of a sum of independent exponential variables - is the convolution of the individual exponential pdfs.

[Figure: schematic energy levels of fluorescein fluorescence]

If there is one dominant higher energy state with lifetime $\tau_1$ and the main fluorescence transition has lifetime $\tau_2$, then the pdf for the overall lifetime is $\frac{1}{\tau_1\,\tau_2}\int_0^t e^{-\frac{u}{\tau_1}} e^{-\frac{t-u}{\tau_2}} du =\frac{e^{-\frac{t}{\tau_1}}-e^{-\frac{t}{\tau_2}}}{\tau_1-\tau_2}$, and I have drawn a sample function of this kind below for $\tau_1 = 1$ unit and $\tau_2 = 10$ units. Mostly, raised fluorescence states look like memoryless particles in practice, because the higher states are so short-lived compared with the lowest singlet state, but there are some for which the behaviour below is quite observable: there is a time throughout which an excited population is quiet, then the fluorescence comes in a rush, then goes quiet again.

[Figure: double-exponential lifetime pdf for $\tau_1 = 1$, $\tau_2 = 10$]
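As a numerical sanity check of the convolution formula, one can simulate the two-stage decay directly (with $\tau_1 = 1$ and $\tau_2 = 10$ as in the figure) and compare a histogram estimate against the closed form:

```python
import math
import random

random.seed(3)
TAU1, TAU2 = 1.0, 10.0

# Total lifetime = exponential wait in the upper state, plus an
# independent exponential wait in the fluorescing state.
samples = [random.expovariate(1 / TAU1) + random.expovariate(1 / TAU2)
           for _ in range(200_000)]

def pdf(t):
    # Closed form of the convolution of the two exponential pdfs.
    return (math.exp(-t / TAU1) - math.exp(-t / TAU2)) / (TAU1 - TAU2)

# Compare an empirical histogram bin with the closed-form density.
lo, hi = 5.0, 6.0
empirical = sum(lo <= s < hi for s in samples) / len(samples) / (hi - lo)
analytic = pdf(5.5)
print(f"empirical ≈ {empirical:.4f}, closed form ≈ {analytic:.4f}")
```

The two numbers agree to within sampling noise, and plotting `pdf` shows the quiet-then-rush-then-quiet shape described above: the density is zero at $t = 0$, unlike a memoryless exponential.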

Selene Routley
5

If you feel that the particle should decay faster because it has already lived long and may be approaching or may have passed the average lifespan, this is the Gambler's Fallacy.

The average age of a particle can be derived from the knowledge that the particle has an x% chance of decaying in every small interval of time.

For example, if you have a die and you stop rolling the moment you get a 6, the average length of your game is 6 rolls.

However, when calculating that average, you include the possibility that the game ends in a single roll, the possibility that it ends in two rolls, and so on.

Now, suppose that after a single roll of the die you get a number that is not a 6. At first it seems obvious that, with one roll used up, the game should now end in about 5 more rolls.

However, there's a catch here -- you now have some information, and that information is that the first roll was not a 6. Probabilities change as you gain information. Your initial calculation no longer applies, because it included the possibility that the first roll would yield a 6. Starting fresh from the second roll, the expected number of further rolls is again 6.
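A quick simulation (assuming a fair die) makes this concrete -- it estimates both the unconditional game length and the expected number of further rolls after a failed first roll:

```python
import random

random.seed(4)

def rolls_until_six():
    """Roll a fair die until a 6 appears; return the number of rolls."""
    n = 0
    while True:
        n += 1
        if random.randrange(1, 7) == 6:
            return n

games = [rolls_until_six() for _ in range(100_000)]
mean_total = sum(games) / len(games)

# Condition on "the first roll was not a 6": how many MORE rolls
# does the game last from that point on?
survivors = [g - 1 for g in games if g > 1]
mean_more = sum(survivors) / len(survivors)

print(f"average game length:          {mean_total:.2f} rolls")
print(f"average further rolls needed: {mean_more:.2f} rolls")
```

Both averages come out to about 6: having already survived one roll does not shorten the expected remaining wait.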

The same concept applies here. Experimentally, particles follow an exponential decay distribution: the probability density for a particle to decay at time $t$ is $\lambda e^{-\lambda t}$, in other words the particle has the same chance $\lambda\, dt$ of decaying in any short interval $dt$. This distribution encodes the fact that the "probability of a particle decaying in the next moment" is constant, and thus the particle has no "age" that affects the decay.

Either way, an "age" would become a degree of freedom, which would affect the thermodynamic properties of the particle.

Also, when you model these particles mathematically, they all come out to be equivalent. I can swap two muons and I haven't changed the system at all.

Manishearth
4

Let me put it this way:

Over its lifetime, nothing inside the particle changes.

The only reason it dies is truly random: something inside it randomly tunnels out of the particle, and it no longer exists in its previous form. The probability of this random process is the same for each and every nanosecond of the particle's life.

BarsMonster
3

You quoted the following phrase from Griffiths' book: "elementary particles have no memories, so the probability of a given muon decaying in the next microsecond is independent of how long ago that muon was created." It follows from this phrase that decay is exponential (the number of surviving particles depends on the time exponentially). This is a very good approximation, however, strictly speaking, this is not a precise law. As Khalfin noted half a century ago (please see some references in http://arxiv.org/abs/quant-ph/0408149 ), exponential decay is, strictly speaking, incompatible with quantum mechanics, so there must be deviations from exponential decay for very short and very long times. Such deviations have been found experimentally for very short times (http://george.ph.utexas.edu/papers/tunnelling.pdf ), but not for very long times (as far as I know): the deviations for very long times are difficult to observe, but there is little doubt that such deviations do exist, as quantum mechanics is in very good agreement with experiments.

As for Griffiths' phrase, I think it is warranted anyway because it is an established practice to give a simplified description in textbooks.

EDIT (07/07/2013): Looks like nonexponential decay of metastable states for long times was demonstrated experimentally: http://dro.dur.ac.uk/4234/1/4234.pdf .

akhmeteli