SHAVING SCHRÖDINGER’S CAT WITH OCCAM’S RAZOR

The medieval philosopher William of Ockham (c. 1288-1348) is associated with the principle known as Occam’s Razor, which says that “entities should not be multiplied needlessly.” In other words, theorists should follow the advice, attributed to Einstein, that “everything should be made as simple as possible, but not simpler.”


Old William’s principle has many applications to modern physics. I am naturally suspicious of any theory that postulates the existence of mysterious influences at work behind the scenes, especially ones which cannot be verified experimentally. The history of science is littered with discarded ideas like phlogiston, caloric and the luminiferous aether that were postulated to explain phenomena like fire, heat and light waves. These hypothetical substances could not be directly detected and were ultimately shown not to exist, whereupon new theories were devised that made them unnecessary. There was also the mythical planet Vulcan, originally invoked to explain an anomaly in Mercury’s orbit but now totally exploded (except in fiction, where, despite having been imploded, it still explains Leonard Nimoy’s pointy ears).


The moral to be drawn is this: if you can’t detect it, it probably doesn’t exist, and you should look for a new explanation. Any theory which entails layer upon layer of complex undetectable processes is doubly suspect. So let us take Occam’s Razor in hand and apply it to present-day physics. Which of today’s cutting-edge ideas will end up on the cutting-room floor of history, discarded and forgotten like the aether? Remember, the most recent theories are not necessarily the best; they are simply the ones that haven’t yet been around for long enough to get discredited.


Quantum Mechanics


Plenty of descriptions and explanations of the quantum theory are available on the Internet, so I will not waste too many electrons repeating them here. I will simply say that quantum theory has been very good at describing HOW things work at the subatomic level. Its predictions have been verified experimentally to a remarkable degree of accuracy. But it has totally failed to explain WHY things work the way they do. Down at that level, “quantum weirdness” takes over: while we can accurately model the behavior of photons, electrons and the like, we are totally unable to explain what is going on. As the physicists put it, “shut up and calculate.” It is as if the uncertainty principle governs the theory of quantum mechanics itself: the more precisely we can predict something, the less we can understand it.


The basic mathematics of quantum theory can be formulated in a number of ways. For example, Schrödinger’s wave equation assigns a complex number to every point in space-time and then relates how these numbers vary from place to place and from time to time. An alternative formulation, Heisenberg’s matrix mechanics, involves analyzing the transformations of infinite-dimensional matrices. Thus, to model the behavior of a single particle, we need to mathematically analyze an infinite range of numbers. Talk about multiplication of entities! And yet when we try to measure the particle, we find (by the uncertainty principle) that we can only extract a very limited amount of information from it. Were all those numbers really necessary, or should we be looking for a radically simpler formulation?
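For the curious, that first formulation is the standard time-dependent Schrödinger equation for a single particle of mass m moving in a potential V, with ψ the complex-valued wave function assigned to every point in space and time:

$$ i\hbar\,\frac{\partial \psi(\mathbf{r},t)}{\partial t} \;=\; -\frac{\hbar^{2}}{2m}\nabla^{2}\psi(\mathbf{r},t) + V(\mathbf{r},t)\,\psi(\mathbf{r},t) $$

Every point r carries its own value of ψ, which is precisely the infinite range of numbers being complained about.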


Interpretations of Quantum Mechanics


Our models may be complex, but the real trouble begins when we try to interpret what is going on. According to the standard “Copenhagen” interpretation of quantum mechanics, each particle exists as a wave function that defines the probability of its being detected in any particular location. We cannot detect this wave function directly, since any act of measurement “collapses” it down to a single value. Thus a particle has no definite existence, but exists only as a range of possibilities, a probability wave smeared out in space, until it is observed. The act of observation destroys the wave. This interpretation leads to such philosophical absurdities as Schrödinger’s cat, which is shut in a box along with an apparatus rigged to kill it when a radioactive isotope decays, a quantum-driven random process with a 50% chance of occurring within an hour. According to this interpretation, the cat manages to be both dead and alive at one and the same time, until someone opens the box after an hour to check on it. It is only when the cat is observed (thus collapsing its wave function into a definite state) that the Universe “decides” whether it is alive or dead.
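To make the arithmetic concrete, here is a toy sketch of my own (in Python, and certainly not anything from Schrödinger), treating the isotope as an exponential decay with a one-hour half-life as described above; the superposition is only “collapsed” to a definite outcome when the box is opened:

```python
import math
import random

HALF_LIFE_HOURS = 1.0  # the isotope above: a 50% chance of decaying within one hour

def decay_probability(t_hours: float) -> float:
    """Probability that the isotope has decayed (and the cat is dead) after t hours."""
    return 1.0 - 2.0 ** (-t_hours / HALF_LIFE_HOURS)

def open_box(t_hours: float) -> str:
    """'Collapse' the wave function: sample one definite outcome when the box is opened."""
    return "dead" if random.random() < decay_probability(t_hours) else "alive"

p = decay_probability(1.0)
print(f"P(dead) after one hour = {p:.2f}")  # 0.50
print(f"Copenhagen's box contents: {math.sqrt(1 - p):.2f}|alive> + {math.sqrt(p):.2f}|dead>")
print(f"Opening the box: the cat is {open_box(1.0)}")
```

Until open_box is called, the program holds nothing but a probability; the Copenhagen view insists the cat itself is in the same indeterminate condition.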


To my mind this interpretation represents hubris of the highest degree. To say that a system has no objective reality until it is observed is like saying that a tree falling in the forest will only make a noise when someone is near enough to hear it. Why should the laws of physics only apply in the presence of a human observer? And to say that two possible cats (a dead one and a live one) occupy the box for an hour is to multiply entities needlessly. Old William would not have approved. The cat is a live cat unless and until the isotope decays, after which it is a dead cat. We shouldn’t say that both cats are in the box, just because we don’t know which one it contains. (And in any case, as long as it was alive the cat would know it was in the box.)


Another possible interpretation of quantum mechanics is the so-called “many-worlds” interpretation. This rejects the idea that the wave function collapses when we observe the cat. Instead, the entire Universe, including the observer, splits into two: one possible Universe in which she sees the cat as alive, and a parallel Universe in which her counterpart sees the cat as dead. And this splitting does not happen only when we put a cat into a box; it goes on all the time, in every part of every conceivable universe. As a result there are myriad possible universes, all of which are forever splitting into more and more alternate realities. Our Universe follows just one of the impossibly many branching paths in this hyper-universe; all other paths are undetectable to us. (A cosmological variant of this theory hypothesizes that our Universe is just one of many which were formed in the Big Bang, but which later split apart and went their separate ways.) The mere thought of this so-called “multiverse” would have had old William cutting his throat.
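Just to see how fast those entities multiply, here is a back-of-the-envelope count of my own, using a deliberately tiny, made-up event rate: if every binary quantum event doubles the number of branches, the total grows as two raised to the number of events.

```python
# Branch counting under the many-worlds picture sketched above: each binary
# quantum "decision" doubles the number of parallel universes.
events_per_second = 10   # an absurdly low, made-up rate, purely for illustration
seconds = 60
branches = 2 ** (events_per_second * seconds)
print(f"Branches after one minute: {branches:.3e}")  # roughly 4e180 universes
```

Real quantum events happen vastly more often than ten per second, so the true figure would be unimaginably larger.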


Quantum Field Theory and Quantum Electrodynamics


And when we start looking at Quantum Field Theory and Quantum Electrodynamics, which model the interactions between different particles, things get really weird. According to these theories, we cannot know which route a photon or an electron takes as it moves from place to place. Instead, we must calculate and combine the probability amplitudes for every possible route the particle could take. And then we must allow for the possibility that the particle does some weird things along the way, such as emitting and then reabsorbing one or more “virtual” particles. These are particles so short-lived that they cannot be directly detected; they can only be inferred from the theory. (Notice a pattern here?) We must also allow for the possibility of virtual particle-antiparticle pairs being spontaneously created and then annihilated. And after doing all these calculations we must take one more step that, from a mathematical viewpoint, is totally invalid! Known as “renormalization,” this involves canceling out the infinities which are thrown up by the calculations. And yet it gives us answers that agree fantastically well with experiments.
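To illustrate what combining the amplitudes for every possible route actually involves, here is a toy sketch of my own (made-up path lengths and wavelength, nothing remotely like a real QED calculation): each route contributes a complex number whose phase turns with the length of the path, the contributions are added together, and only then is the total squared to give a relative probability of detection.

```python
import cmath

wavelength = 1.0                          # arbitrary illustrative units
path_lengths = [10.0, 10.3, 10.7, 11.2]   # hypothetical routes from source to detector

# Each route contributes a complex amplitude; summing BEFORE squaring is what
# produces the interference effects the full theory is built on.
amplitudes = [cmath.exp(2j * cmath.pi * length / wavelength) for length in path_lengths]
total = sum(amplitudes)
relative_probability = abs(total) ** 2    # unnormalized; only relative values matter here

print(f"Magnitude of the combined amplitude: {abs(total):.3f}")
print(f"Relative detection probability: {relative_probability:.3f}")
```

A real calculation would also have to include the routes with virtual-particle detours described above, which is where the infinities and the renormalization come in.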


What is going on here? Why do we need to assume the existence of all these undetectable entities in order to reach the “right” answer? Why can’t physicists interpret their own theories? It’s all enough to make old William turn in his grave (which presumably contains not only the dead Occam, but also the buried-alive one).


Dark Matter


Finally, the use of Occam's Razor is not restricted to the submicroscopic quantum realm. We can also apply it on astronomical scales, where it is speculated that only about 4% of the matter and energy in the Universe is detectable. The remaining 96% is said to consist of so-called "dark" matter and energy. Cosmologists need to make this wild assumption in order to get their theories to agree with their observations. Now which do you think is more likely: (1) that the Universe consists mostly of some mysterious undetectable form of matter and energy, or (2) that the formulas used in cosmology are somehow off by a factor of about 24? I think I know what old William would have said, and it's not repeatable here.
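For the record, the factor of about 24 is simply the ratio of the invisible share to the visible one:

$$ \frac{96\%}{4\%} = 24 $$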

Nick Mitchell, March 2009
