Despite all the incredible practical success of quantum technology, there was still an incompleteness about quantum mechanics' interpretation. The trouble had to do with reconciling the quantum world with the macroscopic classical world. It wasn't just a matter of a different set of equations; logic itself was different. John Bell proved this when he published what became known as Bell's inequality (1964). He came up with a simple relation, essentially:
N(A, ~B) + N(B, ~C) ≥ N(A, ~C)
This video by Leonard Susskind explains it best – "the number of things in A and not B plus the number of things in B and not C is greater than or equal to the number of things in A and not C". It's easy to visualize with Venn diagrams and straightforward to prove mathematically, just like a theorem of set theory. It involves no physical assumptions, just pure mathematics. But it turns out quantum mechanics doesn't obey it! (See also Hardy's paradox (1992) for a really good brain teaser.)
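To see why the classical side of the argument is airtight, here is a minimal sketch (Python, purely illustrative and not from the original) that assigns every member of a population definite yes/no values for properties A, B, and C – the classical-realism assumption – and checks that the counting inequality never fails:

```python
import random

# Classical realism: every item has definite A, B, C values before we look.
def random_population(n):
    return [(random.random() < 0.5,
             random.random() < 0.5,
             random.random() < 0.5) for _ in range(n)]

def count(pop, pred):
    return sum(1 for item in pop if pred(item))

for trial in range(1000):
    pop = random_population(100)
    n_a_not_b = count(pop, lambda x: x[0] and not x[1])   # N(A, ~B)
    n_b_not_c = count(pop, lambda x: x[1] and not x[2])   # N(B, ~C)
    n_a_not_c = count(pop, lambda x: x[0] and not x[2])   # N(A, ~C)
    # Bell's counting inequality: always true when properties are definite.
    assert n_a_not_b + n_b_not_c >= n_a_not_c
print("Inequality held in every trial, as set theory guarantees.")
```

No matter how the three properties are assigned, the inequality holds; the quantum violation arises precisely because entangled particles cannot be assigned all three properties at once before measurement.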
The trouble with quantum mechanics is that classical logic does not apply, because the quantum world does not have the property of realism. Realism means that the things around us exist independently of whether we observe them. If there are mathematical sets A, B, and C, those sets exist independently of the mathematician. In the quantum world, observing set A can change sets B and C, and the order in which we observe the sets matters too. Realism means the proverbial tree that falls in the forest makes a sound whether we hear it or not. In the quantum world that's not true: the tree exists in a superposition of states, both making and not making a sound, until someone, or something, observes it. This does not sound like a very plausible description of our everyday experience, though. From early on we all learn that nobody really disappears when we play "peek-a-boo"! It's almost axiomatic. Realism does seem to be a property of the macroscopic universe. So, what gives?
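The point about the order of observations can be made concrete with a tiny calculation (an illustrative sketch, not from the original), using the standard Pauli spin observables for a single qubit: filtering for "spin up along z" and then "spin up along x" gives different statistics than the reverse order, because the two observables do not commute.

```python
import numpy as np

# Pauli spin observables for a single qubit.
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)

# Projectors onto "spin up along z" and "spin up along x".
P_z_up = np.array([[1, 0], [0, 0]], dtype=complex)
P_x_up = 0.5 * np.array([[1, 1], [1, 1]], dtype=complex)

psi = np.array([1, 0], dtype=complex)          # start in the |up_z> state

# Probability of passing the z-filter and then the x-filter...
p_zx = np.linalg.norm(P_x_up @ P_z_up @ psi) ** 2
# ...versus the x-filter and then the z-filter.
p_xz = np.linalg.norm(P_z_up @ P_x_up @ psi) ** 2

print(p_zx, p_xz)                      # 0.5 vs 0.25: the order of observation matters
print(np.allclose(sz @ sx, sx @ sz))   # False: the observables do not commute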
The most common interpretation of quantum mechanics was the Copenhagen interpretation. It said that the wave function "collapses" upon measurement, with the outcome probabilities given by the Born rule. It was successful in the sense that it worked: we could accurately predict the statistics of measurement results. Still, this was a band-aid on an otherwise elegant theory, and the idea of holding two entirely different logical views of the world was unsettling. Some physicists dissented and argued that it was not the responsibility of physicists to interpret the world; it was enough to have the equations to make predictions. This paradox became known as the quantum measurement problem and remained one of the great unsolved mysteries of physics for the better part of a century. In the 1970s the theory of decoherence was developed. It helped physicists understand why it is hard to keep things entangled, in a superposition, but it didn't explain how things transition to a definite state upon measurement; it only partially addressed the problem. In fact, many brilliant physicists gave up on the idea of one Universe: to them it would take an infinite spectrum of constantly branching parallel Universes to understand quantum mechanics. This became known as the many-worlds interpretation.
Figure 17: Excellent video introduction to quantum entanglement by Ron Garret entitled "The Quantum Conspiracy: What Popularizers of QM Don't Want You to Know". Garret's argument is that measurement "is" entanglement. We now understand entanglement to be the first step in the measurement process, followed by asymptotic convergence to pointer states of the apparatus.
In 2013 A. Allahverdyan, R. Balian, and T. Nieuwenhuizen published a ground-breaking paper entitled "Understanding quantum measurement from the solution of dynamical models". In this paper the authors showed that the measurement problem can be understood within the context of quantum statistical mechanics alone: pure quantum mechanics and statistics. No outside assumptions, no wave-function collapse. All smooth, time-reversible, unitary evolution of the wave function. The authors show that when a particle interacts with a macroscopic measuring device, in this case an ideal Curie-Weiss magnet, it first entangles with the billion-billion-billion (~10^27) atoms in the device, momentarily creating a vast superposition of states. Then two extreme cases are examined. First, if the coupling to the measuring device is much stronger than the coupling to the environment, the system cascades asymptotically to a pointer state of the device. This gives the appearance of wave-function collapse, but it is not that; it is a smooth convergence, maybe like a lens focusing light to a single point. This is the case when the number of atoms in the measuring device, each carrying a magnetic moment, is large. At first this seems a counter-intuitive result. One might expect the entanglement to keep spreading throughout the device and into the environment in an increasingly chaotic and complex way, but this does not happen. The mathematics proves it.
In the second extreme, when the coupling to the environment is much stronger, the system decoheres before the entanglement can cascade to a pointer state, and so it fails to register a measurement. This is the case when the number of atoms in the measuring device is small.
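A toy calculation (a sketch only, not the authors' full Curie-Weiss dynamics, and the single-atom overlap value below is an assumed number for illustration) shows why the size of the apparatus matters so much: once the spin is entangled with the N-atom device, the interference term between the two outcomes is weighted by the overlap of the two N-atom pointer states, which dies off exponentially with N.

```python
import numpy as np

# Toy illustration: if each atom's two pointer configurations overlap by a
# factor r < 1, the overlap of the full N-atom pointer states is r**N, and
# that factor multiplies the interference ("off-diagonal") term between the
# two measurement outcomes.

r = 0.99   # assumed single-atom overlap, chosen only for illustration
for N in [1, 10, 100, 1_000, 1_000_000]:
    coherence = r ** N
    print(f"N = {N:>9,d} atoms  ->  residual interference ~ {coherence:.3e}")

# For a macroscopic magnet (N ~ 1e27) the interference term is, for all
# practical purposes, zero: the device settles into one pointer state and the
# smooth, unitary evolution *looks* like a collapse.  For very small N the
# suppression is weak, environmental decoherence wins first, and no stable
# pointer state (no measurement) is registered.
```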
The authors' results are derived for a particle's spin interacting with a particular measuring device, but they appear completely general. In other words, it may be that measurements in general, like the cloud-chamber photographs of particle physics or the images of molecular spectroscopy, are just asymptotic pointer states: no more wave-particle duality, just wave functions, more or less localized. It means that the whole of the classical world may just be an asymptotic state of the much more complex quantum world. Measurement happens often because pointer states are abundant, so the convergence gives the illusion of realism. And, in the vast majority of cases, this approximation works great. Definitely don't stop playing "peek-a-boo"!
It may turn out that biological systems occupy a middle ground between these two extremes: many weak couplings but not so many strong ones. Lots of densely packed quantum states, but a distinct absence of pointers. In such a system, superpositions could potentially be preserved for longer time scales, because the rate at which entanglement propagates through the system may equal the rate of decoherence. It may even be that no individual particle stays entangled for long, but that a dynamic wave of entanglement – an entanglement envelope – followed by a wave of decoherence describes the quantum dynamics. A dynamic situation in which entanglement is permanent, but always on the move.