By Paul Cockshott

The philosopher Althusser said that philosophy represents ideology, in particular religious ideology, to science, and science to ideology. As science extended its field of explanation, a series of ‘reprise’ operations were carried out by philosophers, either to make the findings of science acceptable to religion or to cast doubt on the trustworthiness of science relative to the teachings of the church.
This started with Berkeley’s subjective idealism and extended through to the instrumentalist interpretation of scientific research popularised by Mach in the late 19th century. In more recent years a particular interpretation of quantum mechanics, the Copenhagen one, has provided a rich seam for such reprises. A classic example is given here:
Which are real, waves or particles? On this opinions are divided, but what humans actually perceive in laboratory experiments are particles, or the impacts of particles. Waves are postulated to account for the patterns such impacts make. So while some theorists affirm that probability waves really exist, most physicists have a preference for particles, which at least are actualities, not just probabilities.
But that preference carries with it some unusual implications, very different from those of classical physics. For it seems that particles only really exist when they are observed. John Wheeler says, ‘No elementary phenomenon is a real phenomenon until it is an observed phenomenon’. Philosophers will recall the eighteenth century Anglican Bishop Berkeley’s dictum that ‘to be is to be perceived’. Nothing is real, the Bishop held, unless it exists in the mind of some observer, whether it is some finite spirit or the mind of God.
Known as Idealism, this philosophical view has been unpopular in recent times, partly because science seemed to suggest that nothing exists except material particles, and that the mind is no more than an accidental by-product of the material brain. In a totally surprising way, quantum physics is taken by some to show that Berkeley was more or less right, after all. Nobel Laureate Eugene Wigner writes: ‘The very study of the external world led to the conclusion that the content of the consciousness is an ultimate reality’. Particles only exist when observed, he suggests, and so the reality of particles entails that consciousness is a fundamental element of reality, not just a by-product of some ‘real’ material world. (Gresham Professor of Divinity Keith Ward speaking in 2005)
Having argued that consciousness is the fundamental reality, it is an easy step for Professor Ward to conclude at the end of his lecture that “It moves God much closer to the centre of the scientific view of the world, and makes belief in God, if not compelling, at least highly plausible.”
Does quantum mechanics actually imply what he says?
Well, that is certainly what one historically influential interpretation says. Ward is able to quote Wigner and von Neumann in his defence. But this is fundamentally a philosophical interpretation of quantum theory, not the theory itself. The interpretation can be seen as a continuation of Mach’s instrumentalist views, which were very influential around the turn of the 19th to the 20th century, when the founders of quantum mechanics were starting their careers. According to this view, science was about explaining correlations between measurements on scientific instruments; it could not go beyond this and assume the reality of what its theories described.

Boltzmann had huge difficulties persuading the physics community of his day to accept his theory of statistical mechanics, which depended on the existence of atoms. Mach’s instrumentalism held that atoms were just a convenient fiction. The argument ran: classical thermodynamics can explain what we see on thermometers and the like, so why posit these atoms? It was not until Einstein’s 1905 paper on Brownian motion that Boltzmann was vindicated. If one considers how dependent on the idea of atoms all subsequent solid state physics, organic chemistry, and so on have been, then Mach’s view, and the obstacles Boltzmann encountered, were hardly helpful.
But the point here is that skepticism about the existence of atoms or particles preceded the Copenhagen interpretation of quantum physics on which Ward relies, and was essentially grounded in philosophical methodology.
There has, since the 1950s, been another interpretation available: the many worlds interpretation due to Everett. Suppose we are observing a particle whose spin can be up or down. According to the Copenhagen interpretation, the system evolves according to the wave equation, with multiple possible states each having its own wave amplitude, until it is observed, at which point the wave function collapses and a single value is observed.
In the many worlds view, all these states continue into the future; the collapse of the wave function is a subjective illusion, arising from the fact that we can only observe one of the possibilities at a time. There are multiple universes: in half of them we observe the spin pointing down, and in the other half we observe it pointing up.
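The statistics of such a measurement can be illustrated with a minimal numerical sketch (my own illustration, not from the text): a qubit in an equal superposition has amplitude 1/√2 for each outcome, and the Born rule turns squared amplitudes into observed frequencies, whichever interpretation one prefers.

```python
import math
import random

# Equal-superposition qubit: amplitude 1/sqrt(2) for 'up' and for 'down'.
amp_up = amp_down = 1 / math.sqrt(2)

# Born rule: the probability of each outcome is the squared amplitude.
p_up = amp_up ** 2        # 0.5
p_down = amp_down ** 2    # 0.5
assert abs(p_up + p_down - 1) < 1e-12

# Simulate many repeated measurements of identically prepared qubits.
rng = random.Random(42)
samples = ["up" if rng.random() < p_up else "down" for _ in range(100_000)]
frac_up = samples.count("up") / len(samples)
print(f"fraction observed 'up': {frac_up:.3f}")  # close to 0.5
```

Both interpretations predict exactly these frequencies; they differ only over what happens to the unobserved branch.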
Proponents of the Copenhagen view say this multiplicity of universes is a big price to pay. Surely Occam’s razor would enjoin us to adopt the simpler solution: that the wave function simply collapses on observation.
The Copenhagen view puts the observer at the centre, as the Ptolemaic view did in astronomy. The Copernican revolution introduced, for the first time, the possibility of many worlds around other suns and reduced us as observers to an insignificant portion of the universe. Everett’s many worlds interpretation posits many parallel worlds occupying the same space as us, with our conscious experience being just one of multiple possible threads through this multiverse.
The Everett interpretation is objectivist, and undercuts the attempt to find support for theology in quantum theory. But you might think it is a matter of ‘you pay your money and you take your choice’, with one interpretation as good as another.
The game changer is quantum computing. The whole field stems from Deutsch’s 1985 paper in the Proceedings of the Royal Society. Deutsch’s paper explicitly draws on the Everett hypothesis to justify the proposal for quantum parallelism. He has said that as a young physicist he was inspired by Everett, and his book The Fabric of Reality is a popular exposition of the many worlds view. If one accepts the Everett hypothesis, the idea of quantum parallelism is much easier to arrive at than if one accepts the Copenhagen view. Quantum computing does not depend on the prior advances of semiconductor technology; it has had to invent its basic technology from scratch, and as such, research could as well have started in the 1960s as in the 1990s. So here we have another instance where the dominance of instrumentalism may plausibly have held a field back.
In a conventional computer each bit holds either a one or a zero. In a quantum computer a qubit can hold both values simultaneously. Quantum parallelism uses many threads of the multiverse at once. The difficult part is getting the different threads to interfere so that information can pass from one thread to another. As of now there are only a few quantum algorithms, and they seem to have applications mainly in cryptography.

Grover’s algorithm, for example, can be used to crack passwords by searching through all possible passwords simultaneously. Suppose we have an eight-character password (as used in the DES standard). Since most people will draw their passwords from seven-bit ASCII, the password is effectively 56 bits long. As long ago as the 1970s, when DES was proposed, it was pointed out that, in principle, this lent itself to cracking by massive parallelism. Suppose we can check a candidate DES key in one microsecond using special combinatorial logic, and suppose that the NSA can afford one million such chips, both plausible assumptions. Then we could search through all 2^56 combinations in about twenty hours and, on average, find the password in about ten hours. A single quantum computer running Grover’s algorithm, again performing checks at a microsecond each, could get an answer in around four minutes. It does this by searching all possible passwords in parallel and allowing the different threads of the multiverse to interfere until the probability of ending up in the thread that contains the right answer is high.
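Redoing the arithmetic under the same stated assumptions (one microsecond per check, a million classical chips) makes the contrast concrete: classical search costs on the order of N checks, while Grover’s algorithm needs only on the order of (π/4)·√N iterations on a single machine.

```python
import math

KEY_BITS = 56
N = 2 ** KEY_BITS            # number of candidate 56-bit keys
CHECK_TIME_S = 1e-6          # one microsecond per check (assumption from the text)
CHIPS = 1_000_000            # a million parallel classical chips (assumption)

# Classical exhaustive search: N checks shared across all the chips.
classical_full_h = N * CHECK_TIME_S / CHIPS / 3600
classical_avg_h = classical_full_h / 2   # on average the key is found halfway

# Grover's algorithm: about (pi/4) * sqrt(N) iterations on ONE quantum machine.
grover_iters = (math.pi / 4) * math.sqrt(N)
grover_min = grover_iters * CHECK_TIME_S / 60

print(f"classical, full search: {classical_full_h:.1f} hours")   # ~20 hours
print(f"classical, on average:  {classical_avg_h:.1f} hours")    # ~10 hours
print(f"Grover, single machine: {grover_min:.1f} minutes")       # ~3.5 minutes
```

The quadratic speedup, √(2^56) = 2^28 ≈ 2.7 × 10^8 iterations instead of 7.2 × 10^16 checks, is what turns hours of massively parallel hardware into minutes on one device.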
The parables of the Copenhagen interpretation have a certain plausibility when the intervention of the human observer decides between two values: spin up or spin down. One can just about credit ‘free will’ with being able to do this. But when it is a matter of selecting one out of tens of quadrillions of possible passwords, or the extraction of prime factors using Shor’s algorithm, then one has either to accept the reality of the multiverse or attribute supernatural prescience to the ‘observer’.
Up to now, no one has been able to build quantum computers big enough to run more than toy examples. They require extraordinarily delicate engineering, manipulating individual ions in some designs, and reliability is a huge problem. But they prove the principle; the rest is engineering development.
Paul Cockshott is a computer scientist and political economist working at the University of Glasgow. His most recent books are Computation and its Limits (with Mackenzie and Michaelson) and Arguments for Socialism (with Zachariah). His research includes programming languages and parallelism, hypercomputing and computability, image processing, and experimental computers.
Image credits: (1) George Berkeley portrait by John Smibert, 1727. National Portrait Gallery, Smithsonian Institution. Public domain via Wikimedia Commons. (2) Ludwig Boltzmann portrait, 1902. Public domain via Wikimedia Commons.