At the end of a recent discussion (in the comments to this post), Allen and I started to discuss whether quantum computing theory had something to say about the potential for simulating the human mind on a computer. I then read a couple of review articles on quantum computing he referenced. Before getting to what I learned (below), I wanted to explain my prior philosophical view on the simulation question.
The Russellian Stance, Functionalism, and Simulation
I endorse a form of the Russellian approach to solving the hard problem of consciousness. Russell described the world as a causal network of events, and he noted that physical theory only describes the extrinsic or dispositional nature of these events. These events also have an intrinsic nature which ultimately grounds the qualitative and experiential character of consciousness. I think quantum mechanics provides support for this view: quantum measurements seem to fit perfectly into this Russellian picture as the base-level events which ultimately underpin both the physical and experiential facts.
Functionalism is the thesis that the mind can be described as an abstract causal system. As a practical matter, the functionalist’s description is taken at a coarse-grained level – i.e. there is some minimal scale below which the actual physical details of the brain/body system are assumed to be irrelevant to its function. It follows that such a functional model could be realized in any number of physical ways, including via computer. Computationalism is the variety of functionalism which pursues the computer modeling approach.
Now, I think the Russellian stance on functionalism and the potential for simulation is nuanced. On the one hand, functionalism is seen as misguided because it only considers extrinsic causal structure. On the other hand, unlike an old-fashioned dualist, the Russellian shouldn’t rule out the possibility that the mind could be simulated. The mind, after all, is a product of a natural system – we don’t need extra immaterial stuff to explain it. Perhaps a simulation can get the functional structure right and the correct intrinsic experiential character will come along “for free.”
The problems come with the coarse-graining. In every functionalist account I’ve seen, this takes place at a scale where quantum mechanics is assumed to be safely irrelevant. But every process in the body is ultimately grounded in molecular, atomic, and sub-atomic activity which must be described quantum mechanically. So a coarse-grained, approximated simulation of the brain/body’s causal structure on some physical device would likely miss crucial details which lie at the quantum level (details I think are simultaneously crucial to both extrinsic function and conscious experience).
How would a functionalist respond to the quantum question? First, many believe distinctive quantum phenomena effectively “wash out” in a macroscopic system like the human brain/body. This belief is often based on the presumed impact of environmental decoherence. I’m not going to pursue this issue in this post (I discuss this in some of my posts on quantum biology). Another response to the quantum question is an appeal to a commonly believed thesis that any physical system (including a quantum system) can be modeled by a classical computer -- so the traditional functionalist/computationalist approach wouldn’t be missing anything distinctive anyway. This is the view that I wanted to explore by reading up on quantum computing theory.
[Please note the discussion that follows may suffer even more than usual from my ignorance of the subject matter.]
Simulation and the Church-Turing thesis
So, is it true that any physical system, including a quantum system, can be simulated by a classical computer? Well, this idea has been defended by appeals to versions of the Church-Turing thesis. The original Church-Turing thesis states (in a formulation from this article) that any effectively calculable function can be computed using a Turing machine. (For a description of a Turing machine, see here.) What this thesis really meant can probably only be appreciated by studying its original logical/mathematical context. In his SEP article on the C-T thesis, Jack Copeland first traces the development of the ideas associated with the thesis in the pioneering work on calculation and computing by Turing, Church and others beginning in the 1930s. Then, Copeland spends much of the rest of the article objecting to how the thesis has been misunderstood and misused by philosophers and computing theorists. The C-T thesis did not purport to say that all physical systems, or even all machines regardless of architecture, could be simulated by a Turing machine. It certainly did not prove anything of the sort. (This discussion reminded me of similar debates over the philosophical applicability of Gödel’s incompleteness theorems beyond their original context – see old posts here and here.)
Nevertheless, as long as one is careful not to inappropriately invoke the authority of the original C-T thesis, one can explore more expansive versions and try to evaluate their validity.
Physical Versions of the C-T thesis
In her review article on Quantum Computing, Dorit Aharonov presents two versions of the thesis. First, (p.3), she presents a simple “physical” version: “A Turing machine can compute any function computable by a reasonable physical device.” She says this is something which cannot be proven, but that no known counterexamples exist. In particular, quantum computers are not believed capable of computing functions non-computable by a Turing machine.
She then quickly notes that: “However, in the theory of computation, we are interested not only in the question of which functions can be computed, but mainly in the cost of computing these functions.” [Emphasis original] The way this is evaluated is by noting whether the computational resources needed rise as a polynomial or an exponential function of input size. It is the former that are considered tractable.
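To make the polynomial/exponential divide concrete, here is a minimal sketch (the specific cost functions n³ and 2ⁿ are my own illustrative choices, not from Aharonov’s article):

```python
def polynomial_cost(n: int) -> int:
    """Steps needed by a hypothetical algorithm with n^3 cost."""
    return n ** 3

def exponential_cost(n: int) -> int:
    """Steps needed by a hypothetical algorithm with 2^n cost."""
    return 2 ** n

# As input size grows, the exponential algorithm's cost quickly
# dwarfs the polynomial one's, which is why only polynomial-cost
# computations are counted as tractable.
for n in (10, 50, 100):
    print(n, polynomial_cost(n), exponential_cost(n))
```

At n = 100 the polynomial algorithm needs a million steps, while the exponential one needs 2¹⁰⁰ – more steps than any physical computer could ever perform.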
After also discussing the superior efficiency of a probabilistic version of the Turing machine, she presents another thesis to consider (p.4): “The modern Church thesis: A probabilistic Turing machine can simulate any reasonable physical device in polynomial cost.” We have a great deal of evidence, though not proof, that this thesis is contradicted by quantum computers.
The Advantages of Quantum Algorithms
Aharonov explains, first, that quantum computers can simulate classical computers at little loss in efficiency. On the flip side, it appears classical computers can simulate quantum computers, but only at exponential cost. What we really want, though, is a positive demonstration of how far quantum computers can outperform their classical counterparts.
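The exponential cost is easiest to see in the most direct classical simulation strategy: storing the full quantum state. A general n-qubit state is a vector of 2ⁿ complex amplitudes, so memory alone blows up exponentially (a rough sketch of one simulation approach, not the only or best one):

```python
def amplitudes_needed(n_qubits: int) -> int:
    """A general n-qubit state requires 2^n complex amplitudes."""
    return 2 ** n_qubits

# At 16 bytes per complex amplitude, a mere 50 qubits already need
# 2^54 bytes (about 16 pebibytes) just to hold the state vector,
# before any computation is done on it.
bytes_needed = 16 * amplitudes_needed(50)
```

Cleverer classical methods exist for special cases, but no known general-purpose classical simulation escapes this exponential scaling.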
The a priori expectation might be that the ability to manipulate qubits, which can be in a superposition of states as opposed to just two discrete states, would lead to great increases in computing power. Because of the necessity of conducting a measurement to extract results (collapsing superpositions), however, the power of this idea is muted. Other, more subtle sources of limitations on quantum computing are discussed later in the paper.
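A toy simulation illustrates the limitation: an n-qubit register holds 2ⁿ amplitudes, but a measurement yields only a single n-bit outcome, sampled with probability equal to the squared amplitude magnitude. (This follows the standard textbook measurement rule; the function names are my own.)

```python
import random

def measure(amplitudes):
    """Collapse a normalized state vector to a single basis-state index,
    sampled with probability |amplitude|^2 (the Born rule)."""
    probabilities = [abs(a) ** 2 for a in amplitudes]
    r = random.random()
    total = 0.0
    for index, p in enumerate(probabilities):
        total += p
        if r < total:
            return index
    return len(amplitudes) - 1  # guard against floating-point round-off

# An equal superposition over two qubits: four amplitudes "in play,"
# yet measuring returns just one of the four basis states (two bits).
state = [0.5, 0.5, 0.5, 0.5]
outcome = measure(state)
```

The art of quantum algorithm design lies in arranging interference among the amplitudes so that the one measurement you get is likely to be a useful answer.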
Despite this, however, many investigations into quantum computing over the years have found quantum algorithms which improve efficiency. One of these, Shor’s algorithm, factors integers at polynomial cost, where all known classical algorithms have super-polynomial cost, thus crossing the crucial boundary. It must be pointed out that as of yet there is no proof that a classical polynomial algorithm for factoring is impossible.
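For concreteness, the naive classical baseline shows why cost is measured against input *length*: trial division takes roughly √N steps, which is 2^(b/2) for a b-bit number – exponential in the number of bits. (Shor’s actual algorithm is far more sophisticated; this sketch only illustrates the classical baseline it beats.)

```python
def trial_division(n: int) -> int:
    """Return the smallest nontrivial factor of n (or n itself if prime).
    Takes O(sqrt(n)) steps -- exponential in the bit-length of n."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n

# For a b-bit number, sqrt(n) is about 2^(b/2) steps: adding just two
# bits to the input doubles the worst-case running time.
assert trial_division(15) == 3
```

Better classical algorithms (such as the number field sieve) improve on this substantially, but all remain super-polynomial in the bit-length, which is what Shor’s polynomial-cost algorithm would overcome.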
Most of what I’ve discussed in Aharonov’s article above comes from the introduction. In the ensuing 60 pages she goes into more detail about the nature of computers and computing, various models for quantum computing algorithms, the issues of noise correction and fault tolerance, and some of her own ideas of what quantum computing theory says about the boundary between quantum and classical regimes in physics.
On the key question of what quantum computers can do better than classical ones, one is left with the impression that the question is much more subtle than might first be imagined. We have some exciting theoretical results, but perhaps fewer than might have been anticipated on a naïve expectation. At the same time, it seems we’re still in the early phase of growth in our knowledge of the field. A lot of interesting work and new developments lie ahead. (The engineering efforts underway toward building quantum computers and the challenges they face are another interesting topic.)
Let me return to the question of what all this might mean for the philosophy of mind, assuming (as I do) that the quantum-level grounding of biology contributes meaningfully to the mind’s function. I think a modest conclusion is called for. The fact that a classical computer can simulate a quantum computer only at exponential cost suggests that the project of simulating a human mind is impractical, though not blocked in principle. This conclusion is broadly consistent with my philosophical stance regarding the simulation project.
[P.S.: after drafting this I recalled there was a good debate (thanks to Tanasije and Mike) on the simulation topic in the comments to this May 2008 post on Russellian theory. My memory is awful.]