Tuesday, June 27, 2006

Interpreting Quantum Probability

I have been reading more about the philosophical interpretations of probability and how this topic relates to the interpretation of quantum mechanics (QM). If we assume hidden-variable theories don’t work, some notion of probability is at the core of QM.

I had previously read this SEP article on the interpretation of probability. It provides a good overview of the history of the issue among philosophers, and reveals a wide variety of ideas under past and present consideration. Among physicists, however, debate on interpreting probability seems to center broadly on two conceptions: a frequency interpretation and a Bayesian interpretation.

I thought this recent post at physics musings was a good one on this topic, and I also benefited from the links (including the one to this John Baez page). Then, a recent post at Quantum Quandaries provided a link to this paper by Marcus Appleby. In it Appleby argues persuasively for the Bayesian conception and considers the implications for interpreting QM.

The frequentist conception appeals because it is intended to be objective: if we could repeat an experiment an infinite number of times, we would empirically fill out the probability distribution of outcomes. The problem is that we can’t do this. The Bayesian interpretation is epistemic: it shows how, given one’s prior probability distribution, a measurement outcome serves to update it. Appleby argues that an epistemic conception is unavoidable. He shows the problems with frequentist conceptions that try to provide an interpretation for finite ensembles; here, one attempts to focus only on a pragmatically relevant finite subset of outcomes. However, this choice of subset is influenced by the context of the situation and the biases of the chooser, and thus reintroduces the subjective element.
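The update the Bayesian has in mind is just Bayes’ rule. As a minimal sketch (the coin example, the grid of candidate hypotheses, and the function name here are my own illustration, not anything from Appleby’s paper): you start from a subjective prior over hypotheses, weight each hypothesis by how likely it makes the observed outcome, and renormalize.

```python
def bayes_update(prior, likelihood):
    """Return the posterior over hypotheses given observed data.

    prior: dict mapping hypothesis -> prior probability
    likelihood: dict mapping hypothesis -> P(data | hypothesis)
    """
    # Weight each hypothesis by how well it predicts the data...
    unnormalized = {h: prior[h] * likelihood[h] for h in prior}
    # ...then renormalize so the posterior sums to 1.
    total = sum(unnormalized.values())
    return {h: v / total for h, v in unnormalized.items()}

# Three candidate biases for a coin, with a flat (subjective) prior.
prior = {0.3: 1 / 3, 0.5: 1 / 3, 0.7: 1 / 3}

# Observe a single head: P(head | bias) = bias.
posterior = bayes_update(prior, {h: h for h in prior})
```

After one observed head, the posterior shifts weight toward the higher-bias hypotheses; no finite number of such updates eliminates the dependence on the initial prior, which is Appleby’s point.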

Appleby discusses the propensity interpretation of QM, which treats probability as an objective property of the system being measured. He suspects this idea usually underlies the adoption of a frequentist perspective. Propensity can, however, be made consistent with the Bayesian conception, if one gives up the idea that propensity is a directly observable property.

Appleby next discusses attempts to formulate an objective version of the Bayesian conception. Can the prior probability distribution be objectively grounded? No: at some point you must make an initial assumption that is not empirical. You cannot derive a probability from a non-probabilistic empirical fact.

Now, getting back to what it all means for our worldview: since probabilities are irreducible to objective facts, and quantum mechanics describes reality, does this mean we have to give up the idea that there is an objective real world out there? Is it true, a la most summaries of the Copenhagen interpretation, that QM only describes the content of our knowledge?

In turning to this question in the last section of his paper, Appleby surprised me by bringing up the problem of qualia from the domain of the philosophy of mind. Qualia (arguably) are irreducibly first-person phenomena which do not fit into a mechanistic view of the world. A fully objective realist view of the world has no place for qualia. And yet, Appleby notes, the same could be said of real probability or propensity, since these are irretrievably “contaminated” by subjectivity. For him, this points to the need to give up fully objective realism and to seek a fuller extension or development of a Copenhagen-style interpretation.

Not mentioned in this two-year-old paper is the Relational Interpretation of Quantum Mechanics (RQM), which can be thought of as just such a generalization and extension of Copenhagen. This interpretation seems to fit best with the Bayesian interpretation of probability. For some more recent discussion of RQM, follow some of the links in the physics musings post above and also see the recent posts in this thread at PhysicsForums.


Anonymous said...

Marcus will no doubt be pleased that his paper is generating some interest in cyberspace.

I had some comments (here and here) about the relational interpretation of quantum mechanics too. Generally, I think this interpretation is getting quite an easy ride. It's not clear to me whether it is consistent, or whether it is worked out in enough detail to be applied to all conceivable situations. Some constructive criticism from the philosophy community would be worthwhile I think.

Steve said...

Thanks Matt. Those are good points, and I will look closely at what you posted on the topic and think about where RQM comes up short. Hopefully at some point some philosophers of science will come out with critical evaluations.

joseph f. johnson said...

I read with interest the online encyclopedia article on probability cited here, which you read. I noticed that its bibliography did not reference von Plato's important paper (see the bibliography to my paper), even though it includes his mammoth review book. He told me that he left out his own theory simply so as to have a defense against all his friends' complaining about why he left out *their* theory....

Most of the criticisms of the frequentist or propensity theories (practically the same thing) hinge on a hidden assumption: logical positivism, which has been abandoned by most physicists.

You cannot say that a definition or other logical construction is not objective or not physical simply because it cannot be measured or observed, unless you are committed to logical positivism. But Wittgenstein's definition of truth (an implicit one, of course, since no explicit definition is possible within a strictly logical framework) is not positivist; it is simply that the meaning of a sentence is the state of things which would be the case if the sentence were true. So no simple-minded point about the finiteness of any set concerned is decisive.

Littlewood and Burnside long ago identified a much deeper problem with any frequency theory, but first von Plato, and then I, managed to fix this, at a severe price, of course.

Feynman's insight that probabilities only enter into the question when an amplifying device is employed, since that is the only way to measure a micro-physical phenomenon, turns out to be key; but what neither he nor Dirac realised is that a logically unexceptionable definition of probability, one which answers the problems Littlewood (and later Kolmogoroff) identified, is equally key.

The only events to which are attached probabilities are the events of a measurement apparatus. Since these probabilities are specific to a definite physical set-up, which precludes other measurement apparati, it is automatically non-commutative. Yet, within a fixed experimental set-up, the probabilities are commutative. The same definition of probability that works for classical physics also works for quantum physics.

Furthermore, this is the only direction in which one can hope to develop a relativistic theory of measurement and probability, since attempts based on operators seem to have all failed. In fact, one should not assume a priori that 'probability' is a relativistically invariant concept, since the division between noise and signal *might* not be relativistically invariant, and this approach links probabilities to a kind of 'noise' as envisioned by Wiener long ago.

I hope this helps a little.

Steve Esser said...

Thank you very much for the comments and the references, I will read your paper.

You make a number of interesting points. This seems to be a key insight in getting to the bottom of things:

"The only events to which are attached probabilities are the events of a measurement apparatus. Since these probabilities are specific to a definite physical set-up, which precludes other measurement apparati, it is automatically non-commutative. Yet, within a fixed experimental set-up, the probabilities are commutative."
Your final comments about probability in the context of relativity touch on an area I don't know anything about, but I will do some more reading there, too.