Monday, February 25, 2008

Deconstructing Decoherence

In 2002 Anthony Leggett, whose mainstream credentials can do without my seal of approval, memorably wrote ([1]):

"This argument, with the conclusion that observation of QIMDS [quantum interference between macroscopically distinct states] will be in practice totally impossible, must appear literally thousands of times in the literature of the last few decades on the quantum measurement problem, to the extent that it was at one time the almost universally accepted orthodoxy [apparently it still is].
...
Let us now try to assess the decoherence argument. Actually, the most economical tactic at this point would be to go directly to the results of the next section, namely that it is experimentally refuted! However, it is interesting to spend a moment enquiring why it was reasonable to anticipate this in advance of the actual experiments. In fact, the argument contains several major loopholes ...".

The brackets are mine. Since 1999 I have been shouting from the rooftops that decoherence theory is essentially flawed ([2], [3]). While it's pleasant to get some company, decoherence still rules the mainstream. It's interesting that Leggett graduated in Classics before he turned to physics. He argues ([4]) that his classical education shaped the way he later looked at physics, enabling him to identify and evaluate his assumptions in a way that most other physicists could not. That's indeed a precious and rare skill when it comes to the semantic problem, which in various forms haunts the current scientific debate. Quite concretely, it is only through a novel semantic framework such as RQM that the spurious non-locality of entanglement has been dissolved.
