Dynamical Solution to the Quantum Measurement Problem, Causality, and Paradoxes of the Quantum Century
- Cite this article as:
- Belavkin, V.P. Open Systems & Information Dynamics (2000) 7: 101. doi:10.1023/A:1009663822827
The history and drama of the development of quantum theory are outlined, starting from the discovery of Planck's constant exactly 100 years ago. It is shown that before the rise of quantum mechanics 75 years ago, quantum theory first appeared in the form of the statistics of quantum thermal noise and of quantum spontaneous jumps, which have never been explained by quantum mechanics. Moreover, the only reasonable probabilistic interpretation of quantum theory, put forward by Max Born, was in fact in irreconcilable contradiction with traditional mechanical reality and causality. This led to numerous quantum paradoxes; some of them, associated with the great founders of quantum theory such as Einstein and Schrödinger, are reconsidered in the paper. The development of quantum measurement theory, initiated by von Neumann, indicated a possibility for resolving this interpretational crisis by divorcing the algebra of dynamical generators from a subalgebra of the actual observables. It is shown that within this approach quantum causality can be rehabilitated in the form of a superselection rule for the compatibility of past observables with the potential future. This rule, together with the self-compatibility of measurements ensuring the consistency of histories, is called the nondemolition principle. The application of these rules in the form of dynamical commutation relations leads to the derivation of the von Neumann projection postulate, as well as to more general reductions: instantaneous, spontaneous, and even continuous in time. This yields a quantum probabilistic solution, in the form of dynamical filtering equations, to the notorious measurement problem which was tackled unsuccessfully by many famous physicists, starting with Schrödinger and Bohr.
The simplest Markovian quantum stochastic model for time-continuous measurements involves a boundary-value problem in second quantization for input "offer" waves in one extra dimension, and a reduction of the algebra of "actual" observables to an Abelian subalgebra for the output waves.
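As an illustration of the kind of dynamical filtering equation the abstract refers to, one may sketch the diffusive quantum filtering (Belavkin) equation in its now-standard form; the symbols here are generic placeholders (a system Hamiltonian H, a coupling operator L, and an innovation process W_t driven by the measurement record Y_t), not notation taken from the paper itself:

```latex
% Diffusive quantum filtering equation for the conditioned state \rho_t:
% deterministic Lindblad drift plus a stochastic innovation term.
\mathrm{d}\rho_t =
  \Bigl( -i[H,\rho_t]
         + L\rho_t L^{\dagger}
         - \tfrac{1}{2}\{L^{\dagger}L,\rho_t\} \Bigr)\,\mathrm{d}t
  + \Bigl( L\rho_t + \rho_t L^{\dagger}
         - \operatorname{Tr}\!\bigl[(L+L^{\dagger})\rho_t\bigr]\,\rho_t \Bigr)\,
    \mathrm{d}W_t ,
\qquad
\mathrm{d}W_t = \mathrm{d}Y_t
  - \operatorname{Tr}\!\bigl[(L+L^{\dagger})\rho_t\bigr]\,\mathrm{d}t .
```

Averaging over the measurement record removes the innovation term and recovers the unconditional Lindblad master equation, while conditioning on a sharp record reproduces projection-like reductions continuously in time.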