Luigi Accardi (Italy), Brian Davies (UK), David Elworthy (UK), Franco Fagnola (Italy), Karl Haubold (Germany), Robin Hudson (UK), John Lewis (Ireland), Nobuaki Obata (Japan), Rolando Rebolledo (Chile), Michael Schurmann (Germany), Geoffrey Sewell (UK), Ray Streater (UK), Aubrey Truman (UK), Wilhelm von Waldenfels (Germany), Victor Zadkov (Russia)
If anyone says he can think about quantum problems without getting giddy, that only shows he has not understood the first thing about them - Max Planck.
Quantum Theory is the greatest intellectual achievement of the past century. Since the discovery of quanta by Max Planck exactly 100 years ago, on the basis of the spectral analysis of quantum thermal noise, it has produced numerous paradoxes and confusions even in the greatest scientific minds, such as Einstein, De Broglie, Schroedinger and Bell, and it still confuses many contemporary philosophers and scientists. The rapid development of the beautiful and sophisticated mathematics of quantum mechanics, and of its interpretation by Bohr, Born, Heisenberg, Dirac and many others who abandoned traditional causality, was of little help in resolving these paradoxes, despite the astonishing success in the prediction of quantum phenomena. Both the implications and the consequences of the quantum theory of light and matter, as well as its profound mathematical, conceptual and philosophical foundations, are not yet completely understood. In order to appreciate the quantum drama which has been developing throughout the whole century, and to estimate its possible consequences in the new quantum technological age, it seems useful to give a brief account of the discovery of quantum theory at the beginning of the century.
In mathematics you don't understand the things. You just get used to them - John von Neumann.
In 1932 von Neumann put quantum theory on a firm theoretical basis by laying the mathematical foundation for a new, quantum, probability theory: the quantitative theory for counting the noncommuting events of quantum logic. This noncommutative probability theory rests on essentially more general axioms than the classical (Kolmogorovian) probability of commuting events, which form the Boolean logic of common sense first formalized by Aristotle. It has been under extensive development during the last 30 years, since the introduction of the algebraic and operational approaches to the treatment of noncommutative probabilities, and it currently serves as the mathematical basis for quantum information and measurement theory. In this framework, new mathematical methods of stochastic analysis and calculus for complex physical processes, in both classical and quantum open dynamical systems, have been developed at the Nottingham Centre of Quantum Probability.
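The contrast between the two axiomatics can be sketched in a few standard formulas (an illustration in the usual von Neumann notation, not spelled out in the text above):

```latex
% Classical (Kolmogorov): events are subsets of a sample space and
% always commute under intersection,
P(A \cap B) = P(B \cap A).
% Quantum (von Neumann): events are orthogonal projections $E$ on a
% Hilbert space, a state is a density operator $\rho$, and
\Pr(E) = \operatorname{tr}(\rho E), \qquad \rho \ge 0, \quad \operatorname{tr}\rho = 1.
% Two quantum events are jointly decidable only if they commute;
% in general
[E, F] = EF - FE \neq 0,
% and for noncommuting projections the Boolean distributive law
E \wedge (F \vee G) = (E \wedge F) \vee (E \wedge G)
% fails, which is why quantum events form a non-Boolean logic.
```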
How wonderful that we have met with a paradox. Now we have some hope of making progress - Niels Bohr.
The quantum probability approach resolves the famous paradoxes of quantum measurement theory in a constructive way, by giving exact nontrivial models for the statistical analysis of the quantum observation processes underlying these paradoxes. Conceptually it is based on a new idea of quantum causality, called the Nondemolition Principle, which divides the world into the classical past, forming the consistent histories, and the quantum future, whose state is predictable for each such history. The differential analysis of these models is based on the quantum stochastic calculus created by Robin Hudson. The most recent mathematical development of these methods leads to a profound quantum filtering and control theory for quantum open systems, which has found numerous applications in quantum statistics, optics and spectroscopy, and is an appropriate tool for the solution of the so-called decoherence problem in quantum communications and computations.
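As a rough illustration, in the standard notation of quantum filtering theory (assumed here, not given in the text above), the Nondemolition Principle can be stated as a commutativity condition between the measurement record and the future of the system:

```latex
% Nondemolition (quantum causality) condition: every already-measured
% output observable $Y(s)$ must be compatible with every system
% observable $X(t)$ at any later time,
[X(t), Y(s)] = 0 \quad \text{for all } s \le t,
% and the outputs must be compatible among themselves,
[Y(t), Y(s)] = 0 \quad \text{for all } s, t.
% Hence the past record $\{Y(s) : s \le t\}$ generates a commutative
% (classical) algebra, on which conditioning is well defined, and the
% conditional expectation
\hat{X}(t) = \mathbb{E}\bigl[\, X(t) \mid Y(s),\ s \le t \,\bigr]
% is the quantum filter -- the quantum analogue of classical
% nonlinear filtering, with no "demolition" of the future state.
```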
A series of fundamental problems will be discussed during the lecture, among them: