Quantum foundations


Quantum foundations is a discipline of science that seeks to understand the most counter-intuitive aspects of quantum theory, to reformulate it, and even to propose new generalizations thereof. Unlike other physical theories, such as general relativity, the defining axioms of quantum theory are quite ad hoc, with no obvious physical intuition behind them. While they lead to the right experimental predictions, they do not come with a mental picture of the world in which they fit.
There exist different approaches to resolve this conceptual gap: identifying and studying the non-classical features of quantum theory, deriving its formalism from physically compelling axioms, interpreting its mathematical objects, and extending or modifying the theory itself. Research in quantum foundations is structured along these lines.

Non-classical features of quantum theory

Quantum nonlocality

Two or more separate parties conducting measurements on a shared quantum state can observe correlations which cannot be explained by any local hidden variable theory. Whether this should be regarded as proving that the physical world itself is "nonlocal" is a topic of debate, but the terminology of "quantum nonlocality" is commonplace. Nonlocality research efforts in quantum foundations focus on determining the exact limits that classical or quantum physics imposes on the correlations observed in a Bell experiment or in more complex causal scenarios. This research program has so far provided a generalization of Bell’s theorem that allows falsifying all classical theories with a superluminal, yet finite, hidden influence.
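
To make the gap concrete, here is a minimal sketch in Python (not tied to any particular source; the measurement choices are the standard Tsirelson-optimal ones) that brute-forces the local hidden variable bound of the CHSH expression and evaluates its quantum value on a maximally entangled state:

```python
import itertools
import numpy as np

# Pauli matrices and the maximally entangled state (|00> + |11>)/sqrt(2).
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
phi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

# Classical bound: maximize a0*b0 + a0*b1 + a1*b0 - a1*b1 over all
# deterministic +/-1 assignments (local hidden variable strategies).
classical = max(
    a0*b0 + a0*b1 + a1*b0 - a1*b1
    for a0, a1, b0, b1 in itertools.product([1, -1], repeat=4)
)

# Quantum value: Tsirelson-optimal observables A0 = Z, A1 = X,
# B0 = (Z+X)/sqrt(2), B1 = (Z-X)/sqrt(2).
A = [Z, X]
B = [(Z + X) / np.sqrt(2), (Z - X) / np.sqrt(2)]
E = lambda a, b: np.real(phi.conj() @ np.kron(A[a], B[b]) @ phi)
quantum = E(0, 0) + E(0, 1) + E(1, 0) - E(1, 1)

print(f"classical CHSH bound: {classical}")   # 2
print(f"quantum CHSH value:   {quantum:.4f}") # 2.8284... = 2*sqrt(2)
```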

Quantum contextuality

Nonlocality can be understood as an instance of quantum contextuality. A situation is contextual when the value of an observable depends on the context in which it is measured. The original definition of measurement contextuality can be extended to state preparations and even general physical transformations.
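
A standard state-independent illustration is the Peres–Mermin magic square. The sketch below, assuming nothing beyond NumPy and the usual choice of two-qubit Pauli observables, verifies the six operator product constraints and then checks by brute force that no context-independent ±1 assignment can satisfy all of them:

```python
import itertools
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
kron = np.kron

# The Peres-Mermin square: each entry is a two-qubit observable; the
# observables within each row and each column mutually commute.
square = [
    [kron(X, I2), kron(I2, X), kron(X, X)],
    [kron(I2, Z), kron(Z, I2), kron(Z, Z)],
    [kron(X, Z),  kron(Z, X),  kron(Y, Y)],
]

# Each row multiplies to +I; the first two columns multiply to +I,
# the last column to -I.
I4 = np.eye(4)
for r in range(3):
    assert np.allclose(square[r][0] @ square[r][1] @ square[r][2], I4)
for c in range(2):
    assert np.allclose(square[0][c] @ square[1][c] @ square[2][c], I4)
assert np.allclose(square[0][2] @ square[1][2] @ square[2][2], -I4)

# No noncontextual +/-1 assignment can reproduce all six constraints.
found = False
for v in itertools.product([1, -1], repeat=9):
    v = np.array(v).reshape(3, 3)
    rows_ok = all(v[r, :].prod() == 1 for r in range(3))
    cols_ok = v[:, 0].prod() == 1 and v[:, 1].prod() == 1 and v[:, 2].prod() == -1
    if rows_ok and cols_ok:
        found = True
print("noncontextual assignment exists:", found)  # False
```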

Epistemic models for the quantum wave-function

A physical property is epistemic when it represents our knowledge or beliefs about the value of a second, more fundamental feature. The probability that an event occurs is an example of an epistemic property. In contrast, a non-epistemic or ontic variable captures the notion of a “real” property of the system under consideration.
There is an ongoing debate on whether the wave-function represents the epistemic state of a yet-to-be-discovered ontic variable or whether, on the contrary, it is a fundamental entity. Under some physical assumptions, the Pusey–Barrett–Rudolph (PBR) theorem demonstrates the inconsistency of quantum states as epistemic states, in the sense above. Note that, in QBism and Copenhagen-type views, quantum states are still regarded as epistemic, not with respect to some ontic variable, but with respect to one’s expectations about future experimental outcomes. The PBR theorem does not exclude such epistemic views on quantum states.
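
For the simplest pair of states |0⟩ and |+⟩, the PBR argument relies on a four-outcome entangled measurement on two independent preparations, where each outcome is forbidden for one of the four product states. The following sketch verifies these properties numerically (the basis is the standard one from the PBR paper; the contradiction argument is summarized in the comments):

```python
import numpy as np

s0 = np.array([1, 0], dtype=complex)          # |0>
s1 = np.array([0, 1], dtype=complex)          # |1>
sp = (s0 + s1) / np.sqrt(2)                   # |+>
sm = (s0 - s1) / np.sqrt(2)                   # |->
kron = np.kron

# The four PBR measurement vectors for the state pair {|0>, |+>}.
phi = [
    (kron(s0, s1) + kron(s1, s0)) / np.sqrt(2),
    (kron(s0, sm) + kron(s1, sp)) / np.sqrt(2),
    (kron(sp, s1) + kron(sm, s0)) / np.sqrt(2),
    (kron(sp, sm) + kron(sm, sp)) / np.sqrt(2),
]

# They form an orthonormal basis ...
G = np.array([[phi[i].conj() @ phi[j] for j in range(4)] for i in range(4)])
assert np.allclose(G, np.eye(4))

# ... and outcome k never occurs on the k-th product preparation. If |0>
# and |+> shared an ontic state lambda with nonzero probability, every
# outcome would be forbidden for some run, yet one must occur: contradiction.
preps = [kron(s0, s0), kron(s0, sp), kron(sp, s0), kron(sp, sp)]
for k in range(4):
    assert abs(phi[k].conj() @ preps[k]) < 1e-12
print("PBR zero-overlap conditions verified")
```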

Axiomatic reconstructions

Some of the counter-intuitive aspects of quantum theory, as well as the difficulty of extending it, follow from the fact that its defining axioms lack a physical motivation. An active area of research in quantum foundations is therefore to find alternative formulations of quantum theory which rely on physically compelling principles. These efforts come in two flavors, depending on the desired level of description of the theory: the so-called generalized probabilistic theories (GPTs) approach and the black-box approach.

The framework of Generalized Probabilistic Theories

Generalized Probabilistic Theories are a general framework to describe the operational features of arbitrary physical theories. Essentially, they provide a statistical description of any experiment combining state preparations, transformations and measurements. The framework of GPTs can accommodate classical and quantum physics, as well as hypothetical non-quantum physical theories which nonetheless possess quantum theory’s most remarkable features, such as entanglement or teleportation. Notably, a small set of physically motivated axioms is enough to single out the GPT representation of quantum theory.
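
As a minimal sketch of the GPT language, assuming one common vector parametrization (the specific coordinates below are an illustrative convention, not forced by the framework), states are vectors, effects are linear functionals, and probabilities arise from their pairing:

```python
import numpy as np

# GPT recipe: the probability of effect e on state s is the pairing <e, s>.
prob = lambda e, s: float(np.dot(e, s))

# Classical bit: states (1, p) with p in [0, 1], a 1-simplex.
# The effect "outcome 1" reads off p; the unit effect always returns 1.
s_cbit = np.array([1.0, 0.3])      # the bit is '1' with probability 0.3
e_one = np.array([0.0, 1.0])
e_unit = np.array([1.0, 0.0])

# Qubit: states (1, x, y, z) with x^2 + y^2 + z^2 <= 1, the Bloch ball.
# The effect for outcome +1 of a spin measurement along unit vector n
# is (1, n_x, n_y, n_z) / 2.
s_qubit = np.array([1.0, 0.0, 0.0, 1.0])   # |0>, the north pole
n = np.array([1.0, 0.0, 0.0])              # measure along x
e_plus = np.concatenate(([1.0], n)) / 2

print(prob(e_one, s_cbit))    # 0.3
print(prob(e_unit, s_cbit))   # 1.0
print(prob(e_plus, s_qubit))  # 0.5, i.e. |<+|0>|^2
```
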
L. Hardy introduced the concept of GPT in 2001, in an attempt to re-derive quantum theory from basic physical principles. Although Hardy’s work was very influential, one of his axioms was regarded as unsatisfactory: it stipulated that, of all the physical theories compatible with the rest of the axioms, one should choose the simplest one. The work of Dakic and Brukner eliminated this “axiom of simplicity” and provided a reconstruction of quantum theory based on three physical principles. This was followed by the more rigorous reconstruction of Masanes and Müller.
Axioms common to these three reconstructions are: the subspace axiom (systems that can store the same amount of information are physically equivalent), local tomography (the state of a composite system is fully characterized by the statistics of measurements on its parts), and reversibility (for any two pure states, there exists a reversible physical transformation mapping one into the other).
An alternative GPT reconstruction, proposed by Chiribella et al. around the same time, is also based on the purification axiom: the requirement that every mixed state of a system arise as the marginal of a pure state of a larger composite system. The use of purification to characterize quantum theory has been criticized on the grounds that it also applies in the Spekkens toy model.
Against the success of the GPT approach, it can be objected that all such works only recover finite-dimensional quantum theory. In addition, none of the previous axioms can be experimentally falsified unless the measurement apparatuses are assumed to be tomographically complete.

The framework of black boxes

In the black box or device-independent framework, an experiment is regarded as a black box where the experimentalist introduces an input and obtains an output. Experiments conducted by two or more parties in separate labs are hence described by their statistical correlations alone.
From Bell's theorem, we know that classical and quantum physics predict different sets of allowed correlations. It is expected, therefore, that far-from-quantum physical theories should predict correlations beyond the quantum set. In fact, there exist instances of theoretical non-quantum correlations which, a priori, do not seem physically implausible. The aim of device-independent reconstructions is to show that all such supra-quantum examples are precluded by a reasonable physical principle.
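
The best-known such instance is the Popescu–Rohrlich (PR) box, which respects no-signalling yet attains the algebraic maximum of the CHSH expression. A short sketch makes this explicit:

```python
import itertools

# PR box: binary inputs x, y and outputs a, b, with
# P(a, b | x, y) = 1/2 if a XOR b == x AND y, and 0 otherwise.
def pr_box(a, b, x, y):
    return 0.5 if (a ^ b) == (x & y) else 0.0

# Correlators E(x, y) = sum_{a,b} (-1)^(a+b) * P(a, b | x, y).
def E(x, y):
    return sum((-1) ** (a + b) * pr_box(a, b, x, y)
               for a, b in itertools.product([0, 1], repeat=2))

chsh = E(0, 0) + E(0, 1) + E(1, 0) - E(1, 1)
print(chsh)  # 4.0, beyond the quantum bound 2*sqrt(2) ~ 2.83

# No-signalling: Alice's marginal is independent of Bob's input y.
for a, x in itertools.product([0, 1], repeat=2):
    marg = [sum(pr_box(a, b, x, y) for b in (0, 1)) for y in (0, 1)]
    assert marg[0] == marg[1] == 0.5
print("no-signalling marginals verified")
```
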
The physical principles proposed so far include no-signalling, non-trivial communication complexity, no-advantage for nonlocal computation, information causality, macroscopic locality, and local orthogonality. All these principles limit the set of possible correlations in non-trivial ways. Moreover, they are all device-independent: this means that they can be falsified under the assumption that we can decide whether two or more events are space-like separated. The drawback of the device-independent approach is that, even taken together, all the aforementioned physical principles do not suffice to single out the set of quantum correlations. In other words, all such reconstructions are partial.

Interpretations of quantum theory

An interpretation of quantum theory is a correspondence between the elements of its mathematical formalism and physical phenomena. For instance, in the pilot wave theory, the quantum wave function is interpreted as a field that guides the particle trajectory and evolves with it via a system of coupled differential equations. Most interpretations of quantum theory stem from the desire to solve the quantum measurement problem.
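
As a minimal sketch of how the guidance works, assuming a free Gaussian packet with ħ = m = 1 (the step sizes and initial position are arbitrary illustrative choices), one can integrate the guidance equation dx/dt = Im((∂ψ/∂x)/ψ) and compare with the exact trajectory known for this special case:

```python
import numpy as np

# Free Gaussian packet (hbar = m = 1): psi(x, t) ~ exp(-x^2 / (4 sigma0 s_t))
# with s_t = sigma0 * (1 + i t / (2 sigma0^2)); normalization cancels in v.
sigma0 = 1.0
s = lambda t: sigma0 * (1 + 1j * t / (2 * sigma0 ** 2))

# Bohmian guidance equation: dx/dt = Im( (d psi / dx) / psi ).
# For this packet, (d psi / dx) / psi = -x / (2 sigma0 s_t).
def velocity(x, t):
    return np.imag(-x / (2 * sigma0 * s(t)))

# Euler-integrate one trajectory and compare with the exact result
# x(t) = x0 * sqrt(1 + (t / (2 sigma0^2))^2) for this special case.
x, t, dt = 1.0, 0.0, 1e-4
while t < 2.0:
    x += velocity(x, t) * dt
    t += dt
exact = 1.0 * np.sqrt(1 + (t / (2 * sigma0 ** 2)) ** 2)
print(f"numerical: {x:.4f}, exact: {exact:.4f}")  # the trajectory spreads outward
```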

Extensions of quantum theory

In an attempt to reconcile quantum and classical physics, or to identify non-classical models with a dynamical causal structure, some modifications of quantum theory have been proposed.

Collapse models

Collapse models posit the existence of natural processes that periodically localize the wave-function. Such theories provide an explanation for the nonexistence of superpositions of macroscopic objects, at the cost of abandoning unitarity and exact energy conservation.
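
A minimal sketch in the spirit of the Ghirardi–Rimini–Weber model (reduced here to a single localization event; the width, grid, and initial state are arbitrary illustrative choices) shows how a random Gaussian "hit" destroys a spatial superposition:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-10, 10, 2001)
dx = x[1] - x[0]

# Initial wave-function: a superposition of two well-separated packets.
psi = np.exp(-(x - 4) ** 2) + np.exp(-(x + 4) ** 2)
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * dx)

# One GRW-style hit: the localization operator L_a multiplies psi by a
# Gaussian of width r_c centered at a; the center a is sampled with
# probability density proportional to ||L_a psi||^2, after which
# psi -> L_a psi / ||L_a psi||.
r_c = 1.0
L = lambda a: np.exp(-(x - a) ** 2 / (4 * r_c ** 2))
weights = np.array([np.sum((L(a) * np.abs(psi)) ** 2) * dx for a in x])
a_hit = rng.choice(x, p=weights / weights.sum())
psi = L(a_hit) * psi
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * dx)

# The superposition has collapsed onto one of the two packets.
left = np.sum(np.abs(psi[x < 0]) ** 2) * dx
print(f"hit at {a_hit:.2f}; weight on x<0: {left:.3f}, on x>0: {1 - left:.3f}")
```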

Quantum Measure Theory

In Sorkin's quantum measure theory (QMT), physical systems are not modeled via unitary rays and Hermitian operators, but through a single matrix-like object, the decoherence functional. The entries of the decoherence functional determine the feasibility of experimentally discriminating between two or more different sets of classical histories, as well as the probabilities of each experimental outcome. In some models of QMT the decoherence functional is further constrained to be positive semidefinite (strong positivity). Even under the assumption of strong positivity, there exist models of QMT which generate stronger-than-quantum Bell correlations.
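
As a sketch of the object involved, for a closed quantum system the decoherence functional can be written as D(α, β) = Tr[C_α ρ C_β†], with C_α the chain of projectors along history α. The toy example below (an arbitrary two-time qubit scenario, not taken from Sorkin's papers) computes it and checks strong positivity:

```python
import itertools
import numpy as np

# Two-time qubit example: computational-basis "events" separated by a Hadamard.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
P = [np.diag([1.0, 0]).astype(complex), np.diag([0, 1.0]).astype(complex)]
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())                  # initial state |+><+|

# Class operator of history alpha = (a1, a2): C_alpha = P[a2] @ H @ P[a1].
C = {a: P[a[1]] @ H @ P[a[0]] for a in itertools.product([0, 1], repeat=2)}

# Decoherence functional: D(alpha, beta) = Tr[ C_alpha rho C_beta^dagger ].
hist = list(C)
D = np.array([[np.trace(C[a] @ rho @ C[b].conj().T) for b in hist] for a in hist])

# Diagonal entries are the candidate probabilities of single histories;
# nonzero off-diagonal entries signal interference between pairs of histories.
print(np.round(D.real, 3))
assert np.isclose(np.trace(D).real, 1.0)         # diagonal sums to 1
assert np.all(np.linalg.eigvalsh(D) >= -1e-12)   # strongly positive here
```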

Acausal quantum processes

The formalism of process matrices starts from the observation that, given the structure of quantum states, the set of feasible quantum operations follows from positivity considerations. Namely, for any linear map from states to probabilities one can find a physical system where this map corresponds to a physical measurement. Likewise, any linear transformation that maps composite states to states corresponds to a valid operation in some physical system. In view of this trend, it is reasonable to postulate that any higher-order map from quantum instruments to probabilities should also be physically realizable. Any such map is termed a process matrix. As shown by Oreshkov et al., some process matrices describe situations where the notion of global causality breaks down.
The starting point of this claim is the following thought experiment: two parties, Alice and Bob, enter a building and end up in separate rooms. The rooms have ingoing and outgoing channels through which a quantum system periodically enters and leaves. While those systems are inside the rooms, Alice and Bob can interact with them in any way; in particular, they can measure some of their properties.
Since Alice and Bob’s interactions can be modeled by quantum instruments, the statistics they observe when they apply one instrument or another are given by a process matrix. As it turns out, there exist process matrices which would guarantee that the measurement statistics collected by Alice and Bob are incompatible with Alice interacting with her system before, after, or at the same time as Bob, or with any convex combination of these three situations. Such processes are called acausal.
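
To make the generalized Born rule P(a, b) = Tr[W(M_a ⊗ M_b)] concrete, the sketch below builds the process matrix of a causally ordered scenario, where Alice's output is piped to Bob's input through an identity channel, and checks that the rule reproduces sequential statistics (the Choi conventions are one common choice, and the states and instruments are arbitrary illustrative picks). Acausal processes, such as the one exhibited by Oreshkov et al., live in the same formalism but admit no such causally ordered decomposition:

```python
import numpy as np

kron = np.kron
e = np.eye(2, dtype=complex)
proj = lambda v: np.outer(v, v.conj())

# Process matrix on qubits ordered A_I (x) A_O (x) B_I (x) B_O:
# Alice receives rho, her output reaches Bob through an identity channel
# (Choi operator C = |phi+><phi+|, unnormalized), Bob's output is discarded.
rho = proj(np.array([np.sqrt(0.8), np.sqrt(0.2)], dtype=complex))
C = proj(np.array([1, 0, 0, 1], dtype=complex))      # |00> + |11>
W = kron(kron(rho, C), e)
assert np.isclose(np.trace(W).real, 4)               # Tr W = d_AO * d_BO

# Instruments in (transposed) Choi form: Alice measures in the computational
# basis and re-prepares her outcome; Bob measures and discards his system.
M_A = lambda a: kron(proj(e[:, a]), proj(e[:, a]))   # on A_I (x) A_O
M_B = lambda b: kron(proj(e[:, b]), e / 2)           # on B_I (x) B_O

# Generalized Born rule: P(a, b) = Tr[ W (M_a (x) M_b) ].
P = np.array([[np.trace(W @ kron(M_A(a), M_B(b))).real
               for b in (0, 1)] for a in (0, 1)])
print(np.round(P, 3))  # [[0.8, 0], [0, 0.2]]: Bob sees exactly Alice's outcome
assert np.isclose(P.sum(), 1)
```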