The theme of this special issue of Foundations of Physics is the interplay between, and mutual enrichment of, quantum information theory and quantum foundations. Information-theoretic concepts have always had a place (sometimes beneath the surface) in discussions of the foundations of quantum mechanics. This is hardly surprising, since quantum mechanics, as usually presented, is a probabilistic theory. And, indeed, pioneering work by Holevo [9] and others in the 1970s had already begun the development of a full-fledged quantum information theory.

However, with the emergence of quantum information theory as a distinct discipline in the 1990s, the richness of this point of view became clearer. In part, this is due to the realization that, in connection with concrete information-processing tasks, one usually need consider only finitely many degrees of freedom—equivalently, a finite-dimensional subspace of the full Hilbert space of the quantum systems involved.

Another, more important, development was the realization that properties of entangled states of composite quantum systems could be regarded, not as anomalies, but as resources that could be systematically exploited. Since issues involving entangled states had always been central to discussions of quantum foundations, it was natural for researchers in QIT to take an interest in the older foundational literature and to engage with the problems it dealt with. Two excellent examples of this are the efforts to understand the information-theoretic meaning of the Tsirel’son bound on the strength of nonlocal correlations in quantum theory [10], and efforts to reconstruct finite-dimensional QM from largely information-theoretic postulates, beginning with the celebrated paper of Lucien Hardy [8].

Now, two decades on, information-theoretically inflected quantum foundations, or foundationally inflected quantum information theory, is emerging as its own area of research, distinct both from mainstream quantum information theory, from which it draws some techniques and a certain orientation, and from traditional quantum foundations, with which it shares core problems, but from which it differs in its focus on finite-dimensional systems, entangled states, and the idea of physical systems as information-carriers. The seven papers in this volume constitute a sampler of some (though by no means all) of the lines of current research that define this new area.

Two of these papers focus on the broad question of how to measure the degree to which bipartite or multi-partite quantum states are entangled—for instance, by imposing a suitable metric on the space of quantum states, and then identifying the given state’s distance from the set \({\mathcal {S}}\) of separable states, or from the set \({\mathcal {K}}\) of classical states. A related question is, roughly speaking, how much of the quantum state space is taken up by classical or separable states. That is, are genuinely entangled states rare, or commonplace, among bipartite—and multipartite—quantum states? In their contribution, Li and Winter address the first question, discussing a particular measure of entanglement known as squashed entanglement (defined in terms of quantum conditional mutual information), in terms of which, among other things, they obtain a bound on the trace distance of a state from the set of separable states. The second question is the focus of the survey paper by Palazuelos. A natural approach here is to ask for the probability that a “random” bipartite state will belong to the classical (or the separable) set; however, defining a suitable notion of randomness in this setting is challenging, and work in this area involves relatively sophisticated mathematical tools (e.g., the Grothendieck inequality, and various extensions thereof to multilinear forms). Palazuelos undertakes the commensurately challenging task of making this area accessible to a non-specialist audience.
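For orientation, we recall the standard definition of this quantity (our summary, not a statement taken from the paper itself): the squashed entanglement of a bipartite state \(\rho_{AB}\) is half the quantum conditional mutual information, minimized over all extensions of the state to an auxiliary system \(E\),

$$E_{\mathrm{sq}}(\rho_{AB}) \;=\; \tfrac{1}{2}\,\inf_{\rho_{ABE}} I(A;B\,|\,E)_{\rho}, \qquad I(A;B\,|\,E)_{\rho} \;=\; S(\rho_{AE}) + S(\rho_{BE}) - S(\rho_{ABE}) - S(\rho_{E}),$$

where the infimum runs over all states \(\rho_{ABE}\) with \(\mathrm{Tr}_{E}\,\rho_{ABE} = \rho_{AB}\), and \(S\) denotes the von Neumann entropy. The quantity vanishes on separable states, so a quantitative bound relating it to the trace distance from \({\mathcal {S}}\) is a natural thing to seek.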

Quantum information theory accepts at face value the reading of quantum mechanics as, simply, a probability calculus: states are density operators, understood as assigning probabilities to effects via the Born rule. This basic picture, of states assigning probabilities to the outcomes of possible—and possibly incompatible—measurements, is easily abstracted to frame a much more general class of probabilistic theories, within which QM can be located as just one example, albeit the one Nature seems to favor. In this approach, one usually assumes only that the state-space of a physical system is a convex set, usually compact, and that measurement-outcomes are represented by so-called effects, i.e., affine functionals from the state space to the unit interval. (In quantum theory, the state space is the convex set of density operators on a Hilbert space, and the effects are easily shown to correspond, via trace duality, to positive operators on the same Hilbert space, dominated by the identity operator—that is, effects in the usual sense.) Many “distinctively quantum” phenomena, including the existence and basic properties of entangled states, versions of the no-cloning and no-broadcasting theorems, teleportation protocols, and more, arise quite generally in probabilistic theories of this kind; see [2] for a recent survey.
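To make this slightly more concrete (a minimal sketch, in notation of our own choosing): if \(\Omega\) denotes the state space, an effect is an affine map

$$e : \Omega \to [0,1], \qquad e\big(t\,\omega_1 + (1-t)\,\omega_2\big) \;=\; t\,e(\omega_1) + (1-t)\,e(\omega_2) \quad (0 \le t \le 1),$$

so that outcome probabilities respect mixing of states. In the quantum case, where \(\Omega\) is the set of density operators \(\rho\) on a Hilbert space, every such functional takes the form

$$e(\rho) \;=\; \mathrm{Tr}(E\rho) \quad \text{for some operator } E \text{ with } 0 \le E \le \mathbb{1},$$

which is just the Born rule for the outcome associated with the effect operator \(E\).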

Four of the papers in this Issue deal with such “generalized probabilistic theories” (GPTs), reflecting their growing importance in both quantum information theory and quantum foundations. A natural question to ask about a particular (generalized) probabilistic model is the extent to which it can be simulated by a quantum-mechanical one. Another question is how one can combine such models to form models of composite systems, in a way respecting the no-signaling requirement of special relativity. In their paper in this volume, Sainz and Wolfe connect these questions, showing that a well-studied class of quantum-mechanically simulatable states on a composite, the so-called \(Q_1\) states, can depend significantly on which of several plausible composition rules one uses. They also show, however, that this dependence disappears when one requires the composite to respect, not only the no-signaling constraint, but also the so-called “local orthogonality” principle. The papers by Garner and by Barnum, Lee and Selby deal with computation in GPTs, in both cases linking the computational power of such a theory to the degree to which the theory in question supports interference (suitably defined). In particular, Barnum, Lee and Selby link the number of queries to an oracle needed to solve certain learning problems to the degree of interference (in the sense of [12]) the theory exhibits. Garner, meanwhile, establishes that computations involving interference between the arms of an interferometer can be replicated in certain theories other than standard QM, including quaternionic QM and Spekkens’ well-known “toy theory” [13]. The paper of Branford, Dahlsten and Garner concerns dynamics in GPTs. In particular, they address the question of how one ought to define a “Hamiltonian” in such a setting, aiming for a definition that applies as broadly as possible. Among other things, they show that Hamiltonians satisfying their desiderata are always available for systems having 3-dimensional state spaces.
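For concreteness, the no-signaling requirement mentioned above can be stated in a standard form (our notation, not the paper’s): for a bipartite model with joint outcome probabilities \(p(a,b\,|\,x,y)\), where \(x\) and \(y\) label the local measurement choices, each party’s marginal statistics must be independent of the other party’s choice,

$$\sum_{b} p(a,b\,|\,x,y) \;=\; \sum_{b} p(a,b\,|\,x,y') \quad \text{and} \quad \sum_{a} p(a,b\,|\,x,y) \;=\; \sum_{a} p(a,b\,|\,x',y)$$

for all outcomes \(a, b\) and all choices \(x, x', y, y'\). Any acceptable composition rule must yield only joint states whose statistics satisfy these conditions; the local orthogonality principle imposes further constraints on top of them.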

In a certain sense, questions about the interpretation of quantum theory are dual to those about the foundations of QM. Historically, interpretational issues take the formal framework of quantum mechanics as given, and ask how we can sensibly read this as telling us a story about physical reality. The development of the information-theoretic view of quantum foundations has had an impact on this discussion. A distinction is often made between those interpretations of QM that treat the quantum state as an objective, observer-independent feature of reality, and those that regard it as encoding the expectations of (possible) agents about their own probable future experiences (say, in the laboratory). In his paper for this Issue, Richard Healey defends and further explicates his pragmatist interpretation, which, he argues, does not fit comfortably on either side of this distinction. In Healey’s view, quantum states provide objectively good advice—that is, the most reliable information available—about what agents (even merely hypothetical ones) ought to expect from their future encounters with quantum systems. Healey illustrates this with a careful analysis of quantum teleportation protocols.

While the papers in this Issue touch on a wide range of issues of current interest, we wish to stress that these do not begin to exhaust the connections between quantum information theory and quantum foundations. Among topics that are not represented here, we might mention the program of reconstructing at least finite-dimensional QM from probabilistic or information-theoretic assumptions (beginning with Hardy’s paper, mentioned above), the broad subject of hidden variables, which has been revived as a useful analytical tool [11], efforts to understand thermodynamics from a GPT perspective (e.g., [3, 6]), the category-theoretic approach to quantum and more general theories [1], and the topic of resource theories [7]. In any case, we hope the papers in this Issue will leave many readers interested enough in this general area to undertake some further exploration.