The year 2019 witnessed the 20th jubilee of the Växjö conference series on quantum foundations and probability in physics, the longest-running series of conferences on the subject in history. Many old and new friendships have been forged at Linnaeus University and the beautiful surrounding lakes of Småland, where the community gathers once a year to renew the debate and report its latest progress. The year 2019 also marked the Porcelain Anniversary (18 years) of the point of view on quantum theory known as QBism. In this regard, the 2001 meeting in the series was pivotal in more ways than one. Not only did it instigate Fuchs’s break [1] from the earlier (Jaynes-style) “objective Bayesianism” and neo-Copenhagen thinking of Caves et al. [2], but it was the first meeting on an international scale to indicate that the quantum information revolution might have something genuine to contribute to quantum foundational thought. That meeting set the pace far beyond Växjö through the participation of a number of luminaries of quantum information: the late Asher Peres, Daniel Greenberger, David Mermin, Herbert Bernstein, Carlton Caves, Rüdiger Schack, Richard Jozsa, Benjamin Schumacher, and John Smolin. One could almost hear a battle cry echo from the woods, “Quantum foundations from quantum information or die!” The foundations debate had turned so stale by then that it would surely have died without an influx of new thinking.

Starting with the conference proceedings from the 2001 meeting [3], with its two programmatic papers [4, 5], the series has never wavered in its support of a vigorous debate on that battle cry (see, e.g., the collections [6,7,8,9,10,11,12,13,14,15]). This year seemed a special opportunity to reassess where we stand, and we thus arrived at the conference title, “Quantum Information Revolution: Impact to Foundations‽” The interrobang ‽ seemed perfect for the boisterous debate we expected! Indeed, one will even see it in these proceedings.

This special issue of Foundations of Physics starts with a line-up of papers from the QBism session. The year 2019, like 2001, was a watershed for QBism, deepening the theory while sharpening its promise for a new physics, even a new understanding of spacetime. The paper of Pienaar, “Extending the Agent in QBism,” addresses a gap in the QBist interpretation of quantum theory concerning its tenet that “the instruments of observation are a prolongation of the sense organs of the observer.” He shows how the gap can be removed by introducing a process of “agent extension,” whereby an instrument is first treated as an external system to be “tuned,” and afterwards may be incorporated as a part of the agent, subject to certain formal conditions. In doing so he also discovers an unexpected connection to the resource theory of quantum measurements. The paper by DeBrota, Fuchs, and Schack, “Respecting One’s Fellow: QBism’s Analysis of Wigner’s Friend,” builds the apparatus for a QBist response to the recent thought experiments of Frauchiger and Renner [16] and Baumann and Brukner [17]. Responding to these thought experiments helped clarify the principled stand QBism must always take: when two agents take actions on each other, each agent has a dual role as a physical system for the other agent, so that no user of quantum theory is more privileged than any other. This in fact leads to a kind of Copernican principle which had been mentioned in previous papers, but had not been shown to be such a powerful concept. The paper by Cavalcanti, “The View from a Wigner Bubble,” then runs with this new Copernican principle to conclude that the outcomes an agent obtains upon making quantum measurements cannot be considered absolute events in a single spacetime. Differing from the usual QBist style of argumentation, his argument is instead based on another recent no-go theorem on the Wigner’s friend paradox [18]. The technique is significant in that one does not have to be a full-blown QBist to end up at one of QBism’s key tenets, namely that quantum measurement outcomes must be understood as personal to the agent performing the measurement. In fact, the argument carries weight for any theory that aims to retain the universal validity of the predictions of quantum theory at the same time as a certain principle of Local Action.

Moving beyond the QBism session, Bengtsson presents a work that is nonetheless extremely important for the QBist project of trying to find an elegant formulation of quantum theory involving only probabilities, instead of state vectors and operators. In his paper “SICs: Some Explanations,” he explains how the (still-open) problem of proving the existence of SIC-POVMs in finite-dimensional Hilbert spaces has led to a recent breakthrough in algebraic number theory. He then sketches a strategy, suggested in return by number theory, for proving existence in an infinite sequence of dimensions. The paper ends by asking whether such sequences can play a role in real meat-and-potatoes physics.
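For the reader’s orientation (this is the standard definition, not a result of Bengtsson’s paper): a SIC in dimension $d$ is a set of $d^2$ unit vectors $|\psi_j\rangle \in \mathbb{C}^d$ with constant pairwise overlaps,
\[
  \bigl|\langle \psi_j | \psi_k \rangle\bigr|^2 = \frac{1}{d+1}, \qquad j \neq k .
\]
The rescaled projectors $E_j = \tfrac{1}{d}\,|\psi_j\rangle\langle\psi_j|$ then form a symmetric informationally complete POVM, so that any quantum state is fully determined by the $d^2$ probabilities it assigns to these outcomes.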

The paper of Khrennikov, “Quantum versus classical entanglement: eliminating the issue of quantum nonlocality,” analyzes the relation between quantum and classical entanglement. The latter notion is widely used in classical optical simulations of various quantum-like features of light. The common view that “quantum nonlocality” is the basic factor differentiating quantum and classical realizations of entanglement is criticized. It is stressed that one can proceed without referring to quantum nonlocality by taking into account that quantum theory is about acts of observation, acts characterized by individuality and discreteness. In discussions of classical entanglement, it is often missed that the main deviation of classical light models from quantum theory lies not only in the states, but also in the description of measurement procedures. Classical and semiclassical descriptions are based on intensities of signals, whereas the quantum description of measurements is based on counting discrete events, clicks of detectors, with the aid of Born’s rule.

Plotnitsky’s article “The Unavoidable Interaction Between the Object and the Measuring Instrument” is in part stimulated by Khrennikov’s critique of the concept of quantum nonlocality as grounding quantum entanglement and the difference between classical and quantum phenomena. In contrast to Khrennikov, this article suggests that the idea of quantum nonlocality be retained. Specifically, it argues that six features of quantum mechanics—(1) the irreducible role of measuring instruments in defining quantum phenomena, (2) discreteness, (3) complementarity, (4) entanglement, (5) quantum nonlocality, and (6) the irreducibly probabilistic nature of quantum predictions—are all interconnected in characterizing quantum phenomena and distinguishing them from classical ones. It is thus difficult to give unconditional priority to any one of them.

Auffeves and Grangier, in their paper “Deriving Born’s rule from an inference to the best explanation,” continue the line of research elaborated in their previous articles presenting a simple set of axioms on “Contexts, Systems and Modalities” (CSM). A modality corresponds to a repeatable measurement result obtained on a given quantum system within a given measurement context. In this framework, the structure of quantum mechanics emerges from the interplay between the quantized number of modalities accessible to a quantum system and the continuum of contexts required to define those modalities. They further discuss the impact of these ideas on quantum foundations by showing how to obtain Born’s rule within this framework.
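For reference, the rule they derive is the textbook Born rule, stated here in its standard form independently of the CSM axioms: if a system is assigned the density operator $\rho$ and a measurement is described by POVM elements $\{E_i\}$, the probability of outcome $i$ is
\[
  p(i) = \operatorname{Tr}(\rho\, E_i),
\]
which reduces to $p(i) = |\langle i|\psi\rangle|^2$ for a pure state $|\psi\rangle$ and a projective measurement in the orthonormal basis $\{|i\rangle\}$.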

Jaeger, in his paper “Exchange forces in particle physics,” emphasizes that the notion that fundamental forces operate through the exchange of intermediary particles has been central both to the intuitive understanding of those forces and to their explanation by practitioners of particle physics for almost a century. However, because the intermediary particles involved are virtual particles, it has been difficult to provide a proper foundation for that conception of force. This article supplies such a foundation by explicating the fundamental forces in relativistic quantum field theory in terms of the standard methodology of particle physics, thereby yielding a full and proper understanding of virtual intermediary particles.

Elze’s article “Are quantum spins but small perturbations of ontological Ising spins?” studies a classical, deterministically evolving chain of Ising spins to illustrate ’t Hooft’s Cellular Automaton Interpretation (CAI) of quantum theory. Following a synopsis of some of CAI’s achievements in demystifying quantum foundations, a discrete classical model of a field theory is described, characterized by a finite signal velocity and a Hamiltonian with long-range interactions. This is achieved by applying techniques from quantum theory, including a new finite Baker–Campbell–Hausdorff formula. The resulting classical Hamiltonian has no free parameters, and arbitrarily small perturbations turn it into a genuine N-qubit operator. The ontological Ising spin system thus appears as an island in parameter space embedded in a large sea of quantum spin models, a picture that supports ’t Hooft’s CAI as reflecting an epistemic approach to handling physical phenomena.

Hofmann’s contribution “Quantum causality relations and the emergence of reality from coherent superpositions” aims to show how the quantum-mechanical relation between initial conditions and final effects via inner products of Hilbert-space vectors approximates the classical continuity of reality. He uses measurement theory to derive an uncertainty limit at which the classical notion of causality associated with the principle of least action breaks down. The results of the analysis indicate that the classical notion of causality mediated by intermediate realities is based on observations made at resolutions too low to identify any microscopic reality. He suggests that our continued failure to understand quantum foundations might be a consequence of the mistaken assumption that causality requires intermediate realities, even if these hypothetical realities cannot have any observable effects.

Finally, D’Ariano’s paper “No purification ontology, no quantum paradoxes” tackles a presupposition of many well-known interpretations of quantum theory (including the many versions of Everettianism and quantum Darwinism): that all mixed states must be understood as the marginals of pure entangled states, and that all transformations of a quantum system are actually achieved by a unitary interaction between the system at hand and an ancillary system. This is a supposition John Smolin once dubbed “the Church of the Larger Hilbert Space” and which the author here calls the “purification ontology.” Interestingly, the paper rigorously demonstrates that the purification ontology cannot be falsified. Thus it may be an extraneous assumption that simply causes trouble, creating paradoxes where none actually exist. The paper also applies this realization to the well-known black-hole information paradox.
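For orientation, the mathematical fact behind this supposition (standard quantum information theory, not D’Ariano’s argument itself) is that every mixed state admits a purification: for any density operator $\rho_S$ there is a pure state $|\Psi\rangle \in \mathcal{H}_S \otimes \mathcal{H}_A$ on a larger Hilbert space with
\[
  \rho_S = \operatorname{Tr}_A |\Psi\rangle\langle\Psi| ,
\]
and, similarly, every quantum channel can be realized as a unitary on system plus ancilla followed by discarding the ancilla (the Stinespring dilation). The interpretational question the paper addresses is whether such dilations must be taken as physically real.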

We hope that the reader will enjoy this special issue, and that it will prove useful to experts across all domains of quantum physics and quantum information theory, from experimenters to theoreticians and philosophers.