We take a fresh look at the logics of informational dependence and independence of Hintikka and Sandu, and of Väänänen, and their compositional semantics due to Hodges. We show how Hodges' semantics can be seen as a special case of a general construction, which provides a context for a useful completeness theorem with respect to a wider class of models. We shed some new light on each aspect of the logic. We show that the natural propositional logic carried by the semantics is the logic of Bunched Implications due to Pym and O'Hearn, which combines intuitionistic and multiplicative connectives. This introduces several new connectives not previously considered in logics of informational dependence, but which we show play a very natural rôle, most notably intuitionistic implication. As regards the quantifiers, we show that their interpretation in the Hodges semantics is forced, in that they are the image under the general construction of the usual Tarski semantics; this implies that they are adjoints to substitution, and hence uniquely determined. As for the dependence predicate, we show that this is definable from a simpler predicate, of constancy or dependence on nothing. This makes essential use of the intuitionistic implication. The Armstrong axioms for functional dependence are then recovered as a standard set of axioms for intuitionistic implication. We also prove a full abstraction result in the style of Hodges, in which the intuitionistic implication plays a very natural rôle.
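The team-semantics notions the abstract refers to can be made concrete in a few lines. Below is a minimal sketch (the function names `depends` and `constant` and the sample team are illustrative, not from the paper): a team is a set of variable assignments, the dependence atom =(xs, y) holds when y is functionally determined by xs across the team, and the constancy atom =(y) is exactly dependence on the empty list of variables.

```python
# A team: a set of assignments of values to variables (illustrative data).
team = [
    {"x": 0, "y": 1, "z": 0},
    {"x": 0, "y": 1, "z": 1},
    {"x": 1, "y": 0, "z": 0},
]

def depends(team, xs, y):
    """Dependence atom =(xs, y): on this team, the values of xs
    functionally determine the value of y."""
    seen = {}
    for s in team:
        key = tuple(s[x] for x in xs)
        if key in seen and seen[key] != s[y]:
            return False  # same xs-values, different y-values
        seen[key] = s[y]
    return True

def constant(team, y):
    """Constancy atom =(y): dependence on nothing, i.e. y takes
    a single value across the whole team."""
    return depends(team, [], y)

print(depends(team, ["x"], "y"))  # y is determined by x on this team
print(constant(team, "z"))        # z is not constant on this team
```

On the sample team, y is determined by x (0 maps to 1, 1 maps to 0), while z varies even for fixed x, so neither =(x, z) nor =(z) holds.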
Abramsky, S., Domain theory in logical form, Annals of Pure and Applied Logic 51 (1991), 1–77. The mathematical framework of Stone duality is used to synthesise a number of hitherto separate developments in theoretical computer science:
• Domain theory, the mathematical theory of computation introduced by Scott as a foundation for denotational semantics;
• The theory of concurrency and systems behaviour developed by Milner and Hennessy, based on operational semantics;
• Logics of programs.
Stone duality provides a junction between semantics and logics. Moreover, the underlying logic is geometric, which can be computationally interpreted as the logic of observable properties—i.e., properties which can be determined to hold of a process on the basis of a finite amount of information about its execution. These ideas lead to the following programme. A metalanguage is introduced, comprising:
• types = universes of discourse for various computational situations;
• terms = programs = syntactic intensions for models or points.
A standard denotational interpretation of the metalanguage is given, assigning domains to types and domain elements to terms. The metalanguage is also given a logical interpretation, in which types are interpreted as propositional theories and terms are interpreted via a program logic, which axiomatises the properties they satisfy. The two interpretations are related by showing that they are Stone duals of each other. Hence, semantics and logic are guaranteed to be in harmony with each other, and in fact each determines the other up to isomorphism. This opens the way to a whole range of applications. Given a denotational description of a computational situation in our metalanguage, we can turn the handle to obtain a logic for that situation.
We use a simple relational framework to develop the key notions and results on hidden variables and non-locality. The extensive literature on these topics in the foundations of quantum mechanics is couched in terms of probabilistic models, and properties such as locality and no-signalling are formulated probabilistically. We show that to a remarkable extent, the main structure of the theory, through the major No-Go theorems and beyond, survives intact under the replacement of probability distributions by mere relations.
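The replacement of probability distributions by mere relations can be illustrated with a small executable sketch (the names `supports`, `possible_outcomes`, and `local`, and the PR-box-style data, are illustrative assumptions, not taken from the paper): an empirical model assigns to each measurement context the *set* of possible joint outcomes, no-signalling becomes agreement of relational marginals, and a local hidden-variable model becomes a global assignment consistent with every context.

```python
from itertools import product

# Possibilistic empirical model (PR-box style, illustrative): for each
# context (one measurement per party), the set of possible joint outcomes.
supports = {
    ("a", "b"):   {(0, 0), (1, 1)},
    ("a", "b'"):  {(0, 0), (1, 1)},
    ("a'", "b"):  {(0, 0), (1, 1)},
    ("a'", "b'"): {(0, 1), (1, 0)},
}

def possible_outcomes(ctx, meas):
    """Relational marginal: the outcomes possible for one measurement
    within a given context."""
    i = ctx.index(meas)
    return {joint[i] for joint in supports[ctx]}

# Relational no-signalling: a measurement's possible outcomes do not
# depend on which context it occurs in.
no_signalling = all(
    possible_outcomes(c1, m) == possible_outcomes(c2, m)
    for c1 in supports for c2 in supports
    for m in set(c1) & set(c2)
)

def local(supports):
    """Relational locality: some single global assignment of outcomes to
    all four measurements is consistent with every context's support."""
    for vals in product([0, 1], repeat=4):
        assign = dict(zip(["a", "a'", "b", "b'"], vals))
        if all((assign[x], assign[y]) in s for (x, y), s in supports.items()):
            return True
    return False

print(no_signalling)    # the model is relationally no-signalling
print(local(supports))  # yet it admits no global hidden-variable assignment
```

The model above is no-signalling in the relational sense, yet no global assignment exists: the first three contexts force perfect correlation while the fourth forces anticorrelation, so the nonlocality argument survives with relations alone.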
Bell inequalities play a central role in the study of quantum nonlocality and entanglement, with many applications in quantum information. Despite the huge literature on Bell inequalities, it is not easy to find a clear conceptual answer to what a Bell inequality is, or a clear guiding principle as to how they may be derived. In this paper, we introduce a notion of logical Bell inequality which can be used to systematically derive testable inequalities for a very wide variety of situations. There is a single clear conceptual principle, based on purely logical consistency conditions, which underlies our notion of logical Bell inequalities. We show that in a precise sense, all Bell inequalities can be taken to be of this form. Our approach is very general. It applies directly to any family of sets of commuting observables. Thus it covers not only the n-partite scenarios to which Bell inequalities are standardly applied, but also Kochen-Specker configurations, and many other examples. There is much current work on experimental tests for contextuality. Our approach directly yields, in a systematic fashion, testable inequalities for a very general notion of contextuality. There has been much work on obtaining proofs of Bell's theorem "without inequalities" or "without probabilities." These proofs are seen as being in a sense more definitive and logically robust than the inequality-based proofs. On the other hand, they lack the fault-tolerant aspect of inequalities. Our approach reconciles these aspects, and in fact shows how the logical robustness can be converted into systematic, general derivations of inequalities with provable violations. Moreover, the kind of strong non-locality or contextuality exhibited by the GHZ argument or by Kochen-Specker configurations can be shown to lead to maximal violations of the corresponding logical Bell inequalities. Thus the qualitative and the quantitative aspects are combined harmoniously.
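The logical consistency principle behind these inequalities can be checked mechanically in the standard CHSH case. A minimal sketch (the encoding of the four propositions as Python lambdas is my own illustration): if N propositions are jointly unsatisfiable, then in any classical model the probabilities of the propositions sum to at most N - 1, which here yields the familiar bound of 3, violated by the quantum value of approximately 3.41.

```python
import itertools
import math

# Four CHSH propositions over boolean outcomes of measurements a, a', b, b'
# (illustrative encoding): three correlations and one anticorrelation.
props = [
    lambda a, a2, b, b2: a == b,
    lambda a, a2, b, b2: a == b2,
    lambda a, a2, b, b2: a2 == b,
    lambda a, a2, b, b2: a2 != b2,
]

# Exhaustively check joint satisfiability over all outcome assignments.
max_sat = max(
    sum(p(*v) for p in props)
    for v in itertools.product([0, 1], repeat=4)
)
print(max_sat)  # 3: the four propositions are jointly unsatisfiable,
                # so classically P1 + P2 + P3 + P4 <= 4 - 1 = 3.

# Quantum mechanics assigns each proposition probability cos^2(pi/8)
# at the Tsirelson-optimal measurement settings.
quantum_sum = 4 * math.cos(math.pi / 8) ** 2
print(quantum_sum)  # about 3.414 > 3: a provable violation.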
A short scientific biography emphasising the main phases of Abramsky’s research: duality theory and domains in logical form, game semantics, categorical quantum mechanics, the sheaf-theoretic approach to contextuality, and game comonads and a structural view of resources and descriptive complexity.
We investigate the use of coalgebra to represent quantum systems, thus providing a basis for the use of coalgebraic methods in quantum information and computation. Coalgebras allow the dynamics of repeated measurement to be captured, and provide mathematical tools such as final coalgebras, bisimulation and coalgebraic logic. However, the standard coalgebraic framework does not accommodate contravariance, and is too rigid to allow physical symmetries to be represented. We introduce a fibrational structure on coalgebras in which contravariance is represented by indexing. We use this structure to give a universal semantics for quantum systems based on a final coalgebra construction. We characterize equality in this semantics as projective equivalence. We also define an analogous indexed structure for Chu spaces, and use this to obtain a novel categorical description of the category of Chu spaces. We use the indexed structures of Chu spaces and coalgebras over a common base to define a truncation functor from coalgebras to Chu spaces. This truncation functor is used to lift the full and faithful representation of the groupoid of physical symmetries on Hilbert spaces into Chu spaces, obtained in our previous work, to the coalgebraic semantics.
We pursue a model-oriented rather than axiomatic approach to the foundations of Quantum Mechanics, with the idea that new models can often suggest new axioms. This approach has often been fruitful in Logic and Theoretical Computer Science. Rather than seeking to construct a simplified toy model, we aim for a 'big toy model', in which both quantum and classical systems can be faithfully represented—as well as, possibly, more exotic kinds of systems. To this end, we show how Chu spaces can be used to represent physical systems of various kinds. In particular, we show how quantum systems can be represented as Chu spaces over the unit interval in such a way that the Chu morphisms correspond exactly to the physically meaningful symmetries of the systems—the unitaries and antiunitaries. In this way we obtain a full and faithful functor from the groupoid of Hilbert spaces and their symmetries to Chu spaces. We also consider whether it is possible to use a finite value set rather than the unit interval; we show that three values suffice, while the two standard possibilistic reductions to two values both fail to preserve fullness.
This Handbook is a combination of authoritative exposition, comprehensive survey, and fundamental research exploring the underlying unifying themes in the various areas. The intended audience is graduate students and researchers in the areas of computing and logic, as well as other people interested in the subject. We assume as background some mathematical sophistication. Much of the material will also be of interest to logicians and mathematicians.
Genericity is the idea that the same program can work at many different data types. Longo, Milsted and Soloviev proposed to capture the inability of generic programs to probe the structure of their instances by the following equational principle: if two generic programs, viewed as terms of the same polymorphic type, are equal at any given instance A[T], then they are equal at all instances. They proved that this rule is admissible in a certain extension of System F, but finding a semantically motivated model satisfying this principle remained an open problem. In the present paper, we construct a categorical model of polymorphism, based on game semantics, which contains a large collection of generic types. This model builds on two novel constructions:
• A direct interpretation of variable types as games, with a natural notion of substitution of games. This allows moves in games A[T] to be decomposed into the generic part from A, and the part pertaining to the instance T. This leads to a simple and natural notion of generic strategy.
• A "relative polymorphic product" which expresses quantification over the type variable Xi in the variable type A with respect to a "universe" which is explicitly given as an additional parameter B. We then solve a recursive equation involving this relative product to obtain a universe in a suitably "absolute" sense.
Full Completeness for ML types is proved for this model.
We present the model construction technique called Linear Realizability. It consists in building a category of Partial Equivalence Relations over a Linear Combinatory Algebra. We illustrate how it can be used to provide models, which are fully complete for various typed λ-calculi. In particular, we focus on special Linear Combinatory Algebras of partial involutions, and we present PER models over them which are fully complete, inter alia, w.r.t. the following languages and theories: the fragment of System F consisting of ML-types, the maximal theory on the simply typed λ-calculus with finitely many ground constants, and the maximal theory on an infinitary version of this latter calculus.