This paper is the report of a meeting that gathered many of the UK's most senior animal scientists with representatives of the farming industry, consumer groups, animal welfare groups, and environmentalists. There was strong consensus that the current economic structure of agriculture cannot adequately address major issues of concern to society: farm incomes, food security and safety, the needs of developing countries, animal welfare, and the environment. This economic structure is based primarily on competition between producers and between retailers, driving food prices down, combined with externalization of many costs. These issues must be addressed by a combination of legislation, restructuring of the market, and use of public funds. The meeting included workshops that made other recommendations for research and education. The most urgent requirement is recognition that change is needed and development of a vision for what that change must achieve.
While there is now considerable experimental evidence that, on the one hand, participants assign to the indicative conditional as probability the conditional probability of consequent given antecedent and, on the other, they assign to the indicative conditional the 'defective truth-table' in which a conditional with false antecedent is deemed neither true nor false, these findings do not in themselves establish which multi-premise inferences involving conditionals participants endorse. A natural extension of the truth-table semantics pronounces as valid numerous inference patterns that do seem to be part of ordinary usage. However, coupled with something the probability account gives us, namely that when conditional-free φ entails conditional-free ψ, 'if φ then ψ' is a trivial, uninformative truth, we have enough logic to derive the paradoxes of material implication. It thus becomes a matter of some urgency to determine which inference patterns involving indicative conditionals participants do endorse. Only thus will we be able to arrive at a realistic, systematic semantics for the indicative conditional.
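The two experimental findings just mentioned can be made concrete in a small computation. The sketch below is a hedged illustration with an invented toy joint distribution, not anything taken from the paper: it compares the conditional probability P(B|A) with the probability of the material conditional A ⊃ B, and tabulates the 'defective' truth table on which a conditional with false antecedent is neither true nor false.

```python
from itertools import product

# Toy joint distribution over two binary events A and B (illustrative numbers).
joint = {(True, True): 0.2, (True, False): 0.2,
         (False, True): 0.3, (False, False): 0.3}

p_a = sum(p for (a, b), p in joint.items() if a)
p_a_and_b = joint[(True, True)]

# Probability of the conditional read as conditional probability P(B|A)...
p_cond = p_a_and_b / p_a
# ...versus the probability of the material conditional A -> B
# (true in every case except A & not-B).
p_material = sum(p for (a, b), p in joint.items() if (not a) or b)

print(p_cond)      # 0.5
print(p_material)  # 0.8 — the two readings come apart

def defective_if(a, b):
    """'Defective' truth table: classical truth value when the antecedent
    is true, neither true nor false (None) when it is false."""
    return b if a else None

table = {(a, b): defective_if(a, b) for a, b in product([True, False], repeat=2)}
print(table)
```

On this toy distribution the conditional-probability reading (0.5) and the material reading (0.8) diverge, which is what makes the question of which inference patterns participants endorse an empirical one.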
A proof employing no semantic terms is offered in support of the claim that there can be truths without truthmakers. The logical resources used in the proof are weak but do include the structural rule Contraction.
Some propositions add more information to bodies of propositions than do others. We start with intuitive considerations on qualitative comparisons of information added. Central to these are considerations bearing on conjunctions and on negations. We find that we can discern two distinct, incompatible, notions of information added. From the comparative notions we pass to quantitative measurement of information added. In this we borrow heavily from the literature on quantitative representations of qualitative, comparative conditional probability. We look at two ways to obtain a quantitative conception of information added. One, the most direct, mirrors Bernard Koopman’s construction of conditional probability: by making a strong structural assumption, it leads to a measure that is, transparently, some function of a function P which is, formally, an assignment of conditional probability (in fact, a Popper function). P reverses the information added order and mislocates the natural zero of the scale so some transformation of this scale is needed but the derivation of P falls out so readily that no particular transformation suggests itself. The Cox–Good–Aczél method assumes the existence of a quantitative measure matching the qualitative relation, and builds on the structural constraints to obtain a measure of information that can be rescaled as, formally, an assignment of conditional probability. A classical result of Cantor’s, subsequently strengthened by Debreu, goes some way towards justifying the assumption of the existence of a quantitative scale. What the two approaches give us is a pointer towards a novel interpretation of probability as a rescaling of a measure of information added.
Intervals in boolean algebras enter into the study of conditional assertions (or events) in two ways: directly, either from intuitive arguments or from Goodman, Nguyen and Walker's representation theorem, as suitable mathematical entities to bear conditional probabilities, or indirectly, via a representation theorem for the family of algebras associated with de Finetti's three-valued logic of conditional assertions/events. Further representation theorems forge a connection with rough sets. The representation theorems and an equivalent of the boolean prime ideal theorem yield an algebraic completeness theorem for the three-valued logic. This in turn leads to a Henkin-style completeness theorem. Adequacy with respect to a family of Kripke models for de Finetti's logic, Łukasiewicz's three-valued logic and Priest's Logic of Paradox is demonstrated. The extension to first-order yields a short proof of adequacy for Körner's logic of inexact predicates.
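De Finetti's three-valued treatment and its link to conditional bets can be sketched as follows (illustrative code, not from the paper; the function names and the payoff convention are assumptions): the conditional assertion 'B given A' takes a classical truth value when A holds and is void otherwise, and the corresponding conditional bet is called off, with stakes returned, when the antecedent fails.

```python
TRUE, FALSE, VOID = "true", "false", "void"

def conditional_event(a, b):
    """de Finetti's three-valued conditional assertion 'B given A':
    classical truth value when A holds, void when A fails."""
    if not a:
        return VOID
    return TRUE if b else FALSE

def conditional_bet_payoff(a, b, q, s):
    """Payoff of a bet on 'B given A' at betting quotient q and stake s:
    won if A & B, lost if A & not-B, called off (zero payoff) if not-A."""
    v = conditional_event(a, b)
    if v == VOID:
        return 0.0               # bet called off, stakes returned
    return s * (1 - q) if v == TRUE else -s * q

for a, b in [(True, True), (True, False), (False, True)]:
    print(a, b, conditional_event(a, b))
```

The 'called off' clause is exactly what identifies the conditional event as the object of a conditional bet, the identification the article builds on.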
This article begins by exploring a lost topic in the philosophy of science: the properties of the relations 'evidence confirming h confirms h′' and, more generally, 'evidence confirming each of h1, h2, ..., hm confirms at least one of h1, h2, ..., hn'. The Bayesian understanding of confirmation as positive evidential relevance is employed throughout. The resulting formal system is, to say the least, oddly behaved. Some aspects of this odd behaviour the system has in common with some of the non-classical logics developed in the twentieth century. One aspect, its 'parasitism' on classical logic, it does not, and it is this feature that makes the system an interesting focus for discussion of questions in the philosophy of logic. We gain some purchase on an answer to a recently prominent question, namely, what is a logical system? More exactly, we ask whether satisfaction of formal constraints is sufficient for a relation to be considered a (logical) consequence relation. The question whether confirmation transfer yields a logical system is answered in the negative, despite confirmation transfer having the standard properties of a consequence relation, on the grounds that validity of sequents in the system is not determined by the meanings of the connectives that occur in formulas. Developing the system in a different direction, we find it bears on the project of 'proof-theoretic semantics': conferring meaning on connectives by means of introduction (and possibly elimination) rules is not an autonomous activity; rather it presupposes a prior, non-formal notion of consequence. Some historical ramifications are also addressed briefly.
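That confirmation, read as positive evidential relevance, behaves oddly can be checked directly. The sketch below is an invented card-drawing illustration, not an example from the article: evidence e confirms h, and h entails h2, yet e fails to confirm h2, so confirmation does not in general transfer along entailment.

```python
from fractions import Fraction
from itertools import product

# A standard 52-card deck with a uniform draw (illustrative setup).
ranks = list(range(1, 14))               # 1 = ace, ..., 13 = king
suits = ["spades", "clubs", "hearts", "diamonds"]
deck = [(r, s) for r, s in product(ranks, suits)]

def prob(pred, given=None):
    """Exact probability of pred, optionally conditional on given."""
    pool = [c for c in deck if given is None or given(c)]
    return Fraction(sum(1 for c in pool if pred(c)), len(pool))

h  = lambda c: c == (1, "spades")                   # the ace of spades
h2 = lambda c: c in [(1, "spades"), (2, "hearts")]  # entailed by h
e  = lambda c: c[1] in ("spades", "clubs")          # a black card

# e confirms h: P(h|e) = 1/26 > 1/52 = P(h) ...
print(prob(h, e) > prob(h))    # True
# ... but e does not confirm h2: P(h2|e) = 1/26 = P(h2).
print(prob(h2, e) > prob(h2))  # False
```

Exact rational arithmetic (`Fraction`) avoids any floating-point blur in the comparison.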
In natural deduction classical logic is commonly formulated by adding a rule such as Double Negation Elimination (DNE) or Classical Reductio ad Absurdum (CRA) to a set of introduction and elimination rules sufficient for intuitionist first-order logic with conjunction, disjunction, implication, negation and the universal and existential quantifiers all taken as primitive. The natural deduction formulation of intuitionist logic, coming from Gentzen, has nice properties: (i) the separation property: an intuitionistically valid inference is derivable using only the introduction and elimination rules governing the connectives and/or quantifiers that occur in the premises (if any) and conclusion; (ii) the (strict) subformula property: more narrowly, there is a derivation of any intuitionistically valid inference that employs only subformulas of the formulas occurring in premises (if any) and conclusion. (Every formula is, of course, a subformula of itself.)
The moral justification for government is that it is needed to promote the community's interest. What is that interest an interest in? Upon what basis can disagreements about the community's interest and individual interests be reconciled? Can democracy enable dissatisfaction with their reconciliation to be lived with? Perhaps, if people are prepared to meet the requirements of democratic citizenship. What are these requirements, and what is their justification? These are the questions with which this book is concerned.
The thesis that, in a system of natural deduction, the meaning of a logical constant is given by some or all of its introduction and elimination rules has been developed recently in the work of Dummett, Prawitz, Tennant, and others, by the addition of harmony constraints. Introduction and elimination rules for a logical constant must be in harmony. By deploying harmony constraints, these authors have arrived at logics no stronger than intuitionist propositional logic. Classical logic, they maintain, cannot be justified from this proof-theoretic perspective. This paper argues that, while classical logic can be formulated so as to satisfy a number of harmony constraints, the meanings of the standard logical constants cannot all be given by their introduction and/or elimination rules; negation, in particular, comes under close scrutiny.
Of his numerous investigations ... Tarski was most proud of two: his work on truth and his design of an algorithm in 1930 to decide the truth or falsity of any sentence of the elementary theory of high school Euclidean geometry. [...] His mathematical treatment of the semantics of languages and the concept of truth has had revolutionary consequences for mathematics, linguistics, and philosophy, and Tarski is widely thought of as the man who "defined truth". The seeming simplicity of his famous example that the sentence "Snow is white" is true just in case snow is white belies the depth and complexity of the consequences which can be drawn from the possibility of giving a general treatment of the concept of truth in formal mathematical languages in a rigorous mathematical way. (J.W. Addison)
Proofs of Gödel's First Incompleteness Theorem are often accompanied by claims such as that the gödel sentence constructed in the course of the proof says of itself that it is unprovable and that it is true. The validity of such claims depends closely on how the sentence is constructed. Only by tightly constraining the means of construction can one obtain gödel sentences of which it is correct, without further ado, to say that they say of themselves that they are unprovable and that they are true; otherwise a false theory can yield false gödel sentences.
As Wilfrid Hodges has observed, there is no mention of the notion truth-in-a-model in Tarski's article 'The Concept of Truth in Formalized Languages'; nor does truth make many appearances in his papers on model theory from the early 1950s. In later papers from the same decade, however, this reticence is cast aside. Why should Tarski, who defined truth for formalized languages and pretty much founded model theory, have been so reluctant to speak of truth in a model? What might explain the change in his practice? The answers, I believe, lie in Tarski's views on truth simpliciter.
Consistent application of coherence arguments shows that fair betting quotients are subject to constraints that are too stringent to allow their identification with either degrees of belief or probabilities. The pivotal role of fair betting quotients in the Dutch Book Argument, which is said to demonstrate that a rational agent's degrees of belief are probabilities, is thus undermined from both sides.
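The coherence requirement behind the Dutch Book Argument can be illustrated numerically. This is a standard textbook sketch, not the paper's own derivation, and the quotients and stakes are invented: when betting quotients on an event and its complement fail to sum to 1, a bettor accepting both bets at unit stakes suffers a sure loss in every state of the world.

```python
def payoff(outcome, q, s):
    """Bettor's payoff on a bet at quotient q with stake s:
    wins s*(1-q) if the event occurs, loses s*q otherwise."""
    return s * (1 - q) if outcome else -s * q

# Incoherent quotients on E and its complement: they sum to 1.2, not 1.
q_e, q_not_e = 0.7, 0.5
s = 1.0

for e_occurs in (True, False):
    total = (payoff(e_occurs, q_e, s)          # bet on E
             + payoff(not e_occurs, q_not_e, s))  # bet on not-E
    print(e_occurs, total)  # a guaranteed loss of 0.2 either way
```

The sure loss equals s * (q_e + q_not_e - 1), which is why coherence forces the quotients on an event and its complement to sum to exactly 1.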
Our starting point is Michael Luntley's falsificationist semantics for the logical connectives and quantifiers: the details of his account are criticised but we provide an alternative falsificationist semantics that yields intuitionist logic, as Luntley surmises such a semantics ought. Next an account of the logical connectives and quantifiers that combines verificationist and falsificationist perspectives is proposed and evaluated. While the logic is again intuitionist there is, somewhat surprisingly, an unavoidable asymmetry between the verification and falsification conditions for negation, the conditional, and the universal quantifier. Lastly we are led to a novel characterization of realism.
This article begins by outlining some of the history—beginning with brief remarks of Quine's—of work on conditional assertions and conditional events. The upshot of the historical narrative is that diverse works from various starting points have circled around a nexus of ideas without convincingly tying them together. Section 3 shows how ideas contained in a neglected article of de Finetti's lead to a unified treatment of the topics based on the identification of conditional events as the objects of conditional bets. The penultimate section explores some of the consequences of the resulting logic of conditional events while the last defends it.
From the point of view of proof-theoretic semantics, we examine the logical background invoked by Neil Tennant's abstractionist realist account of mathematical existence. To prepare the way, we must first look closely at the rule of existential elimination familiar from classical and intuitionist logics and at rules governing identity. We then examine how well free logics meet the harmony and uniqueness constraints familiar from the proof-theoretic semantics project. Tennant assigns a special role to atomic formulas containing singular terms. This, we find, secures harmony and uniqueness but militates against the putative realism.
Uncertainty and vagueness/imprecision are not the same: one can be certain about events described using vague predicates and about imprecisely specified events, just as one can be uncertain about precisely specified events. Exactly because of this, a question arises about how one ought to assign probabilities to imprecisely specified events in the case when no possible available evidence will eradicate the imprecision (because, say, of the limits of accuracy of a measuring device). Modelling imprecision by rough sets over an approximation space presents an especially tractable case to help get one's bearings. Two solutions present themselves: the first takes as upper and lower probabilities of the event X the (exact) probabilities assigned to X's upper and lower rough-set approximations; the second, motivated both by formal considerations and by a simple betting argument, is to treat X's rough-set approximation as a conditional event and assign to it a point-valued (conditional) probability.
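The two solutions can be sketched over a toy approximation space (all numbers and names below are invented for illustration): the first assigns to X the exact probabilities of its lower and upper rough-set approximations; the second treats the boundary region between them as void, in the manner of a conditional event, and conditions it away to obtain a point value.

```python
from fractions import Fraction

# A small universe with a uniform measure, partitioned into equivalence
# classes of mutually indiscernible points (the approximation space).
universe = set(range(8))
classes = [{0, 1}, {2, 3}, {4, 5}, {6, 7}]
prob = {w: Fraction(1, 8) for w in universe}

X = {1, 2, 3, 4}  # an imprecisely specified event

# Lower approximation: union of classes wholly inside X.
lower = set().union(*(c for c in classes if c <= X))
# Upper approximation: union of classes meeting X.
upper = set().union(*(c for c in classes if c & X))

def p(event):
    return sum(prob[w] for w in event)

# First solution: lower and upper probabilities for X.
print(p(lower), p(upper))  # 1/4 3/4

# Second solution: treat the boundary region as void and condition it away,
# as with a conditional event whose antecedent fails on the boundary.
boundary = upper - lower
point_value = p(lower) / (1 - p(boundary))
print(point_value)  # 1/2
```

The point value lands inside the interval given by the first solution, which is what the betting argument mentioned in the abstract would lead one to expect.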
Wikipedia is a goldmine of information; not just for its many readers, but also for the growing community of researchers who recognize it as a resource of exceptional scale and utility. It represents a vast investment of manual effort and judgment: a huge, constantly evolving tapestry of concepts and relations that is being applied to a host of tasks. This article provides a comprehensive description of this work. It focuses on research that extracts and makes use of the concepts, relations, facts and descriptions found in Wikipedia, and organizes the work into four broad categories: applying Wikipedia to natural language processing; using it to facilitate information retrieval; using it for information extraction; and treating it as a resource for ontology building. The article addresses how Wikipedia is being used as is, how it is being improved and adapted, and how it is being combined with other structures to create entirely new resources. We identify the research groups and individuals involved, and how their work has developed in the last few years. We provide a comprehensive list of the open-source software they have produced.