In natural deduction, classical logic is commonly formulated by adding a rule such as Double Negation Elimination (DNE) or Classical Reductio ad Absurdum (CRA) to a set of introduction and elimination rules sufficient for intuitionist first-order logic with conjunction, disjunction, implication, negation and the universal and existential quantifiers all taken as primitive. The natural deduction formulation of intuitionist logic, coming from Gentzen, has nice properties: (i) the separation property: an intuitionistically valid inference is derivable using only the introduction and elimination rules governing the connectives and/or quantifiers that occur in the premises (if any) and conclusion; (ii) the (strict) subformula property: more narrowly, there is a derivation of any intuitionistically valid inference that employs only subformulas of the formulas occurring in the premises (if any) and conclusion. (Every formula is, of course, a subformula of itself.)
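For orientation, the two classical rules named above can be rendered schematically as follows; this is the generic textbook notation, not drawn from the paper itself.

% DNE infers A from a proof of its double negation; CRA discharges the
% assumption ¬A once a derivation of absurdity from it is in hand.
\[
\textbf{(DNE)}\quad \frac{\lnot\lnot A}{A}
\qquad\qquad
\textbf{(CRA)}\quad \frac{\begin{array}{c}[\lnot A]\\ \vdots\\ \bot\end{array}}{A}
\]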
The thesis that, in a system of natural deduction, the meaning of a logical constant is given by some or all of its introduction and elimination rules has been developed recently in the work of Dummett, Prawitz, Tennant, and others, by the addition of harmony constraints. Introduction and elimination rules for a logical constant must be in harmony. By deploying harmony constraints, these authors have arrived at logics no stronger than intuitionist propositional logic. Classical logic, they maintain, cannot be justified from this proof-theoretic perspective. This paper argues that, while classical logic can be formulated so as to satisfy a number of harmony constraints, the meanings of the standard logical constants cannot all be given by their introduction and/or elimination rules; negation, in particular, comes under close scrutiny.
In making assertions one takes on commitments to the consistency of what one asserts and to the logical consequences of what one asserts. Although there is no quick link between belief and assertion, the dialectical requirements on assertion feed back into normative constraints on those beliefs that constitute one's evidence. But if we are not certain of many of our beliefs and that uncertainty is modelled in terms of probabilities, then there is at least prima facie incoherence between the normative constraints on belief and the probability-like structure of degrees of belief. I suggest that the norm-governed practice relating to degrees of belief is the evaluation of betting odds.
Of his numerous investigations ... Tarski was most proud of two: his work on truth and his design of an algorithm in 1930 to decide the truth or falsity of any sentence of the elementary theory of high school Euclidean geometry. [...] His mathematical treatment of the semantics of languages and the concept of truth has had revolutionary consequences for mathematics, linguistics, and philosophy, and Tarski is widely thought of as the man who "defined truth". The seeming simplicity of his famous example that the sentence "Snow is white" is true just in case snow is white belies the depth and complexity of the consequences which can be drawn from the possibility of giving a general treatment of the concept of truth in formal mathematical languages in a rigorous mathematical way. (J.W. Addison)
Proofs of Gödel's First Incompleteness Theorem are often accompanied by claims such as that the Gödel sentence constructed in the course of the proof says of itself that it is unprovable and that it is true. The validity of such claims depends closely on how the sentence is constructed. Only by tightly constraining the means of construction can one obtain Gödel sentences of which it is correct, without further ado, to say that they say of themselves that they are unprovable and that they are true; otherwise a false theory can yield false Gödel sentences.
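The standard construction alluded to here runs through the diagonal (fixed-point) lemma; the following schematic statement is a textbook gloss, offered only for orientation, not a quotation from the paper.

% By the diagonal lemma, for a suitable provability predicate Prov_T
% there is a sentence G provably equivalent, in the theory T, to the
% claim that G is unprovable in T:
\[
T \vdash G \;\leftrightarrow\; \lnot\mathrm{Prov}_T(\ulcorner G \urcorner)
\]
% It is this equivalence that invites the gloss "G says of itself that
% it is unprovable"; as the abstract stresses, whether the gloss is apt
% depends on how Prov_T and G are constructed.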
As Wilfrid Hodges has observed, there is no mention of the notion truth-in-a-model in Tarski's article 'The Concept of Truth in Formalized Languages'; nor does truth make many appearances in his papers on model theory from the early 1950s. In later papers from the same decade, however, this reticence is cast aside. Why should Tarski, who defined truth for formalized languages and pretty much founded model theory, have been so reluctant to speak of truth in a model? What might explain (...) the change in his practice? The answers, I believe, lie in Tarski's views on truth simpliciter. (shrink)
While there is now considerable experimental evidence that, on the one hand, participants assign to the indicative conditional as probability the conditional probability of consequent given antecedent and, on the other, they assign to the indicative conditional the 'defective truth-table' in which a conditional with false antecedent is deemed neither true nor false, these findings do not in themselves establish which multi-premise inferences involving conditionals participants endorse. A natural extension of the truth-table semantics pronounces as valid numerous inference patterns that do seem to be part of ordinary usage. However, coupled with something the probability account gives us, namely that when conditional-free φ entails conditional-free ψ, 'if φ then ψ' is a trivial, uninformative truth, we have enough logic to derive the paradoxes of material implication. It thus becomes a matter of some urgency to determine which inference patterns involving indicative conditionals participants do endorse. Only thus will we be able to arrive at a realistic, systematic semantics for the indicative conditional.
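The two experimental findings cited are standardly summarized as follows; this is a textbook rendering for orientation, not the paper's own notation.

% Left: "the Equation", the probability of the conditional equals the
% conditional probability of consequent given antecedent. Right: the
% defective truth table, on which a false antecedent leaves the
% conditional neither true nor false.
\[
P(\text{if } A \text{ then } B) = P(B \mid A)
\qquad\qquad
\begin{array}{cc|c}
A & B & \text{if } A \text{ then } B\\
\hline
T & T & T\\
T & F & F\\
F & T & \text{neither}\\
F & F & \text{neither}
\end{array}
\]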
Successful technologies’ ubiquity changes uses, users and ethicolegal responsibilities and duties of care. We focus on dementia to review critically the ethicolegal implications of increasing use of social networking sites (SNS) by those with compromised decision-making capacity, assessing concerned parties’ responsibilities. Although SNS contracts assume ongoing decision-making capacity, many users’ capacity may be compromised or declining. Resulting ethicolegal issues include capacity to give informed consent to contracts, protection of online privacy including sharing and controlling data, data leaks between different digital platforms, and management of digital identities and footprints. SNS uses in healthcare raise additional issues. Online materials acting as archives of ‘the self’ bolster present and future identities for users with compromised capacity. E-health involves actual and potential intersection of data gathered for the purpose of delivering health technological support with data used for social networking purposes. Ethicolegal guidance is limited on the implications of SNS usage in contexts where users have impaired/reduced capacity to understand and/or consent to sharing personal data about their health, medication or location. Vulnerable adults and family/carers face uncertainty in regard to consent, data protection, online identity and legal liabilities. Ethicolegal responsibilities and duties of care of technology providers, healthcare professionals, regulatory bodies and policymakers need clarification.
Various natural deduction formulations of classical, minimal, intuitionist, and intermediate propositional and first-order logics are presented and investigated with respect to satisfaction of the separation and subformula properties. The technique employed is, for the most part, semantic, based on general versions of the Lindenbaum and Lindenbaum–Henkin constructions.
Bertrand Russell’s 1906 article ‘The Theory of Implication’ contains an algebraic weak completeness proof for classical propositional logic. Russell did not present it as such. We give an exposition of the proof and investigate Russell’s view of what he was about, whether he could have appreciated the proof for what it is, and why there is no parallel of the proof in Principia Mathematica.
Professor Tennant and I agree on much regarding the proof-theoretic semantics of free logic. Here I point to two issues, one on which we disagree, the other on which I find it hard to say how closely we may agree. The first concerns the exact content of Tennant's Rule of Atomic Denotation. The second concerns the nature of assumptions whose formal counterparts contain parametric occurrences of names.
Consistent application of coherence arguments shows that fair betting quotients are subject to constraints that are too stringent to allow their identification with either degrees of belief or probabilities. The pivotal role of fair betting quotients in the Dutch Book Argument, which is said to demonstrate that a rational agent's degrees of belief are probabilities, is thus undermined from both sides.
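As background, the standard betting set-up presupposed here identifies fair betting quotients by a zero-expected-gain calculation; the sketch is ours, not the paper's.

% A bet on E at betting quotient q with stake S pays S(1-q) if E
% occurs and loses Sq otherwise. With degree of belief p in E the
% expected gain is
\[
p\,S(1-q) \;-\; (1-p)\,S q \;=\; S(p - q),
\]
% which vanishes exactly when q = p. It is this quick identification
% of fair betting quotients with degrees of belief (and with
% probabilities) that the coherence constraints are argued to block.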
A proof employing no semantic terms is offered in support of the claim that there can be truths without truthmakers. The logical resources used in the proof are weak but do include the structural rule Contraction.
Some propositions add more information to bodies of propositions than do others. We start with intuitive considerations on qualitative comparisons of information added. Central to these are considerations bearing on conjunctions and on negations. We find that we can discern two distinct, incompatible, notions of information added. From the comparative notions we pass to quantitative measurement of information added. In this we borrow heavily from the literature on quantitative representations of qualitative, comparative conditional probability. We look at two ways to obtain a quantitative conception of information added. One, the most direct, mirrors Bernard Koopman’s construction of conditional probability: by making a strong structural assumption, it leads to a measure that is, transparently, some function of a function P which is, formally, an assignment of conditional probability (in fact, a Popper function). P reverses the information-added order and mislocates the natural zero of the scale, so some transformation of this scale is needed, but the derivation of P falls out so readily that no particular transformation suggests itself. The Cox–Good–Aczél method assumes the existence of a quantitative measure matching the qualitative relation, and builds on the structural constraints to obtain a measure of information that can be rescaled as, formally, an assignment of conditional probability. A classical result of Cantor’s, subsequently strengthened by Debreu, goes some way towards justifying the assumption of the existence of a quantitative scale. What the two approaches give us is a pointer towards a novel interpretation of probability as a rescaling of a measure of information added.
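One familiar rescaling illustrates the point about reversing order and relocating zero; the choice of transformation is ours for illustration only, since the abstract notes that no particular transformation is forced.

% With P a Popper function (formally, an assignment of conditional
% probability), set
\[
I(a \mid b) \;=\; -\log P(a \mid b).
\]
% I reverses P's order, as a measure of information added requires,
% and restores the natural zero: when b entails a, P(a|b) = 1 and so
% I(a|b) = 0, i.e. a adds no information to b.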
Our starting point is Michael Luntley's falsificationist semantics for the logical connectives and quantifiers: the details of his account are criticised but we provide an alternative falsificationist semantics that yields intuitionist logic, as Luntley surmises such a semantics ought. Next an account of the logical connectives and quantifiers that combines verificationist and falsificationist perspectives is proposed and evaluated. While the logic is again intuitionist there is, somewhat surprisingly, an unavoidable asymmetry between the verification and falsification conditions for negation, the conditional, and the universal quantifier. Lastly we are led to a novel characterization of realism.
This article begins by outlining some of the history—beginning with brief remarks of Quine's—of work on conditional assertions and conditional events. The upshot of the historical narrative is that diverse works from various starting points have circled around a nexus of ideas without convincingly tying them together. Section 3 shows how ideas contained in a neglected article of de Finetti's lead to a unified treatment of the topics based on the identification of conditional events as the objects of conditional bets. The penultimate section explores some of the consequences of the resulting logic of conditional events while the last defends it.
From the point of view of proof-theoretic semantics, we examine the logical background invoked by Neil Tennant's abstractionist realist account of mathematical existence. To prepare the way, we must first look closely at the rule of existential elimination familiar from classical and intuitionist logics and at rules governing identity. We then examine how well free logics meet the harmony and uniqueness constraints familiar from the proof-theoretic semantics project. Tennant assigns a special role to atomic formulas containing singular terms. This, we find, secures harmony and uniqueness but militates against the putative realism.
Intervals in boolean algebras enter into the study of conditional assertions (or events) in two ways: directly, either from intuitive arguments or from Goodman, Nguyen and Walker's representation theorem, as suitable mathematical entities to bear conditional probabilities, or indirectly, via a representation theorem for the family of algebras associated with de Finetti's three-valued logic of conditional assertions/events. Further representation theorems forge a connection with rough sets. The representation theorems and an equivalent of the boolean prime ideal theorem yield an algebraic completeness theorem for the three-valued logic. This in turn leads to a Henkin-style completeness theorem. Adequacy with respect to a family of Kripke models for de Finetti's logic, Łukasiewicz's three-valued logic and Priest's Logic of Paradox is demonstrated. The extension to first-order yields a short proof of adequacy for Körner's logic of inexact predicates.
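For orientation, de Finetti's three-valued evaluation of the conditional assertion b given a is standardly tabulated as follows; this is a textbook rendering, not the article's own layout.

% On the betting reading: a bet on b|a is won if a and b both hold,
% lost if a holds and b fails, and called off (void) if a fails.
\[
\begin{array}{cc|c}
a & b & b \mid a\\
\hline
T & T & T\\
T & F & F\\
F & T & \text{void}\\
F & F & \text{void}
\end{array}
\]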
Uncertainty and vagueness/imprecision are not the same: one can be certain about events described using vague predicates and about imprecisely specified events, just as one can be uncertain about precisely specified events. Exactly because of this, a question arises about how one ought to assign probabilities to imprecisely specified events in the case when no possible available evidence will eradicate the imprecision (because, say, of the limits of accuracy of a measuring device). Modelling imprecision by rough sets over an approximation space presents an especially tractable case to help get one’s bearings. Two solutions present themselves: the first takes as upper and lower probabilities of the event X the (exact) probabilities assigned to X’s upper and lower rough-set approximations; the second, motivated both by formal considerations and by a simple betting argument, is to treat X’s rough-set approximation as a conditional event and assign to it a point-valued (conditional) probability.
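The first solution can be stated compactly; the conditional-event formula on the right is our reconstruction of the second, on the natural reading that the bet is called off on the boundary region, and should be taken as illustrative only.

% \underline{X} and \overline{X} are X's lower and upper rough-set
% approximations; both are exactly specified, so they bear exact
% probabilities. The right-hand formula conditions on the bet not
% being called off, i.e. on falling outside the boundary region
% \overline{X} \setminus \underline{X}.
\[
P_*(X) = P(\underline{X}), \qquad
P^*(X) = P(\overline{X}), \qquad
P(X) \;=\; \frac{P(\underline{X})}{P(\underline{X}) + 1 - P(\overline{X})}
\]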
Wikipedia is a goldmine of information; not just for its many readers, but also for the growing community of researchers who recognize it as a resource of exceptional scale and utility. It represents a vast investment of manual effort and judgment: a huge, constantly evolving tapestry of concepts and relations that is being applied to a host of tasks. This article provides a comprehensive description of this work. It focuses on research that extracts and makes use of the concepts, relations, facts and descriptions found in Wikipedia, and organizes the work into four broad categories: applying Wikipedia to natural language processing; using it to facilitate information retrieval and information extraction; and as a resource for ontology building. The article addresses how Wikipedia is being used as is, how it is being improved and adapted, and how it is being combined with other structures to create entirely new resources. We identify the research groups and individuals involved, and how their work has developed in the last few years. We provide a comprehensive list of the open-source software they have produced.
This paper responds to Rancière’s reading of Lyotard’s analysis of the sublime by attempting to articulate what Lyotard would call a “differend” between the two. Sketching out Rancière’s criticisms, I show that Lyotard’s analysis of the Kantian sublime is more defensible than Rancière claims. I then provide an alternative reading, one that frees Lyotard’s sublime from Rancière’s central accusation that it signals nothing more than the mind’s perpetual enslavement to the law of the Other. Reading the sublime through the figure of the “event,” I end by suggesting that it may have certain affinities with what Rancière calls “politics.”
A conception of probability as an irreducible feature of the physical world is outlined. Propensity analyses of probability are examined and rejected as both formally and conceptually inadequate. It is argued that probability is a non-dispositional property of trial-types; probabilities are attributed to outcomes as event-types. Brier's Rule in an objectivist guise is used to forge a connection between physical and subjective probabilities. In the light of this connection there are grounds for supposing physical probability to obey some standard set of axioms. However, there is no a priori reason why this should be the case.
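The rule invoked is standardly the quadratic (Brier) scoring rule; the statement below is the familiar textbook one, given only for orientation.

% For a forecast c in [0,1] of an event E with outcome indicator
% \omega(E) in {0,1}, the penalty is
\[
B(c, E) \;=\; \bigl(c - \omega(E)\bigr)^2 .
\]
% Under physical probability p, the expected penalty
% p(1-c)^2 + (1-p)c^2 is uniquely minimized at c = p, which is how
% the rule can connect physical and subjective probabilities.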
These notes don’t reach any conclusions. Their purpose is to point to issues one needs to think through seriously when thinking about logic teaching. They indicate some of the relevant literature where some of these issues are addressed, but they also raise points that seem to have been overlooked. They aim to promote informed discussion. That indeed was their origin: they are descended from an internal discussion document prepared a few years ago when the then Department of Philosophy at the University of Edinburgh was reviewing its logic teaching.