In natural deduction, classical logic is commonly formulated by adding a rule such as Double Negation Elimination (DNE) or Classical Reductio ad Absurdum (CRA) to a set of introduction and elimination rules sufficient for intuitionist first-order logic, with conjunction, disjunction, implication, negation and the universal and existential quantifiers all taken as primitive. The natural deduction formulation of intuitionist logic, coming from Gentzen, has nice properties: (i) the separation property: an intuitionistically valid inference is derivable using only the introduction and elimination rules governing the connectives and/or quantifiers that occur in the premises (if any) and conclusion; (ii) the (strict) subformula property: more narrowly, there is a derivation of any intuitionistically valid inference that employs only subformulas of the formulas occurring in premises (if any) and conclusion. (Every formula is, of course, a subformula of itself.)
The visual arts operated as a touchstone for French philosopher Jean-François Lyotard, influencing his thinking on everything from epistemology to politics. Building on the recent publication of a bilingual, six-volume edition of his writings on contemporary art and artists, this special issue of _Cultural Politics_ provides a focus on Lyotard’s aesthetics. The issue includes a review of Lyotard’s writings on art, a discussion of his early figural aesthetics, and an essay on Lyotard’s little-known work, _Pacific Wall_, as well as two essays on Lyotard and music. Two previously untranslated works by Lyotard himself are also featured: the influential article “Argumentation and Presentation: The Crisis of Foundations” and the interview “What to Paint?,” given at the time of the publication of the book of the same name. Painter Leon Phillips, whose work embodies many of the attributes of painting that were most important to Lyotard, is the featured artist for the issue. Throughout, the contributors argue for the primary importance of aesthetics in understanding Lyotard’s thought. Peter W. Milne is Assistant Professor in the Department of Aesthetics at Seoul National University.
This paper offers a critique of sustainability reporting and, in particular, a critique of the modern disconnect between the practice of sustainability reporting and what we consider to be the urgent issue of our era: sustaining the life-supporting ecological systems on which humanity and other species depend. Tracing the history of such reporting developments, we identify and isolate the concept of the ‘triple bottom line’ (TBL) as a core and dominant idea that continues to pervade business reporting, and business engagement with sustainability. Incorporating an entity’s economic, environmental and social performance indicators into its management and reporting processes, we argue, has become synonymous with corporate sustainability; in the process, concern for ecology has become sidelined. Moreover, this process has become reinforced and institutionalised through SustainAbility’s biennial benchmarking reports, KPMG’s triennial surveys of practice, initiatives by the accountancy profession and, particularly, the Global Reporting Initiative (GRI)’s sustainability reporting guidelines. We argue that the TBL and the GRI are insufficient conditions for organizations contributing to the sustaining of the Earth’s ecology. Paradoxically, they may reinforce business-as-usual and greater levels of un-sustainability.
In making assertions one takes on commitments to the consistency of what one asserts and to the logical consequences of what one asserts. Although there is no quick link between belief and assertion, the dialectical requirements on assertion feed back into normative constraints on those beliefs that constitute one's evidence. But if we are not certain of many of our beliefs and that uncertainty is modelled in terms of probabilities, then there is at least prima facie incoherence between the normative constraints on belief and the probability-like structure of degrees of belief. I suggest that the norm-governed practice relating to degrees of belief is the evaluation of betting odds.
In this paper, we identify and discuss how sustainability reporting has spread throughout the Australian business community over the past twenty years or so. We identified all Australian business organisations that have produced a sustainability report since 1995, and we undertook an interview survey with managers of reporting companies. By incorporating a wide range and large number of reporting companies, we offer insights beyond those obtained from traditional report content analysis and from close analyses of singular case-study organisations. We reveal that sustainability reporting has deepened in a few high-impact industries, and it has spread to a small number of firms in a wide range of low-impact industries. We tested whether there were any relationships between the drivers of reporting and the experiences of different types of reporting firms. Many of the relationships we observed were not as clear or as consistent as expected. Sustainability reporting is, however, of strategic importance to reporting companies. Given the small number of reporters in Australia, we raise the possibility of strategic differentiation as a key driver of reporting behaviour and suggest further studies to explore institutional fields that may be shaping sustainability reporting practice.
While there is now considerable experimental evidence that, on the one hand, participants assign to the indicative conditional as probability the conditional probability of consequent given antecedent and, on the other, they assign to the indicative conditional the ‘defective truth-table’ in which a conditional with false antecedent is deemed neither true nor false, these findings do not in themselves establish which multi-premise inferences involving conditionals participants endorse. A natural extension of the truth-table semantics pronounces as valid numerous inference patterns that do seem to be part of ordinary usage. However, coupled with something the probability account gives us, namely that when conditional-free φ entails conditional-free ψ, ‘if φ then ψ’ is a trivial, uninformative truth, we have enough logic to derive the paradoxes of material implication. It thus becomes a matter of some urgency to determine which inference patterns involving indicative conditionals participants do endorse. Only thus will we be able to arrive at a realistic, systematic semantics for the indicative conditional.
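The two experimental findings summarised above fit together in a way a small simulation can illustrate: under the defective truth-table, the proportion of cases in which the conditional counts as true, among the cases where it has a truth value at all, approximates the conditional probability of consequent given antecedent. The sketch below is a toy model with invented numbers, not the experimental materials the abstract reports:

```python
import random

def defective_conditional(a, b):
    """Defective truth-table: a truth value only when the antecedent
    holds; with a false antecedent the conditional is neither true
    nor false (returned as None)."""
    if not a:
        return None   # false antecedent: no truth value
    return b          # true antecedent: value of the consequent

# Toy joint distribution over (A, B): illustrative parameters only.
random.seed(0)
p_a, p_b_given_a = 0.4, 0.7
samples = []
for _ in range(100_000):
    a = random.random() < p_a
    b = random.random() < (p_b_given_a if a else 0.5)
    samples.append(defective_conditional(a, b))

defined = [v for v in samples if v is not None]
# Among cases where the conditional has a truth value at all, the
# proportion of 'true' approximates P(B | A) = 0.7.
print(round(sum(defined) / len(defined), 2))
```

The point of the sketch is only that the defective table and the conditional-probability assignment agree on this frequency reading; as the abstract stresses, neither finding settles which multi-premise inferences participants endorse.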
First paragraph: Truthmaker theory maintains that for every truth there is something, some thing, some entity, that makes it true. Balking at the prospect that logical truths are made true by any particular thing, a consequence that may in fact be hard to avoid (see Restall 1996, Read 2000), this principle of truthmaking is sometimes restricted to (logically) contingent truths. I aim to show that even in its restricted form, the principle is provably false.
Starting from John MacFarlane's recent survey of answers to the question ‘What is assertion?’, I defend an account of assertion that draws on elements of MacFarlane's and Robert Brandom's commitment accounts, Timothy Williamson's knowledge norm account, and my own previous work on the normative status of logic. I defend the knowledge norm from recent attacks. Indicative conditionals, however, pose a problem when read along the lines of Ernest Adams' account, an account supported by much work in the psychology of reasoning. Furthermore, there seems to be no place for degrees of belief in the accounts of belief and assertion given here. Degrees of belief do have a role in decision‐making, but, again, there is much evidence that the orthodox theory of subjective utility maximization is not a good description of what we do in decision‐making and, arguably, neither is it a good normative guide to how we ought to make decisions.
This article begins by outlining some of the history—beginning with brief remarks of Quine's—of work on conditional assertions and conditional events. The upshot of the historical narrative is that diverse works from various starting points have circled around a nexus of ideas without convincingly tying them together. Section 3 shows how ideas contained in a neglected article of de Finetti's lead to a unified treatment of the topics based on the identification of conditional events as the objects of conditional bets. The penultimate section explores some of the consequences of the resulting logic of conditional events while the last defends it.
According to the axiologist, the value concepts are basic and the deontic concepts are derivative. This paper addresses two fundamental problems that arise for the axiologist. First, what ought the axiologist to understand by the value of an act? Second, what are the prospects in principle for an axiological representation of moral theories? Can the deontic concepts of any coherent moral theory be represented by an agent-neutral axiology, (1) whatever structure those concepts have and (2) whatever the causal structure of the world happens to be? We show that the answer is "almost always". The only substantive constraint is that autonomous moral agents cannot have the power to simultaneously block the options open to other autonomous moral agents. But this seems to be part and parcel of the notion of an autonomous moral agent.
A proof employing no semantic terms is offered in support of the claim that there can be truths without truthmakers. The logical resources used in the proof are weak but do include the structural rule Contraction.
Proofs of Gödel's First Incompleteness Theorem are often accompanied by claims such as that the Gödel sentence constructed in the course of the proof says of itself that it is unprovable and that it is true. The validity of such claims depends closely on how the sentence is constructed. Only by tightly constraining the means of construction can one obtain Gödel sentences of which it is correct, without further ado, to say that they say of themselves that they are unprovable and that they are true; otherwise a false theory can yield false Gödel sentences.
This article begins by exploring a lost topic in the philosophy of science: the properties of the relations ‘evidence confirming h confirms h′’ and, more generally, ‘evidence confirming each of h1, h2, …, hm confirms at least one of h1, h2, …, hn’. The Bayesian understanding of confirmation as positive evidential relevance is employed throughout. The resulting formal system is, to say the least, oddly behaved. Some aspects of this odd behaviour the system has in common with some of the non-classical logics developed in the twentieth century. One aspect, its ‘parasitism’ on classical logic, it does not, and it is this feature that makes the system an interesting focus for discussion of questions in the philosophy of logic. We gain some purchase on an answer to a recently prominent question, namely, what is a logical system? More exactly, we ask whether satisfaction of formal constraints is sufficient for a relation to be considered a (logical) consequence relation. The question whether confirmation transfer yields a logical system is answered in the negative, despite confirmation transfer having the standard properties of a consequence relation, on the grounds that validity of sequents in the system is not determined by the meanings of the connectives that occur in formulas. Developing the system in a different direction, we find it bears on the project of ‘proof-theoretic semantics’: conferring meaning on connectives by means of introduction (and possibly elimination) rules is not an autonomous activity; rather, it presupposes a prior, non-formal notion of consequence. Some historical ramifications are also addressed briefly.
The thesis that, in a system of natural deduction, the meaning of a logical constant is given by some or all of its introduction and elimination rules has been developed recently in the work of Dummett, Prawitz, Tennant, and others, by the addition of harmony constraints. Introduction and elimination rules for a logical constant must be in harmony. By deploying harmony constraints, these authors have arrived at logics no stronger than intuitionist propositional logic. Classical logic, they maintain, cannot be justified from this proof-theoretic perspective. This paper argues that, while classical logic can be formulated so as to satisfy a number of harmony constraints, the meanings of the standard logical constants cannot all be given by their introduction and/or elimination rules; negation, in particular, comes under close scrutiny.
Successful technologies’ ubiquity changes uses, users and ethicolegal responsibilities and duties of care. We focus on dementia to review critically ethicolegal implications of increasing use of social networking sites (SNS) by those with compromised decision-making capacity, assessing concerned parties’ responsibilities. Although SNS contracts assume ongoing decision-making capacity, many users’ capacity may be compromised or declining. Resulting ethicolegal issues include capacity to give informed consent to contracts, protection of online privacy including sharing and controlling data, data leaks between different digital platforms, and management of digital identities and footprints. SNS uses in healthcare raise additional issues. Online materials acting as archives of ‘the self’ bolster present and future identities for users with compromised capacity. E-health involves actual and potential intersection of data gathered for the purpose of delivering health technological support with data used for social networking purposes. Ethicolegal guidance is limited on the implications of SNS usage in contexts where users have impaired/reduced capacity to understand and/or consent to sharing personal data about their health, medication or location. Vulnerable adults and family/carers face uncertainty in regard to consent, data protection, online identity and legal liabilities. Ethicolegal responsibilities and duties of care of technology providers, healthcare professionals, regulatory bodies and policymakers need clarification.
This paper is the report of a meeting that gathered many of the UK's most senior animal scientists with representatives of the farming industry, consumer groups, animal welfare groups, and environmentalists. There was strong consensus that the current economic structure of agriculture cannot adequately address major issues of concern to society: farm incomes, food security and safety, the needs of developing countries, animal welfare, and the environment. This economic structure is based primarily on competition between producers and between retailers, driving food prices down, combined with externalization of many costs. These issues must be addressed by a combination of legislation, restructuring of the market, and use of public funds. The meeting included workshops that made other recommendations for research and education. The most urgent requirement is recognition that change is needed and development of a vision for what that change must achieve.
Some propositions add more information to bodies of propositions than do others. We start with intuitive considerations on qualitative comparisons of information added. Central to these are considerations bearing on conjunctions and on negations. We find that we can discern two distinct, incompatible, notions of information added. From the comparative notions we pass to quantitative measurement of information added. In this we borrow heavily from the literature on quantitative representations of qualitative, comparative conditional probability. We look at two ways to obtain a quantitative conception of information added. One, the most direct, mirrors Bernard Koopman’s construction of conditional probability: by making a strong structural assumption, it leads to a measure that is, transparently, some function of a function P which is, formally, an assignment of conditional probability (in fact, a Popper function). P reverses the information added order and mislocates the natural zero of the scale so some transformation of this scale is needed but the derivation of P falls out so readily that no particular transformation suggests itself. The Cox–Good–Aczél method assumes the existence of a quantitative measure matching the qualitative relation, and builds on the structural constraints to obtain a measure of information that can be rescaled as, formally, an assignment of conditional probability. A classical result of Cantor’s, subsequently strengthened by Debreu, goes some way towards justifying the assumption of the existence of a quantitative scale. What the two approaches give us is a pointer towards a novel interpretation of probability as a rescaling of a measure of information added.
As Wilfrid Hodges has observed, there is no mention of the notion truth-in-a-model in Tarski's article 'The Concept of Truth in Formalized Languages'; nor does truth make many appearances in his papers on model theory from the early 1950s. In later papers from the same decade, however, this reticence is cast aside. Why should Tarski, who defined truth for formalized languages and pretty much founded model theory, have been so reluctant to speak of truth in a model? What might explain the change in his practice? The answers, I believe, lie in Tarski's views on truth simpliciter.
From the point of view of proof-theoretic semantics, we examine the logical background invoked by Neil Tennant's abstractionist realist account of mathematical existence. To prepare the way, we must first look closely at the rule of existential elimination familiar from classical and intuitionist logics and at rules governing identity. We then examine how well free logics meet the harmony and uniqueness constraints familiar from the proof-theoretic semantics project. Tennant assigns a special role to atomic formulas containing singular terms. This, we find, secures harmony and uniqueness but militates against the putative realism.
Uncertainty and vagueness/imprecision are not the same: one can be certain about events described using vague predicates and about imprecisely specified events, just as one can be uncertain about precisely specified events. Exactly because of this, a question arises about how one ought to assign probabilities to imprecisely specified events in the case when no possible available evidence will eradicate the imprecision (because, say, of the limits of accuracy of a measuring device). Modelling imprecision by rough sets over an approximation space presents an especially tractable case to help get one’s bearings. Two solutions present themselves: the first takes as upper and lower probabilities of the event X the (exact) probabilities assigned X’s upper and lower rough-set approximations; the second, motivated both by formal considerations and by a simple betting argument, is to treat X’s rough-set approximation as a conditional event and assign to it a point-valued (conditional) probability.
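The two solutions can be sketched over a toy approximation space. Everything below is invented for illustration (the universe, partition and weights are not from the paper), and the point-valued formula for the second solution is one natural reading of "treat X's rough-set approximation as a conditional event": condition on the boundary region not occurring. The paper's own derivation may differ in detail.

```python
from itertools import chain

# Approximation space: a universe U partitioned into equivalence
# blocks (the resolution of our 'measuring device'), with a
# probability weight on each point. Illustrative numbers only.
U = {1, 2, 3, 4, 5, 6}
blocks = [{1, 2}, {3, 4}, {5, 6}]
p = {1: 0.1, 2: 0.1, 3: 0.2, 4: 0.2, 5: 0.2, 6: 0.2}

def lower(X):
    """Lower approximation: union of blocks wholly inside X."""
    return set(chain.from_iterable(b for b in blocks if b <= X))

def upper(X):
    """Upper approximation: union of blocks meeting X."""
    return set(chain.from_iterable(b for b in blocks if b & X))

def prob(S):
    return sum(p[x] for x in S)

# An imprecisely specified event, seen only at block resolution.
X = {1, 2, 3}

# First solution: lower and upper probabilities of X are the exact
# probabilities of its rough-set approximations.
print(round(prob(lower(X)), 2), round(prob(upper(X)), 2))

# Second solution (hypothetical reading): treat the approximation
# pair as a conditional event, conditioning on the boundary region
# upper(X) - lower(X) not occurring, to get a single point value.
boundary = upper(X) - lower(X)
point = prob(lower(X)) / (1 - prob(boundary))
print(round(point, 3))
```

On these numbers the first solution gives the interval [0.2, 0.6], while the conditional-event reading collapses it to a single value strictly inside that interval.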
Intervals in boolean algebras enter into the study of conditional assertions (or events) in two ways: directly, either from intuitive arguments or from Goodman, Nguyen and Walker's representation theorem, as suitable mathematical entities to bear conditional probabilities, or indirectly, via a representation theorem for the family of algebras associated with de Finetti's three-valued logic of conditional assertions/events. Further representation theorems forge a connection with rough sets. The representation theorems and an equivalent of the boolean prime ideal theorem yield an algebraic completeness theorem for the three-valued logic. This in turn leads to a Henkin-style completeness theorem. Adequacy with respect to a family of Kripke models for de Finetti's logic, Łukasiewicz's three-valued logic and Priest's Logic of Paradox is demonstrated. The extension to first-order yields a short proof of adequacy for Körner's logic of inexact predicates.
Consistent application of coherence arguments shows that fair betting quotients are subject to constraints that are too stringent to allow their identification with either degrees of belief or probabilities. The pivotal role of fair betting quotients in the Dutch Book Argument, which is said to demonstrate that a rational agent's degrees of belief are probabilities, is thus undermined from both sides.
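The Dutch Book construction at issue can be sketched in its standard textbook form (illustrative stakes and quotients; this is the familiar argument the abstract criticises, not the paper's own reasoning): an agent whose betting quotients on an event and its complement sum to more than 1 can be booked into a guaranteed loss.

```python
def bet_payoff(quotient, stake, occurs):
    """Net payoff, to the bettor, of paying quotient * stake for a
    bet that returns `stake` if the event occurs and nothing otherwise."""
    return (stake if occurs else 0.0) - quotient * stake

# Incoherent betting quotients: q(E) + q(not-E) = 1.2 > 1.
q_e, q_not_e = 0.6, 0.6
stake = 10.0

# Whatever happens, the pair of bets loses the bettor money:
for e_occurs in (True, False):
    total = (bet_payoff(q_e, stake, e_occurs)
             + bet_payoff(q_not_e, stake, not e_occurs))
    print(total)  # -2.0 in both cases: a sure loss
```

Here the sure loss is stake × (q(E) + q(not-E) − 1) = 10 × 0.2 = 2, which is why the classical argument concludes that fair betting quotients must obey the probability axioms; the abstract's claim is that the constraints on fair betting quotients are in fact stronger still.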
George Schlesinger has characterized justified belief probabilistically. I question the propriety of this characterization and demonstrate that with respect to it certain principles of epistemic logic that he considers plausible are unsound.
Taking as starting point two familiar interpretations of probability, we develop these in a perhaps unfamiliar way to arrive ultimately at an improbable claim concerning the proper axiomatization of probability theory: the domain of definition of a point-valued probability distribution is an orthomodular partially ordered set. Similar claims have been made in the light of quantum mechanics but here the motivation is intrinsically probabilistic. This being so the main task is to investigate what light, if any, this sheds on quantum mechanics. In particular it is important to know under what conditions these point-valued distributions can be thought of as derived from distribution-pairs of upper and lower probabilities on boolean algebras. Generalising known results this investigation unsurprisingly proves unrewarding. In the light of this failure the next topic investigated is how these generalized probability distributions are to be interpreted.
Of his numerous investigations ... Tarski was most proud of two: his work on truth and his design of an algorithm in 1930 to decide the truth or falsity of any sentence of the elementary theory of the high school Euclidean geometry. [...] His mathematical treatment of the semantics of languages and the concept of truth has had revolutionary consequences for mathematics, linguistics, and philosophy, and Tarski is widely thought of as the man who "defined truth". The seeming simplicity of his famous example that the sentence "Snow is white" is true just in case snow is white belies the depth and complexity of the consequences which can be drawn from the possibility of giving a general treatment of the concept of truth in formal mathematical languages in a rigorous mathematical way. (J.W. Addison).
This paper responds to Rancière’s reading of Lyotard’s analysis of the sublime by attempting to articulate what Lyotard would call a “differend” between the two. Sketching out Rancière’s criticisms, I show that Lyotard’s analysis of the Kantian sublime is more defensible than Rancière claims. I then provide an alternative reading, one that frees Lyotard’s sublime from Rancière’s central accusation that it signals nothing more than the mind’s perpetual enslavement to the law of the Other. Reading the sublime through the figure of the “event,” I end by suggesting that it may have certain affinities with what Rancière calls “politics.”