Adapted from talks at the UCLA Logic Center and the Pitt Philosophy of Science Series. Exposition of material from Fixing Frege, Chapter 2 (on predicative versions of Frege’s system) and from “Protocol Sentences for Lite Logicism” (on a form of mathematical instrumentalism), suggesting a connection. Provisional version: references remain to be added. To appear in Mathematics, Modality, and Models: Selected Philosophical Papers, coming from Cambridge University Press.
Fixing Frege is one of the most important investigations to date of Fregean approaches to the foundations of mathematics. In addition to providing an unrivalled survey of the technical program to which Frege’s writings have given rise, the book makes a large number of improvements and clarifications. Anyone with an interest in the philosophy of mathematics will enjoy and benefit from the careful and well-informed overview provided by the first of its three chapters. Specialists will find the book an indispensable reference and an invaluable source of insights and new results. Although Frege is widely regarded as the father of analytic philosophy, his work on the foundations of mathematics was for a long time rather peripheral to the ongoing research. The main reason for this is no doubt Russell’s discovery in 1901 that the paradox now bearing his name can be derived in Frege’s logical system. But recent decades have seen a huge surge of interest in Fregean approaches to the foundations of mathematics. (The work of George Boolos, Kit Fine, Bob Hale, Richard Heck, Stewart Shapiro, and Crispin Wright is singled out for particular attention in the present monograph.) A variety of consistent theories have been discovered that can be salvaged from Frege’s inconsistent system, and foundational and philosophical claims have been made on behalf of many of these theories. Burgess claims quite plausibly that the significance of any such modified Fregean theory will in large part depend on how much of ordinary mathematics it enables us to develop.
The discovery of the note cards for Quine’s previously unpublished 1946 lecture on nominalism provides an obvious occasion for commenting on the differences between the issue of nominalism as Quine first publicized it to a wide philosophical audience and the issue of nominalism as debated among Quine’s successors today. Yet as I read and reread the text of Quine’s lecture, I found myself struck less by the differences between Quine’s position there and the positions of present-day writers than by differences between Quine’s position there and the positions of Quine himself in later writings — and not his writings from many years later but his writings from the next few years, and especially one of his writings from the very next year, his notorious joint paper with Goodman.
This long-awaited volume is a must-read for anyone with a serious interest in philosophy of mathematics. The book falls into two parts, with the primary focus of the first on ontology and structuralism, and the second on intuition and epistemology, though with many links between them. The style throughout involves unhurried examination from several points of view of each issue addressed, before reaching a guarded conclusion. A wealth of material is set before the reader along the way, but a reviewer wishing to summarize the author’s views crisply will be frustrated. The chapter-by-chapter survey below conveys at best a very incomplete and imperfect impression of the work’s virtues, and even of its contents, falling short even of supplying a full menu for the banquet of food for thought that Parsons serves up to his readers.
A revision of a sermon on the evils of calling model theory “semantics”, preached at Notre Dame on Saint Patrick’s Day, 2005. Provisional version: references remain to be added. To appear in Mathematics, Modality, and Models: Selected Philosophical Papers, coming from Cambridge University Press.
What is the simplest and most natural axiomatic replacement for the set-theoretic definition of the minimal fixed point on the Kleene scheme in Kripke’s theory of truth? What is the simplest and most natural set of axioms and rules for truth whose adoption by a subject who had never heard the word "true" before would give that subject an understanding of truth for which the minimal fixed point on the Kleene scheme would be a good model? Several axiomatic systems, old and new, are examined and evaluated as candidate answers to these questions, with results of Harvey Friedman playing a significant role in the examination.
One textbook may introduce the real numbers in Cantor’s way, and another in Dedekind’s, and the mathematical community as a whole will be completely indifferent to the choice between the two. This sort of phenomenon was famously called to the attention of philosophers by Paul Benacerraf. It will be argued that structuralism in philosophy of mathematics is a mistake, a generalization of Benacerraf’s observation in the wrong direction, resulting from philosophers’ preoccupation with ontology.
In this era when results of empirical scientific research are being appealed to all across philosophy, when we even find moral philosophers invoking the results of brain scans, many profess to practice "naturalized epistemology," or to be "epistemological naturalists." Such phrases derive from the title of a well-known essay by Quine, but Paul Gregory's thesis in the work under review is that there is less connection than is usually assumed between Quine's variety of naturalized epistemology and what is today taken, by opponents and proponents alike, to constitute epistemological naturalism. To put it bluntly, as Gregory does in the opening sentence of his introduction, Quine "has not been well understood." If there is less connection between the Quinian and other epistemological naturalisms than there has often been taken to be, on Gregory's account there is also much more connection between Quine's position on epistemology and his positions on other contentious issues.
It is shown that for invariance under the action of special groups the statements "Every invariant PCA set is decomposable into ℵ1 invariant Borel sets" and "Every pair of invariant PCA sets is reducible by a pair of invariant PCA sets" are independent of the axioms of set theory.
EEG hyperscanning is a method for studying two or more individuals simultaneously with the objective of elucidating how co-variations in their neural activity (i.e. hyperconnectivity) are influenced by their behavioural and social interactions. The aim of this study was to compare the performance of different hyperconnectivity measures using (i) simulated data, where the degree of coupling could be systematically manipulated, and (ii) individually recorded human EEG combined into pseudo-pairs of participants where no hyper-connections could exist. With simulated data we found that each of the most widely used measures of hyperconnectivity was biased and detected hyper-connections where none existed. With pseudo-pairs of human data we found spurious hyper-connections that arose because there were genuine similarities between the EEG recorded from different people independently but under the same experimental conditions. Specifically, there were systematic differences between experimental conditions in terms of the rhythmicity of the EEG that were common across participants. As any imbalance between experimental conditions in terms of stimulus presentation or movement may affect the rhythmicity of the EEG, this problem could apply in many hyperscanning contexts. Furthermore, as these spurious hyper-connections reflected real similarities between the EEGs, they were not Type-1 errors that could be overcome by some appropriate statistical control. However, some measures that have not previously been used in hyperconnectivity studies, notably the circular correlation coefficient, were less susceptible to detecting spurious hyper-connections of this type. The reason for this advantage in performance is discussed and the use of the circular correlation coefficient as an alternative measure of hyperconnectivity is advocated.
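The circular correlation coefficient mentioned in the abstract above has a standard closed form for pairs of phase-angle series (Jammalamadaka and SenGupta's definition). The following is a minimal illustrative sketch, not code from the study itself; the function name and example data are mine:

```python
import numpy as np

def circ_corr(alpha, beta):
    # Circular correlation between two series of phase angles (radians),
    # e.g. instantaneous EEG phases from two participants.
    alpha, beta = np.asarray(alpha), np.asarray(beta)
    # Circular means, computed via the angle of the mean resultant vector.
    a_bar = np.angle(np.mean(np.exp(1j * alpha)))
    b_bar = np.angle(np.mean(np.exp(1j * beta)))
    # Correlate the sine deviations from the circular means.
    sa, sb = np.sin(alpha - a_bar), np.sin(beta - b_bar)
    return np.sum(sa * sb) / np.sqrt(np.sum(sa**2) * np.sum(sb**2))

phases_a = np.array([0.1, 0.5, 1.2, 2.0, -1.0])
phases_b = np.array([0.2, 0.4, 1.5, 1.8, -0.9])
r = circ_corr(phases_a, phases_b)  # a value in [-1, 1]
```

A perfectly coupled pair (identical phase series) yields a coefficient of 1; the measure's value lies in [-1, 1] like an ordinary correlation.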
Metalinguistic descriptivism is the view that proper names are semantically equivalent to descriptions featuring their own quotations (e.g., ‘Socrates’ means ‘the bearer of “Socrates”’). The present paper shows that Millians can actually accept an inferential version of this equivalence thesis without running afoul of the modal argument. Indeed, they should: for it preserves the explanatory virtues of more familiar forms of descriptivism while avoiding objections (old and new) to Kent Bach's nominal description theory. We can make significant progress on Frege's puzzle and Plato's beard without committing ourselves one way or the other on the semantic values of proper names. The view on offer can also be motivated by analogy with Tarski's schema T, inviting the idea that the equivalence between a name and the associated nominal description has more to do with the semantics of representational locutions than it does with names per se. My response to the modal argument exploits the Kripkean distinction between reference at a world and reference in a world, and can be accepted by metalinguistic descriptivists and Millians alike.
Which concepts should we use to think and talk about the world and to do all of the other things that mental and linguistic representation facilitates? This is the guiding question of the field that we call ‘conceptual ethics’. Conceptual ethics is not often discussed as its own systematic branch of normative theory. A case can nevertheless be made that the field is already quite active, with contributions coming in from areas as diverse as fundamental metaphysics and social/political philosophy. In this pair of papers, we try to unify the field, reflecting on its basic nature, structure, and methodology.
The source, status, and significance of the derivation of the necessity of identity at the beginning of Kripke’s lecture “Identity and Necessity” are discussed from a logical, philosophical, and historical point of view.
The eight pieces constituting this Meeting Report are summaries of presentations made during a panel session at the 2011 Association for Practical and Professional Ethics (APPE) annual meeting held between March 3rd and 6th in Cincinnati. Lisa Newton organized the session and served as chair. The panel of eight consisted of both pioneers in the field and more recent arrivals. It covered a range of topics, from how the field has developed to where it should be going, from identification of issues needing further study to problems of training the next generation of engineers and engineering-ethics scholars.
The contemporary American political landscape is littered with talk of apology. Throughout the 2012 presidential campaign, both camps sparred over when, why, and to whom apologies should be made. The most striking clash occurred in July 2012. The Obama camp ran a series of campaign advertisements alleging that the then presumptive Republican nominee had in fact remained at Bain Capital in a leadership role longer than he had claimed, bolstering their characterization of Romney as a businessman whose business was not good for America. When Romney’s aide failed to quiet the critique by claiming that the candidate had “retired retroactively” (DeLong 2012), Romney himself took to the airwaves to speak to the…
Public participation is increasingly an aspect of policy development in many areas, and the governance of biomedical research is no exception. There are good reasons for this: biomedical research relies on public funding; it relies on biological samples and information from large numbers of patients and healthy individuals; and the outcomes of biomedical research are dramatically and irrevocably changing our society. There is thus arguably a democratic imperative for including public values in strategic decisions about the governance of biomedical research. However, it is not immediately clear how this might best be achieved. While different approaches have been proposed and trialled, we focus here on the use of public deliberation as a mechanism to develop input for policy on biomedical research. We begin by explaining the rationale for conducting public deliberation in biomedical research. We focus, in particular, on the ELS (ethical, legal, social) aspects of human tissue biobanking. The last few years have seen the development of methods for conducting public deliberation on these issues in several jurisdictions, for the purpose of incorporating lay public voices in biobanking policy. We explain the theoretical foundation underlying the notion of deliberation, and outline the main lessons and capacities that have been developed in the area of conducting public deliberation on biobanks. We next provide an analysis of the theoretical and practical challenges that we feel still need to be addressed for the use of public deliberation to guide ethical norms and governance of biomedical research. We examine the issues of: (i) linking the outcomes of deliberation to tangible action; (ii) the mandate under which a deliberation is conducted; (iii) the relative weight that should be accorded to a public deliberative forum vs other relevant voices; (iv) evaluating the quality of deliberation; and (v) the problem of scalability of minipublics.
There are statements of the form “There are no Fs” that we would like to count as true, yet it is hard to see how they could be true (at least, operating within the semantic framework of structured propositions). The relevant Fs are general terms that we take to be semantically fundamental or primitive, especially those native to metaphysical discourse. A case can be made that the problem is no less difficult than the corresponding problem for singular terms.
This paper aims to make three contributions to decision theory. First, there is the hope that it will help to re-establish the legitimacy of the problem, pace various recent analyses provided by Maitzen and Wilson, Slezak, and Priest. Second, after pointing out that analyses of the problem have generally relied upon evidence that is conditional on the taking of one particular option, this paper argues that certain assumptions implicit in those analyses are subtly flawed. As a third contribution, the piece aims to draw attention to an important similarity between Newcomb’s problem and the toxin puzzle. In short, both problems illustrate the fact that you can have a reason to intend to φ without having a reason to actually φ.
The "default mode network" is commonly described as a set of brain regions in which activity is suppressed during relatively demanding, or difficult, tasks. But what sort of tasks are these? We review some of the contrasting ways in which a task might be assessed as being difficult, such as error rate, response time, propensity to interfere with performance of other tasks, and requirement for transformation of internal representations versus accumulation of perceptual information. We then describe an fMRI study in which 18 participants performed two "stimulus-oriented" tasks, where responses were directly cued by visual stimuli, alongside a "stimulus-independent" task, with a greater reliance on internally-generated information. When indexed by response time and error rate, the stimulus-independent task was intermediate in difficulty between the two stimulus-oriented tasks. Nevertheless, BOLD signal in medial rostral prefrontal cortex (MPFC) - a prominent part of the default mode network - was reduced in the stimulus-independent condition in comparison with both the more difficult and the less difficult stimulus-oriented conditions. By contrast, other regions of the default mode network showed greatest deactivation in the difficult stimulus-oriented condition. There was therefore significant functional heterogeneity between different default mode regions. We conclude that task difficulty - as measured by response time and error rate - does not provide an adequate account of signal change in MPFC. At least in some circumstances, a better predictor of MPFC activity is the requirement of a task for transformation and manipulation of internally-represented information, with greatest MPFC activity in situations predominantly requiring attention to perceptual information.
Recent philosophy of language has been profoundly impacted by the idea that mainstream, model-theoretic semantics is somehow incompatible with deflationary accounts of truth and reference. The present article systematizes the case for incompatibilism, debunks circularity and “modal confusion” arguments familiar in the literature, and reconstructs the popular thought that truth-conditional semantics somehow “presupposes” a correspondence theory of truth as an inference to the best explanation. The case for compatibilism is closed by showing that this IBE argument fails to rule out two kinds of deflationism: the position Field famously accused Tarski of having; and a less familiar version of the view that defines reference in terms of a deflated notion of truth. Finally, the distinction between unifying and constitutive explanation is used to forestall the response that correspondence theory is literally part of mainstream semantics.
Saul Kripke has made fundamental contributions to a variety of areas of logic, and his name is attached to a corresponding variety of objects and results. For philosophers, by far the most important examples are ‘Kripke models’, which have been adopted as the standard type of models for modal and related non-classical logics. What follows is an elementary introduction to Kripke’s contributions in this area, intended to prepare the reader to tackle more formal treatments elsewhere. 2. WHAT IS A MODEL THEORY? Traditionally, a statement is regarded as logically valid if it is an instance of a logically valid form, where a form is regarded as logically valid if every instance is true. In modern logic, forms are represented by formulas involving letters and special symbols, and logicians seek therefore to define a notion of model and a notion of a formula’s truth in a model in such a way that every instance of a form will be true if and only if a formula representing that form is true in every model. Thus the unsurveyably vast range of instances can be replaced for purposes of logical evaluation by the range of models, which may be more tractable theoretically and perhaps practically. Consideration of the familiar case of classical sentential logic should make these ideas clear. Here a formula, say (p & q) ∨ ¬p ∨ ¬q, will be valid if for all statements P…
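The notion of validity sketched in the excerpt above - truth in every model - can be illustrated for classical sentential logic with a brute-force check over all assignments of truth values to atoms. This is an illustrative sketch of the standard idea, not code from the text; the function names are mine:

```python
from itertools import product

def is_valid(formula, atoms):
    # In classical sentential logic a model is just an assignment of
    # truth values to the atoms; a formula is valid iff it is true
    # in every such model.
    return all(formula(*vals)
               for vals in product([True, False], repeat=len(atoms)))

# The formula from the excerpt: (p & q) v ~p v ~q
f = lambda p, q: (p and q) or (not p) or (not q)
is_valid(f, ['p', 'q'])  # True: every model satisfies it
```

Since any model either makes both p and q true (verifying the first disjunct) or makes one of them false (verifying a negated disjunct), the formula is true in all four models, so every statement instance of the form is true.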
There are three general ways to approach reconciliation: from the side of nonfactualism, from the side of deflationism, or from both sides at once. To approach reconciliation from a given side, as I will use the expression, just means to attend in the first instance to the details of that side’s position. (It will be important to keep in mind that the success of an approach from one side may ultimately require concessions from the other side.) The only attempts at reconciliation in the literature of which I am aware fall into the first of these three categories. Such writers argue that the tension between our -isms can be resolved by paying sufficiently close attention to the nature of nonfactualism. While I have nothing against this approach in principle, I do have reservations about the particular proposals that have been made in its pursuit. The first section of the present paper briefly develops a line of objection against one such proposal, in order to motivate the approach to reconciliation from the side of deflationism. In section two, I argue that the deflationist can and should reject the inference from (2) to (3) above. Section three addresses a special problem of reconciliation for the nonfactualist who continues to use the discourse she takes to be factually defective. By paying close attention to the details of deflationism about reference, I show how a deflationist about truth might avoid this problem. I conclude that deflationism can be developed in a way that renders it compatible with nonfactualism.
The argument from potential has been hard to assess because the versions presented by friends and those presented by enemies have borne very little resemblance to each other. I here try to improve this situation by attempting to bring both versions into enforced contact. To this end, I sketch a more detailed analysis of the modern concept of potential than any hitherto attempted. As one would expect, arguments from potential couched in terms of that notion are evident non-starters. I then ask how the modern notion of potential needs to be supplemented in order to produce a more convincing argument. I then enquire whether the supplementations utilised in the most distinguished recent presentations of the argument have anything better than an ad hoc role to play in contemporary metaphysics. I conclude that the rehabilitation of the argument is unlikely; in any event, the onus of proof seems to be on the friend of that argument to show that it is uncontrived. Finally, I argue that the (modern) notion of potential has an important role to play in any plausible account of foetal value.
This paper re-examines the question of whether quirks of early human foetal development tell against the view (conceptionism) that we are human beings at conception. A zygote is capable of splitting to give rise to identical twins. Since the zygote cannot be identical with either human being it will become, it cannot already be a human being. Parallel concerns can be raised about chimeras in which two embryos fuse. I argue first that there are just two ways of dealing with cases of fission and fusion and both seem to be available to the conceptionist. One is the Replacement View according to which objects cease to exist when they fission or fuse. The other is the Multiple Occupancy View – both twins may be present already in the zygote and both persist in a chimera. So, is the conceptionist position tenable after all? I argue that it is not. A zygote gives rise not only to a human being but also to a placenta – it cannot already be both a human being and a placenta. Neither approach to fission and fusion can help the conceptionist with this problem. But worse is in store. Both fission and fusion can occur before and after the development of the inner cell mass of the blastocyst – the entity which becomes the embryo proper. The idea that we become human beings with the arrival of the inner cell mass leads to bizarre results however we choose to accommodate fission and fusion.
My contribution to the symposium on Goedel’s philosophy of mathematics at the spring 2006 Association for Symbolic Logic meeting in Montreal. Provisional version: references remain to be added. To appear in an ASL volume of proceedings of the Goedel sessions at that meeting.
1 Choice conjecture In axiomatizing nonclassical extensions of classical sentential logic one tries to make do, if one can, with adding to classical sentential logic a finite number of axiom schemes of the simplest kind and a finite number of inference rules of the simplest kind. The simplest kind of axiom scheme in effect states of a particular formula P that for any substitution of formulas for atoms the result of its application to P is to count as an axiom. The simplest kind of one-premise inference rule in effect states of a particular pair of formulas P and Q that for any substitution of formulas for atoms, if the result of its application to P is a theorem, then the result of its application to Q is to count as a theorem; similarly for many-premise rules. Such are the schemes and rules of all the best-known modal and tense logics, for instance. Sometimes it is difficult to find such simple schemes and rules (though it is usually even more difficult to prove that none exist). In that case one may resort to less simple schemes or less simple rules. There is no generally recognized rigorous definition of "next simplest kind" of scheme. (In the case of schemes, one fact that makes a rigorous definition difficult is that, if the logic in question is axiomatizable at all, which is to say, if the set of formulas wanted as theorems is recursively enumerable, then by Craig’s trick one can always get a primitive recursive set of schemes of the simplest kind, even if one cannot get a finite set. Intuitively, some primitive recursive sets are much simpler than others, but it is difficult to reduce this intuition to a rigorous definition.) Neither is there any generally recognized definition of "next simplest kind" of rule, and hence there is no fully rigorous enunciation of the choice conjecture, the conjecture that schemes of the next simplest kind can always be avoided in favor of rules of the next simplest kind and vice versa.
Nonetheless, there are cases where intuitively one does recognize that the schemes or rules in a given axiomatization are only slightly more complex than the simplest kind, including cases where one does have a choice between adopting slightly-more-complex-than-simplest schemes and adopting slightly-more-complex-than-simplest rules. In tense logic early examples of slightly more complex rules are found in  and : there is one example of the embarrassed use of such rules in the former, and many examples of the enthusiastic use of such rules in the latter and its sequels. Accordingly the rules in question have come to be called "Gabbay-style" rules.
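The "simplest kind" of axiom scheme described above - a fixed formula P such that every result of uniformly substituting formulas for its atoms counts as an axiom - can be made concrete with a small sketch. This is an illustration of mine, not from the text; the particular formula, substitution, and connective notation are assumed:

```python
import re

def substitute(formula, sigma):
    # Uniformly replace each atom (here limited to p, q, r for
    # simplicity) by its assigned formula, parenthesized. re.sub makes
    # a single left-to-right pass over the original string, so the
    # substituted formulas are not themselves re-substituted - which is
    # exactly what uniform substitution requires.
    return re.sub(r'\b([pqr])\b',
                  lambda m: '(' + sigma[m.group(1)] + ')',
                  formula)

P = 'p -> (q -> p)'                  # a fixed formula generating a scheme
sigma = {'p': 'r & s', 'q': '~r'}    # one substitution of formulas for atoms
substitute(P, sigma)  # '(r & s) -> ((~r) -> (r & s))'
```

The scheme generated by P is then the set of all such substitution instances; a simplest-kind one-premise rule is described analogously by a pair of formulas P, Q under the same substitutions.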