The goal of this paper is to make sense of relativism about truth. There are two key ideas. (1) To be a relativist about truth is to allow that a sentence or proposition might be assessment-sensitive: that is, its truth value might vary with the context of assessment as well as the context of use. (2) Making sense of relativism is a matter of understanding what it would be to commit oneself to the truth of an assessment-sensitive sentence or proposition.
Helen Macfarlane, revolutionary social critic, feminist, and Hegelian philosopher, was the first English translator of Karl Marx and Friedrich Engels's The Communist Manifesto. Her original translation is included in this edition. Marx publicly admired her as a rare and original thinker and journalist. This book recreates her intellectual and political world at a key turning point in European history.
In Confusion: A Study in the Theory of Knowledge, Joseph Camp argues that the reasoning of a person who has confused two objects in her thought and talk ought to be appraised using a four-valued relevance logic. I discuss two key moves in Camp’s argument: the assumption that charity to the reasoner requires recognition of her arguments as valid, and the argument that validity for a truth-valueless discourse should not be defined in terms of truth preservation. I then question whether Camp’s four-valued semantics satisfies his own desiderata for a logic of confusion.
The Beatles and McLuhan: Understanding the Electric Age examines how the incorporation of electric technology into The Beatles’ art enhanced their musical impact. MacFarlane surveys the relationship between McLuhan's ideas on the nature and effects of electric technology and The Beatles' own engagement with that technology; offers analyses of key works from The Beatles' studio years; and collates these data to offer stunning conclusions about The Beatles’ creative process in the recording studio and its cultural implications.
In ‘Epistemic Modals’ (2007), Seth Yalcin proposes a Stalnaker-style semantics for epistemic possibility. He is inspired by John MacFarlane’s ingenious defence of relativism, in which claims of epistemic possibility are made rigidly from the perspective of the assessor’s actual stock of information (rather than from the speaker’s knowledge base or that of his audience or community). The innovations of MacFarlane and Yalcin independently reinforce the modal collapse espoused by Jaakko Hintikka in his 1962 epistemic logic (which relied on the implausible KK principle and heavy idealizations). I respond to this new challenge with fresh objections to the underlying S4 equivalence: ◊p ↔ ◊◊p. I also propose counter-analyses of the intriguing data which Yalcin cites in support of his new semantics. A key collateral motivation for this defence of irredundant iterations is to ward off a threat to higher order vagueness.
In this paper, I shall discuss several topics related to Frege’s paradigms of second-order abstraction principles and his logicism. The discussion includes a critical examination of some controversial views put forward mainly by Robin Jeshion, Tyler Burge, Crispin Wright, Richard Heck and John MacFarlane. In the introductory section, I try to shed light on the connection between logical abstraction and logical objects. The second section contains a critical appraisal of Frege’s notion of evidence and its interpretation by Jeshion, the introduction of the course-of-values operator and Frege’s attitude towards Axiom V, in the expression of which this operator occurs as the key primitive term. Axiom V says that the course-of-values of the function f is identical with the course-of-values of the function g if and only if f and g are coextensional. In the third section, I intend to show that in Die Grundlagen der Arithmetik (1884) Frege hardly could have construed Hume’s Principle (HP) as a primitive truth of logic and used it as an axiom governing the cardinality operator as a primitive sign. HP expresses that the number of Fs is identical with the number of Gs if and only if F and G are equinumerous. In the fourth section, I argue that Wright falls short of making a convincing case for the alleged analyticity of HP. In the final section, I canvass Heck’s arguments for his contention that Frege knew he could deduce the simplest laws of arithmetic from HP without invoking Axiom V. I argue that they do not carry conviction. I conclude this section by rejecting an interpretation concerning HP suggested by MacFarlane.
Philosophy of language is the branch of philosophy that examines the nature of meaning, the relationship of language to reality, and the ways in which we use, learn, and understand language. The Routledge Companion to Philosophy of Language provides a comprehensive and up-to-date survey of the field, charting its key ideas and movements, and addressing contemporary research and enduring questions in the philosophy of language. Unique to this Companion is clear coverage of research from the related disciplines of formal logic and linguistics, and discussion of the applications in metaphysics, epistemology, ethics and philosophy of mind. Organized thematically, the Companion is divided into seven sections: Core Topics; Foundations of Semantics; Parts of Speech; Methodology; Logic for Philosophers of Language; Philosophy of Language for the Rest of Philosophy; and Historical Perspectives. Comprised of 70 never-before-published essays from leading scholars—including Sally Haslanger, Jeffrey King, Sally McConnell-Ginet, Rae Langton, Kit Fine, John MacFarlane, Jeff Pelletier, Scott Soames, Jason Stanley, Stephen Stich and Zoltan Gendler Szabo—the Routledge Companion to Philosophy of Language promises to be the most comprehensive and authoritative resource for students and scholars alike.
John MacFarlane explores how we might make sense of the idea that truth is relative. He provides new, satisfying accounts of parts of our thought and talk that have resisted traditional methods of analysis, including what we mean when we talk about what is tasty, what we know, what will happen, what might be the case, and what we ought to do.
Let me start with a well-known story. Kant held that logic and conceptual analysis alone cannot account for our knowledge of arithmetic: “however we might turn and twist our concepts, we could never, by the mere analysis of them, and without the aid of intuition, discover what is the sum [7+5]” (KrV, B16). Frege took himself to have shown that Kant was wrong about this. According to Frege’s logicist thesis, every arithmetical concept can be defined in purely logical terms, and every theorem of arithmetic can be proved using only the basic laws of logic. Hence, Kant was wrong to think that our grasp of arithmetical concepts and our knowledge of arithmetical truth depend on an extralogical source—the pure intuition of time (Frege 1884, §89, §109). Arithmetic, properly understood, is just a part of logic.
By “epistemic modals,” I mean epistemic uses of modal words: adverbs like “necessarily,” “possibly,” and “probably,” adjectives like “necessary,” “possible,” and “probable,” and auxiliaries like “might,” “may,” “must,” and “could.” It is hard to say exactly what makes a word modal, or what makes a use of a modal epistemic, without begging the questions that will be our concern below, but some examples should get the idea across. If I say “Goldbach’s conjecture might be true, and it might be false,” I am not endorsing the Cartesian view that God could have made the truths of arithmetic come out differently. I make the claim not because I believe in the metaphysical contingency of mathematics, but because I know that Goldbach’s conjecture has not yet been proved or refuted. Similarly, if I say “Joe can’t be running,” I am not saying that Joe’s constitution prohibits him from running, or that Joe is essentially a non-runner, or that Joe isn’t allowed to run. My basis for making the claim may be nothing more than that I see Joe’s running shoes hanging on a hook.
John MacFarlane has made relativism popular again. Focusing just on his original discussion, I argue that the data he uses to motivate the position do not, in fact, motivate it at all. Many of the points made here have since been made, independently, by Herman Cappelen and John Hawthorne, in their book Relativism and Monadic Truth.
According to Semantic Minimalism, every use of "Chiara is tall" (fixing the girl and the time) semantically expresses the same proposition, the proposition that Chiara is (just plain) tall. Given standard assumptions, this proposition ought to have an intension (a function from possible worlds to truth values). However, speakers tend to reject questions that presuppose that it does. I suggest that semantic minimalists might address this problem by adopting a form of "nonindexical contextualism," according to which the proposition invariantly expressed by "Chiara is tall" does not have a context-invariant intension. Nonindexical contextualism provides an elegant explanation of what is wrong with "context-shifting arguments" and can be seen as a synthesis of the (partial) insights of semantic minimalists and radical contextualists.
We consider a paradox involving indicative conditionals (‘ifs’) and deontic modals (‘oughts’). After considering and rejecting several standard options for resolving the paradox—including rejecting various premises, positing an ambiguity or hidden contextual sensitivity, and positing a non-obvious logical form—we offer a semantics for deontic modals and indicative conditionals that resolves the paradox by making modus ponens invalid. We argue that this is a result to be welcomed on independent grounds, and we show that rejecting the general validity of modus ponens is compatible with vindicating most ordinary uses of modus ponens in reasoning.
The relativist's central objection to contextualism is that it fails to account for the disagreement we perceive in discourse about "subjective" matters, such as whether stewed prunes are delicious. If we are to adjudicate between contextualism and relativism, then, we must first get clear about what it is for two people to disagree. This question turns out to be surprisingly difficult to answer. A partial answer is given here; although it is incomplete, it does help shape what the relativist must say if she is to do better than the contextualist in securing genuine disagreement.
Philosophers on all sides of the contextualism debates have had an overly narrow conception of what semantic context sensitivity could be. They have conflated context sensitivity (dependence of truth or extension on features of context) with indexicality (dependence of content on features of context). As a result of this conflation, proponents of contextualism have taken arguments that establish only context sensitivity to establish indexicality, while opponents of contextualism have taken arguments against indexicality to be arguments against context sensitivity. Once these concepts are carefully pulled apart, it becomes clear that there is conceptual space in semantic theory for nonindexical forms of contextualism that have many advantages over the usual indexical forms.
If it is not now determined whether there will be a sea battle tomorrow, can an assertion that there will be one be true? The problem has persisted because there are compelling arguments on both sides. If there are objectively possible futures which would make the prediction true and others which would make it false, symmetry considerations seem to forbid counting it either true or false. Yet if we think about how we would assess the prediction tomorrow, when a sea battle is raging (or not), it seems we must assign the utterance a definite truth-value. I argue that both arguments must be given their due, and that this requires relativizing utterance-truth to a context of assessment. I show how this relativization can be handled in a rigorous formal semantics, and I argue that we can make coherent sense of assertion without assuming that utterances have their truth-values absolutely.
Recent years have seen an explosion of interest in the semantics of knowledge-attributing sentences, not just among epistemologists but among philosophers of language seeking a general understanding of linguistic context sensitivity. Despite all this critical attention, however, we are as far from consensus as ever. If we have learned anything, it is that each of the standard views—invariantism, contextualism, and sensitive invariantism—has its Achilles’ heel: a residuum of facts about our use of knowledge attributions that it can explain only with special pleading. This is not surprising if, as I will argue, there is a grain of truth in each of these views.
To assert something is to perform a certain kind of act. This act is different in kind both from other speech acts, like questions, requests, commands, promises, and apologies, and from acts that are not speech acts, like toast buttering and inarticulate yodeling. My question, then, is this: what features of an act qualify it as an assertion, and not one of these other kinds of act? To focus on a particular example: in uttering “Bill will close the window,” one might be practicing English pronunciation, asserting that Bill will close the window, or requesting that Bill close the window. What makes it the case that one is doing one of these and not another?
While many books focus on the broader socially ethical topics of widening participation and promoting equal opportunities, this unique book concentrates specifically on the lecturer's professional responsibilities. Bruce Macfarlane analyzes the pros and cons of prescriptive professional codes of practice employed by many universities and proposes the active development of professional virtues over bureaucratic recommendations. The material is presented in a scholarly yet accessible style and case examples are used throughout to encourage a practical, reflective approach.
Much philosophy of logic is shaped, explicitly or implicitly, by the thought that logic is distinctively formal and abstracts from material content. The distinction between formal and material does not appear to coincide with the more familiar contrasts between a priori and empirical, necessary and contingent, analytic and synthetic—indeed, it is often invoked to explain these. Nor, it turns out, can it be explained by appeal to schematic inference patterns, syntactic rules, or grammar. What does it mean, then, to say that logic is distinctively formal?
I want to discuss a puzzle about the semantics of epistemic modals, like “It might be the case that” as it occurs in “It might be the case that Goldbach’s conjecture is false.” I’ll argue that the puzzle cannot be adequately explained on standard accounts of the semantics of epistemic modals, and that a proper solution requires relativizing utterance truth to a context of assessment, a semantic device whose utility and coherence I have defended elsewhere for future contingents (MacFarlane..
Logic is usually thought to concern itself only with features that sentences and arguments possess in virtue of their logical structures or forms. The logical form of a sentence or argument is determined by its syntactic or semantic structure and by the placement of certain expressions called “logical constants.” Thus, for example, the sentences Every boy loves some girl. and Some boy loves every girl. are thought to differ in logical form, even though they share a common syntactic and semantic structure, because they differ in the placement of the logical constants “every” and “some”. By contrast, the sentences Every girl loves some boy. and Every boy loves some girl. are thought to have the same logical form, because “girl” and “boy” are not logical constants. Thus, in order to settle questions about logical form, and ultimately about which arguments are logically valid and which sentences logically true, we must distinguish the “logical constants” of a language from its nonlogical expressions.
In 1936, Frank Macfarlane Burnet published a paper entitled "Induced lysogenicity and the mutation of bacteriophage within lysogenic bacteria," in which he demonstrated that the introduction of a specific bacteriophage into a bacterial strain consistently and repeatedly imparted a specific property—namely, resistance to a different phage—to the bacterial strain that was originally susceptible to lysis by that second phage. Burnet's explanation for this change was that the first phage was causing a mutation in the bacterium which rendered it and its successive generations of offspring resistant to lysogenicity. At the time, this idea was a novel one that needed compelling evidence to be accepted. While it is difficult for us today to conceive of mutations and genes outside the context of DNA as the physico-chemical basis of genes, in the mid-1930s, when this paper was published, DNA's role as the carrier of hereditary information had not yet been discovered and genes and mutations were yet to acquire physical and chemical forms. Also, during that time genes were considered to exist only in organisms capable of sexual modes of replication, and the status of bacteria and viruses as organisms capable of containing genes and manifesting mutations was still in question. Burnet's paper counts among those pieces of work that helped dispel the notion that genes, inheritance and mutations were tied to an organism's sexual status. In this paper, I analyze the implications of Burnet's paper for the understanding of various concepts—such as "mutation" and "gene"—at the time it was published, and how those understandings shaped the development of the meanings of these terms and our modern conceptions thereof.
This paper motivates and explores an expressivist theory of vagueness, modelled on Allan Gibbard’s normative expressivism. It shows how Chris Kennedy’s semantics for gradable adjectives can be adjusted to fit into a theory on Gibbardian lines, where assertions constrain not just possible worlds but plans for action. Vagueness, on this account, is literally indecision about where to draw lines. It is argued that the distinctive phenomena of vagueness, such as the intuition of tolerance, can be explained in terms of practical constraints on plans, and that the expressivist view captures what is right about several contending theories of vagueness.
Richard on truth and commitment. John MacFarlane, Department of Philosophy, University of California, Berkeley, CA, USA. Philosophical Studies, pp. 1–9. DOI: 10.1007/s11098-011-9795-1. Online ISSN 1573-0883; Print ISSN 0031-8116.
The main impetus for my book came from the widespread acceptance of relativistic views about truth and knowledge within the Academy, especially within the humanities and the humanistic social sciences. In its introductory sections, though, I noted that there is one discipline within the humanities in which the influence of relativistic views is quite weak—namely, within analytic philosophy itself. Ironically, no sooner had the ink dried on the final version of my manuscript sometime in mid-2005—although, of course, it had been in the works for a number of years prior to that—than I began to become aware of a huge interest in certain kinds of relativistic views that was beginning to build within analytic philosophy. That interest—which is ongoing as I write—has been fueled to a considerable extent by the work of a younger generation of philosophers including John MacFarlane, Max Kölbel and Peter Lasersohn.
According to “sensitive invariantism,” the word “know” expresses the same relation in every context of use, but what it takes to stand in this relation to a proposition can vary with the subject’s circumstances. Sensitive invariantism looks like an attractive reconciliation of invariantism and contextualism. However, it is incompatible with a widely-held view about the way knowledge is transmitted through testimony. If both views were true, someone whose evidence for p fell short of what was required for knowledge in her circumstances could come to know that p simply by feeding her evidence to someone in less demanding circumstances and then accepting his testimony.
It is taken for granted in much of the literature on vagueness that semantic and epistemic approaches to vagueness are fundamentally at odds. If we can analyze borderline cases and the sorites paradox in terms of degrees of truth, then we don’t need an epistemic explanation. Conversely, if an epistemic explanation suffices, then there is no reason to depart from the familiar simplicity of classical bivalent semantics. I question this assumption, showing that there is an intelligible motivation for adopting a many-valued semantics even if one accepts a form of epistemicism. The resulting hybrid view has advantages over both classical epistemicism and traditional many-valued approaches.
This paper re-examines the relevance of three academic norms to contemporary academic life – communism, universalism and disinterestedness – based on the work of Robert Merton. The results of a web-based survey elicited responses to a series of value statements and were analysed using the weighted average method and through cross-tabulation. Results indicate strong support for communism as an academic norm defined in relation to sharing research results and teaching materials as opposed to protecting intellectual copyright and withholding access. There is more limited support for universalism based on the belief that academic knowledge should transcend national, political, or religious boundaries. Disinterestedness, defined in terms of personal detachment from truth claims, is the least popular contemporary academic norm. Here, the impact of a performative culture is linked to the need for a large number of academics to align their research interests with funding opportunities. The paper concludes by considering the claims of an alternate set of contemporary academic norms including capitalism, particularism and interestedness.
One of the central themes of Brandom’s work is that we should construct our semantic theories around material validity and incompatibility, rather than reference, truth, and satisfaction. This approach to semantics is motivated in part by Brandom’s pragmatism about the relation between semantics and the more general study of language use—what he calls “pragmatics”: Inferring is a kind of doing. . . . The status of inference as something that can be done accordingly holds out the promise of securing an appropriate relation between pragmatics, the study of the practices, and semantics, the study of the corresponding contents. (MIE, 91) Although Brandom does not go so far as to say that a pragmatist attitude to the relation between semantics and pragmatics requires an inferentialist semantics, his motivating arguments strongly suggest that a pragmatist ought to be an inferentialist. In what follows, I discuss the connections between Brandom’s pragmatism and his inferentialism. I’ll argue that pragmatism, as Brandom initially describes it—the view that “semantics must answer to pragmatics”—does not favor an inferentialist approach to semantics over a truth-conditional one. I’ll then consider whether inferentialism might be..
Much of The Reason’s Proper Study is devoted to defending the claim that simply by stipulating an abstraction principle for the “number-of” functor, we can simultaneously fix a meaning for this functor and acquire epistemic entitlement to the stipulated principle. In this paper, I argue that the semantic and epistemological principles Hale and Wright offer in defense of this claim may be too strong for their purposes. For if these principles are correct, it is hard to see why they do not justify platonist strategies that are not in any way “neo-Fregean,” e.g. strategies that treat “the number of Fs” as a Russellian definite description rather than a singular term, or employ axioms that do not have the form of abstraction principles.
In this interview, the prestigious anthropologist, historian, and TV announcer Alan Macfarlane comments on some of the issues that have been addressed in his writings. His main theoretical concern has been to study the peculiar conditions that gave rise to the mode..
In “Double Vision: Two Questions about the Neo-Fregean Programme”, John MacFarlane raises two main questions: (1) Why is it so important to neo-Fregeans to treat expressions of the form ‘the number of Fs’ as a species of singular term? What would be lost, if anything, if they were analysed instead as a type of quantifier-phrase, as on Russell’s Theory of Definite Descriptions? and (2) Granting—at least for the sake of argument—that Hume’s Principle may be used as a means of implicitly defining the number operator, what advantage, if any, does adopting this course possess over a direct stipulation of the Dedekind-Peano axioms? This paper attempts to answer them. In response to the first, we spell out the links between the recognition of numerical terms as vehicles of singular reference and the conception of numbers as possible objects of singular, or object-directed, thought, and the role of the acknowledgement of numbers as objects in the neo-Fregean attempt to justify the basic laws of arithmetic. In response to the second, we argue that the crucial issue concerns the capacity of either stipulation—of Hume’s Principle, or of the Dedekind-Peano axioms—to found knowledge of the principles involved, and that in this regard there are crucial differences which explain why the former stipulation can, but the latter cannot, play the required foundational role.