After a brief account of the problem of higher-order vagueness, and its seeming intractability, I explore what comes of the issue on a linguistic, contextualist account of vagueness. On the view in question, predicates like ‘borderline red’ and ‘determinately red’ are, or at least can be, vague, but they are different in kind from ‘red’. In particular, ‘borderline red’ and ‘determinately red’ are not colours. These predicates have linguistic components, and invoke notions like ‘competent user of the language’. On my view, so-called ‘higher-order vagueness’ is actually ordinary, first-order vagueness in different predicates. I explore the possibility that, nevertheless, a pernicious regress ensues.
Logical pluralism is the view that different logics are equally appropriate, or equally correct. Logical relativism is a pluralism according to which validity and logical consequence are relative to something. Stewart Shapiro explores various such views. He argues that the question of meaning shift is itself context-sensitive and interest-relative.
My argument will be that our understanding of human beings, which is what I take the Christian doctrine of man to be concerned with, will benefit considerably from an examination of two different but related clusters of human attitudes which can be found respectively under the headings ‘optimism’ and ‘pessimism’. There are many pitfalls in the way of such an enterprise, and occasionally some prejudices to be overcome. For example L. E. Loemker in the relevant articles in the Encyclopedia of Philosophy concludes a fairly lengthy discussion with the rather terminal judgement.
It was, I believe, Thomas Arnold who wrote: ‘Educate men without religion and all you make of them is clever devils’. Thus the Headmaster of one famous school summarized pithily the view of the relationship between religion and ethics which informed educational theory and practice in this country for at least a further century. There is a confusion of two different assumptions usually to be found in this context. The first is that religious belief can provide an intellectual foundation for moral belief; the second is that the effect of religious teaching is to improve behaviour according to the norms of some particular set of moral beliefs.
It is of course true that the articulation of religious and theological views depends upon and often masks philosophical presuppositions. For example, those who quote with approval Anselm's ‘credo ut intelligam’, ‘I believe so that I may understand’, seldom follow the good example set by Anselm, and make explicit, as Anselm does in the following sentence, the fact that this principle rests upon a further principle: ‘For I believe this also, that “unless I believe, I shall not understand”’. This paper is an attempt to track down and expose one very pervasive set of views about the nature of experience which is implicit in a wide range of religious and theological claims.
In the last ten years or so there has been some lively discussion of the questions of immortality and resurrection. Within the Christian tradition there has been debate at theological and exegetical level over the relative merits of belief in the immortality of the soul, and belief in the resurrection of the dead as an account of life after death. Further to this, however, there has been the suggestion that there may be good philosophical reasons for preferring the latter to the former. It is just this contention which I propose to discuss.
The title of this paper proclaims its central interest—the relationship which holds between the concept of integrity and the concept of the identity of the self, or, for short, self-identity. Unreflective speech often suggests a close relationship between the two, but in the latter half of this century, notwithstanding one or two notable exceptions, they have been discussed with minimum cross-reference as if they belonged to two rather different philosophical menus which tended not to be available at the same restaurant on the same night. My intention is to argue that our account of the one carries implications for the other and that this relationship is reflexive. My argument will proceed by stating and criticizing a common account of the relationship between each of these concepts which tends to offer mutual support for the implied account of each. Thereafter an alternative account will be outlined.
In this article I shall concern myself with the question ‘Is some type of justification required in order for belief in God to be rational?’ Many philosophers and theologians in the past would have responded affirmatively to this question. However, in our own day, there are those who maintain that natural theology in any form is not necessary. This is because of the rise of a different understanding of the nature of religious belief. Unlike what most people in the past thought, religious belief is not in any sense arrived at or inferred on the basis of other known propositions. On the contrary, belief in God is taken to be as basic as a person's belief in the existence of himself, of the chair in which he is sitting, or the past. The old view that there must be a justification of religious belief, whether known or unknown, is held to be mistaken. One of the most outspoken advocates of this view is Alvin Plantinga. According to Plantinga the mature theist ought not to accept belief in God as a conclusion from other things he believes. Rather, he should accept it as basic, as a part of the bedrock of his noetic structure. ‘The mature theist commits himself to belief in God; this means that he accepts belief in God as basic.’
Philosophers have devoted much attention to a series of issues grouped under the heading ‘the problem of personal identity’. In most of these discussions the focus has been the question of identity over time and the issues confronted have been basically logical or metaphysical. Students enrolled in philosophy classes dealing with such topics often express a sense of disappointment or frustration, for, of course, they belong to a culture in which the jargon of ‘self’ or ‘personal’ identity belongs to a rather different intellectual context heavy with the overtones of existentialism or with the suggestion of psychoanalysis. Anglo-Saxon philosophers have tended to bypass these ways of construing questions of personal identity; sometimes for good reason, sometimes not.
Recent writing on the idea of a form of life has tended to be critical of the use made of this notion by writers such as Peter Winch, D. Z. Phillips and Norman Malcolm. Rightly or wrongly these writers have been regarded as meaning by ‘a form of life’, something like ‘a way or style of life’, and recent explicatory work on the notion has largely tended to discount this as a plausible interpretation of what Wittgenstein meant in his use of the expression. The intention of this paper is not that of direct intervention in this particular dispute, though the conclusions drawn, if correct, would have some bearing on it. The intention is rather to develop, in order to make use of it, the idea of a form of life within the context of a number of philosophical difficulties. I should certainly claim to be drawing upon the remarks made by Wittgenstein in his own use of the expression: but I should not claim to be expounding Wittgenstein. Hence I do not wish to enter into the disputes referred to above about what precisely Wittgenstein meant by the expression, though again, what I say, if not wholly misguided, should have some bearing on these disputes.
"The picture of Hume clinging timidly to a raft of custom and artifice, because, poor skeptic, he has no alternative, is wrong," writes John Stewart. "Hume was confident that by experience and reflection philosophers can achieve true principles." In this revisionary work Stewart surveys all of David Hume's major writings to reveal him as a liberal moral and political philosopher. Against the background of seventeenth- and eighteenth-century history and thought, Hume emerges as a proponent not of conservatism but of reform. Stewart first presents the dilemma over morals in the modern natural-law school, then examines the new approach to moral and political philosophy adopted by Hume's precursors Shaftesbury, Mandeville, Hutcheson, and Butler. Illuminating Hume's explanation of the standards and rules that should govern private and public life, the author challenges interpretations of Hume's philosophy as conservative by demonstrating that he did not dismiss reason as a key factor determining right and wrong in moral and political contexts. Stewart goes on to show that Hume viewed private property, the market, contracts, and the rule of law as essential to genuine civilized society, and explores Hume's criticism of contemporary British beliefs concerning government, religion, commerce, international relations, and social structure. Originally published in 1992. The Princeton Legacy Library uses the latest print-on-demand technology to again make available previously out-of-print books from the distinguished backlist of Princeton University Press. These editions preserve the original texts of these important books while presenting them in durable paperback and hardcover editions. The goal of the Princeton Legacy Library is to vastly increase access to the rich scholarly heritage found in the thousands of books published by Princeton University Press since its founding in 1905.
Guthrie contends that religion can best be understood as systematic anthropomorphism - the attribution of human characteristics to nonhuman things and events. Religion, he says, consists of seeing the world as humanlike. He offers a fascinating array of examples to show how this strategy pervades secular life and how it characterizes religious experience.
What is honor? Is it the same as reputation? Or is it rather a sentiment? Is it a character trait, like integrity? Or is it simply a concept too vague or incoherent to be fully analyzed? In the first sustained comparative analysis of this elusive notion, Frank Stewart writes that none of these ideas is correct. Drawing on information about Western ideas of honor from sources as diverse as medieval Arthurian romances, Spanish dramas of the sixteenth and seventeenth centuries, and the writings of German jurists of the nineteenth and twentieth centuries, and comparing the European ideas with the ideas of a non-Western society—the Bedouin—Stewart argues that honor must be understood as a right, basically a right to respect. He shows that by understanding honor this way, we can resolve some of the paradoxes that have long troubled scholars, and can make sense of certain institutions that have not hitherto been properly understood. Offering a powerful new way to understand this complex notion, _Honor_ has important implications not only for the social sciences but also for the whole history of European sensibility.
Do numbers, sets, and so forth, exist? What do mathematical statements mean? Are they literally true or false, or do they lack truth values altogether? Addressing questions that have attracted lively debate in recent years, Stewart Shapiro contends that standard realist and antirealist accounts of mathematics are both problematic. As Benacerraf first noted, we are confronted with the following powerful dilemma. The desired continuity between mathematical and, say, scientific language suggests realism, but realism in this context suggests seemingly intractable epistemic problems. As a way out of this dilemma, Shapiro articulates a structuralist approach. On this view, the subject matter of arithmetic, for example, is not a fixed domain of numbers independent of each other, but rather is the natural number structure, the pattern common to any system of objects that has an initial object and successor relation satisfying the induction principle. Using this framework, realism in mathematics can be preserved without troublesome epistemic consequences. Shapiro concludes by showing how a structuralist approach can be applied to wider philosophical questions such as the nature of an "object" and the Quinean nature of ontological commitment. Clear, compelling, and tautly argued, Shapiro's work, noteworthy both in its attempt to develop a full-length structuralist approach to mathematics and to trace its emergence in the history of mathematics, will be of deep interest to both philosophers and mathematicians.
Second-language learners rarely arrive at native proficiency in a number of linguistic domains, including morphological and syntactic processing. Previous approaches to understanding the different outcomes of first- versus second-language learning have focused on cognitive and neural factors. In contrast, we explore the possibility that children and adults may rely on different linguistic units throughout the course of language learning, with specific focus on the granularity of those units. Following recent psycholinguistic evidence for the role of multiword chunks in online language processing, we explore the hypothesis that children rely more heavily on multiword units in language learning than do adults learning a second language. To this end, we take an initial step toward using large-scale, corpus-based computational modeling as a tool for exploring the granularity of speakers' linguistic units. Employing a computational model of language learning, the Chunk-Based Learner, we compare the usefulness of chunk-based knowledge in accounting for the speech of second-language learners versus children and adults speaking their first language. Our findings suggest that while multiword units are likely to play a role in second-language learning, adults may learn less useful chunks, rely on them to a lesser extent, and arrive at them through different means than children learning a first language.
Stewart Shapiro's ambition in Vagueness in Context is to develop a comprehensive account of the meaning, function, and logic of vague terms in an idealized version of a natural language like English. It is a commonplace that the extensions of vague terms vary according to their context: a person can be tall with respect to male accountants and not tall (even short) with respect to professional basketball players. The key feature of Shapiro's account is that the extensions of vague terms also vary in the course of conversations and that, in some cases, a competent speaker can go either way without sinning against the meaning of the words or the non-linguistic facts. As Shapiro sees it, vagueness is a linguistic phenomenon, due to the kinds of languages that humans speak; but vagueness is also due to the world we find ourselves in, as we try to communicate features of it to each other.
Stewart Shapiro's aim in Vagueness in Context is to develop both a philosophical and a formal, model-theoretic account of the meaning, function, and logic of vague terms in an idealized version of a natural language like English. It is a commonplace that the extensions of vague terms vary with such contextual factors as the comparison class and paradigm cases. A person can be tall with respect to male accountants and not tall with respect to professional basketball players. The main feature of Shapiro's account is that the extensions of vague terms also vary in the course of a conversation, even after the external contextual features, such as the comparison class, are fixed. A central thesis is that in some cases, a competent speaker of the language can go either way in the borderline area of a vague predicate without sinning against the meaning of the words and the non-linguistic facts. Shapiro calls this open texture, borrowing the term from Friedrich Waismann. The formal model theory has a similar structure to the supervaluationist approach, employing the notion of a sharpening of a base interpretation. In line with the philosophical account, however, the notion of super-truth does not play a central role in the development of validity. The ultimate goal of the technical aspects of the work is to delimit a plausible notion of logical consequence, and to explore what happens with the sorites paradox. Later chapters deal with what passes for higher-order vagueness - vagueness in the notions of 'determinacy' and 'borderline' - and with vague singular terms, or objects. In each case, the philosophical picture is developed by extending and modifying the original account. This is followed by modifications to the model theory and the central meta-theorems. As Shapiro sees it, vagueness is a linguistic phenomenon, due to the kinds of languages that humans speak.
But vagueness is also due to the world we find ourselves in, as we try to communicate features of it to each other. Vagueness is also due to the kinds of beings we are. There is no need to blame the phenomenon on any one of those aspects.
This unique book by Stewart Shapiro looks at a range of philosophical issues and positions concerning mathematics in four comprehensive sections. Part I describes questions and issues about mathematics that have motivated philosophers since the beginning of intellectual history. Part II is an historical survey, discussing the role of mathematics in the thought of such philosophers as Plato, Aristotle, Kant, and Mill. Part III covers the three major positions held throughout the twentieth century: the idea that mathematics is logic (logicism), the view that the essence of mathematics is the rule-governed manipulation of characters (formalism), and a revisionist philosophy that focuses on the mental activity of mathematics (intuitionism). Finally, Part IV brings the reader up-to-date with a look at contemporary developments within the discipline. This sweeping introductory guide to the philosophy of mathematics makes these fascinating concepts accessible to those with little background in either mathematics or philosophy.
The dominant response to this problem of the criterion focuses on the alleged requirement that we need to know a belief source is reliable in order for us to acquire knowledge by that source. Let us call this requirement “the KR principle”.
The central contention of this book is that second-order logic has a central role to play in laying the foundations of mathematics. In order to develop the argument fully, the author presents a detailed description of higher-order logic, including a comprehensive discussion of its semantics. He goes on to demonstrate the prevalence of second-order concepts in mathematics and the extent to which mathematical ideas can be formulated in higher-order logic. He also shows how first-order languages are often insufficient to codify many concepts in contemporary mathematics, and thus that both first- and higher-order logics are needed to fully reflect current work. Throughout, the emphasis is on discussing the associated philosophical and historical issues and the implications they have for foundational studies. For the most part, the author assumes little more than a familiarity with logic comparable to that provided in a beginning graduate course which includes the incompleteness of arithmetic and the Löwenheim-Skolem theorems. All those concerned with the foundations of mathematics will find this a thought-provoking discussion of some of the central issues in the field today.
In the early twentieth century an apparently obscure philosophical debate took place between F. H. Bradley and Bertrand Russell. The historical outcome was momentous: the demise of the movement known as British Idealism, and its eventual replacement by the various forms of analytic philosophy. Since then, a conception of this debate and its rights and wrongs has become entrenched in English-language philosophy. Stewart Candlish examines afresh the events of this formative period in twentieth-century thought and comes to some surprising conclusions.
I argue that epistemologists’ use of the term ‘epistemic’ has led to serious confusion in the discussion of epistemological issues. The source of the problem is that ‘epistemic’ functions largely as an undefined technical term. I show how this confusion has infected discussions of the nature of epistemic justification, epistemic norms for evidence gathering, and knowledge norms for assertion and belief.
Moving beyond both realist and anti-realist accounts of mathematics, Shapiro articulates a "structuralist" approach, arguing that the subject matter of a mathematical theory is not a fixed domain of numbers that exist independent of each other, but rather is the natural number structure, the pattern common to any system of objects that has an initial object and successor relation satisfying the induction principle.
Timothy Williamson has fruitfully exploited formal resources to shed considerable light on the nature of knowledge. In the paper under examination, Williamson turns his attention to Gettier cases, showing how they can be motivated formally. At the same time, he disparages the kind of justification he thinks gives rise to these cases. He favors instead his own notion of justification for which Gettier cases cannot arise. We take issue both with his disparagement of the kind of justification that figures in Gettier cases and with the specifics of the formal motivation.
The problem of easy knowledge arises for theories that have what I call a "basic knowledge structure". S has basic knowledge of P just in case S knows P prior to knowing that the cognitive source of S's knowing P is reliable. Our knowledge has a basic knowledge structure just in case we have basic knowledge and we come to know our faculties are reliable on the basis of our basic knowledge. The problem I raised in "Basic Knowledge and the Problem of Easy Knowledge" is that once we allow for basic knowledge, we can come to know our faculties are reliable in ways that intuitively are too easy. This raises a serious doubt about whether we had the basic knowledge in the first place.
We examine George Boolos's proposed abstraction principle for extensions based on the limitation-of-size conception, New V, from several perspectives. Crispin Wright once suggested that New V could serve as part of a neo-logicist development of real analysis. We show that it fails both of the conservativeness criteria for abstraction principles that Wright proposes. Thus, we support Boolos against Wright. We also show that, when combined with the axioms for Boolos's iterative notion of set, New V yields a system equivalent to full Zermelo-Fraenkel set theory with a principle of global choice. This advances Boolos's longstanding interest in the foundations of set theory.
This paper compares two kinds of epistemic principles-an underdetermination principle and a deductive closure principle. It argues that each principle provides the basis for an independently motivated skeptical argument. It examines the logical relations between the premises of the two kinds of skeptical argument and concludes that the deductive closure argument cannot be refuted without refuting the underdetermination argument. The underdetermination argument, however, can be refuted without refuting the deductive closure argument. In this respect, the deductive closure argument is the stronger of the two.
This paper defends the view that standards, which are typically social in nature, play a role in determining whether a subject has knowledge. While the argument focuses on standards that pertain to reasoning, I also consider whether there are similar standards for memory and perception. Ultimately, I argue that the standards are context sensitive and, as such, we must view attributions of knowledge as indexical. I exploit similarities between this view and a version of the relevant alternatives reply to skepticism in order to defend this reply against the objection that it is ad hoc.