In this paper, we focus on whether and to what extent we judge that people are responsible for the consequences of their forgetfulness. We ran a series of behavioral studies to measure judgments of responsibility for the consequences of forgetfulness. Our results show that we are disposed to hold others responsible for some of their forgetfulness. The level of stress that the forgetful agent is under modulates judgments of responsibility, though the level of care that the agent exhibits toward performing the forgotten action does not. We argue that this result has important implications for a long-running debate about the nature of responsible agency.
To mark the 200th anniversary of the birth of Søren Kierkegaard, I review in this essay the relationship between Kierkegaard and the Catholic tradition. First, I look back to consider both Kierkegaard’s encounter with Catholicism and the influence of his work upon Catholics. Second, I look around to consider some of the recent work on Kierkegaard and Catholicism, especially Jack Mulder’s recent book, Kierkegaard and the Catholic Tradition, and the many articles that examine Kierkegaard’s relation to Catholicism in the multi-volume Kierkegaard Research series edited by Jon Stewart. Finally, I look ahead to consider possible directions in which the conversation between Catholics and Kierkegaardians might continue.
After a brief account of the problem of higher-order vagueness, and its seeming intractability, I explore what comes of the issue on a linguistic, contextualist account of vagueness. On the view in question, predicates like ‘borderline red’ and ‘determinately red’ are, or at least can be, vague, but they are different in kind from ‘red’. In particular, ‘borderline red’ and ‘determinately red’ are not colours. These predicates have linguistic components, and invoke notions like ‘competent user of the language’. On my view, so-called ‘higher-order vagueness’ is actually ordinary, first-order vagueness in different predicates. I explore the possibility that, nevertheless, a pernicious regress ensues.
Do numbers, sets, and so forth, exist? What do mathematical statements mean? Are they literally true or false, or do they lack truth values altogether? Addressing questions that have attracted lively debate in recent years, Stewart Shapiro contends that standard realist and antirealist accounts of mathematics are both problematic. As Benacerraf first noted, we are confronted with the following powerful dilemma. The desired continuity between mathematical and, say, scientific language suggests realism, but realism in this context suggests seemingly intractable epistemic problems. As a way out of this dilemma, Shapiro articulates a structuralist approach. On this view, the subject matter of arithmetic, for example, is not a fixed domain of numbers independent of each other, but rather is the natural number structure, the pattern common to any system of objects that has an initial object and successor relation satisfying the induction principle. Using this framework, realism in mathematics can be preserved without troublesome epistemic consequences. Shapiro concludes by showing how a structuralist approach can be applied to wider philosophical questions such as the nature of an "object" and the Quinean nature of ontological commitment. Clear, compelling, and tautly argued, Shapiro's work, noteworthy both in its attempt to develop a full-length structuralist approach to mathematics and to trace its emergence in the history of mathematics, will be of deep interest to both philosophers and mathematicians.
Why do we need government? A common view is that government is necessary to constrain people's conduct toward one another, because people are not sufficiently virtuous to exercise the requisite degree of control on their own. This view was expressed perspicuously, and artfully, by liberal thinker James Madison, in The Federalist, number 51, where he wrote: “If men were angels, no government would be necessary.” Madison's idea is shared by writers ranging across the political spectrum. It finds clear expression in the Marxist view that the state will gradually wither away after a communist revolution, as unalienated “communist man” emerges. And it is implied by the libertarian view that government's only legitimate function is to control the unfortunate and immoral tendency of some individuals to violate the moral rights of others.
This unique book by Stewart Shapiro looks at a range of philosophical issues and positions concerning mathematics in four comprehensive sections. Part I describes questions and issues about mathematics that have motivated philosophers since the beginning of intellectual history. Part II is an historical survey, discussing the role of mathematics in the thought of such philosophers as Plato, Aristotle, Kant, and Mill. Part III covers the three major positions held throughout the twentieth century: the idea that mathematics is logic (logicism), the view that the essence of mathematics is the rule-governed manipulation of characters (formalism), and a revisionist philosophy that focuses on the mental activity of mathematics (intuitionism). Finally, Part IV brings the reader up-to-date with a look at contemporary developments within the discipline. This sweeping introductory guide to the philosophy of mathematics makes these fascinating concepts accessible to those with little background in either mathematics or philosophy.
It is commonplace to suppose that the theory of individual rational choice is considerably less problematic than the theory of collective rational choice. In particular, it is often assumed by philosophers, economists, and other social scientists that an individual's choices among outcomes accurately reflect that individual's underlying preferences or values. Further, it is now well known that if an individual's choices among outcomes satisfy certain plausible axioms of rationality or consistency, that individual's choice-behavior can be interpreted as maximizing expected utility on a utility scale that is unique up to a linear transformation. Hence, there is, in principle, an empirically respectable method of measuring individuals' values and a single unified schema for explaining their actions as value maximizing.
My argument will be that our understanding of human beings, which is what I take the Christian doctrine of man to be concerned with, will benefit considerably from an examination of two different but related clusters of human attitudes which can be found respectively under the headings ‘optimism’ and ‘pessimism’. There are many pitfalls in the way of such an enterprise, and occasionally some prejudices to be overcome. For example L. E. Loemker in the relevant articles in the Encyclopedia of Philosophy concludes a fairly lengthy discussion with the rather terminal judgement.
In the last ten years or so there has been some lively discussion of the questions of immortality and resurrection. Within the Christian tradition there has been debate at theological and exegetical level over the relative merits of belief in the immortality of the soul, and belief in the resurrection of the dead as an account of life after death. Further to this, however, there has been the suggestion that there may be good philosophical reasons for preferring the latter to the former. It is just this contention which I propose to discuss.
The title of this paper proclaims its central interest—the relationship which holds between the concept of integrity and the concept of the identity of the self, or, for short, self-identity. Unreflective speech often suggests a close relationship between the two, but in the latter half of this century, notwithstanding one or two notable exceptions, they have been discussed with minimum cross-reference as if they belonged to two rather different philosophical menus which tended not to be available at the same restaurant on the same night. My intention is to argue that our account of the one carries implications for the other and that this relationship is reflexive. My argument will proceed by stating and criticizing a common account of the relationship between each of these concepts which tends to offer mutual support for the implied account of each. Thereafter an alternative account will be outlined.
Guthrie contends that religion can best be understood as systematic anthropomorphism - the attribution of human characteristics to nonhuman things and events. Religion, he says, consists of seeing the world as humanlike. He offers a fascinating array of examples to show how this strategy pervades secular life and how it characterizes religious experience.
In this article I shall concern myself with the question ‘Is some type of justification required in order for belief in God to be rational?’ Many philosophers and theologians in the past would have responded affirmatively to this question. However, in our own day, there are those who maintain that natural theology in any form is not necessary. This is because of the rise of a different understanding of the nature of religious belief. Unlike what most people in the past thought, religious belief is not in any sense arrived at or inferred on the basis of other known propositions. On the contrary, belief in God is taken to be as basic as a person's belief in the existence of himself, of the chair in which he is sitting, or the past. The old view that there must be a justification of religious belief, whether known or unknown, is held to be mistaken. One of the most outspoken advocates of this view is Alvin Plantinga. According to Plantinga the mature theist ought not to accept belief in God as a conclusion from other things he believes. Rather, he should accept it as basic, as a part of the bedrock of his noetic structure. ‘The mature theist commits himself to belief in God; this means that he accepts belief in God as basic.’
Gregory of Nyssa made important contributions to both theological thought and the understanding of the spiritual life. He was especially significant in adapting the thought of Origen to fourth century orthodoxy. The early treatise on the inscriptions of the Psalms shows the early stages of the development of Gregory's thought. This book presents the first translation of the treatise in a modern language. The annotations show Gregory's indebtedness to the thought of classical antiquity as well as to the Bible. The Introduction sets forth the structure of Gregory's treatise, and places it in the context of earlier Christian commentaries on the Psalms. It shows how his hermeneutical approach was influenced by both Iamblichus the Neo-Platonist and Origen. Finally, Dr Heine compares Gregory's understanding of the stages of the spiritual life in the treatise with that in his later and more widely known writings on the life of Moses and the Song of Songs.
This book presents the framework for a new, comprehensive approach to cognitive science. The proposed paradigm, enaction, offers an alternative to cognitive science's classical, first-generation Computational Theory of Mind. _Enaction_, first articulated by Varela, Thompson, and Rosch in _The Embodied Mind_, breaks from CTM's formalisms of information processing and symbolic representations to view cognition as grounded in the sensorimotor dynamics of the interactions between a living organism and its environment. A living organism enacts the world it lives in; its embodied action in the world constitutes its perception and thereby grounds its cognition. _Enaction_ offers a range of perspectives on this exciting new approach to embodied cognitive science. Some chapters offer manifestos for the enaction paradigm; others address specific areas of research, including artificial intelligence, developmental psychology, neuroscience, language, phenomenology, and culture and cognition. Three themes emerge as testimony to the originality and specificity of enaction as a paradigm: the relation between first-person lived experience and third-person natural science; the ambition to provide an encompassing framework applicable at levels from the cell to society; and the difficulties of reflexivity. Taken together, the chapters offer nothing less than the framework for a far-reaching renewal of cognitive science. Contributors: Renaud Barbaras, Didier Bottineau, Giovanna Colombetti, Diego Cosmelli, Hanne De Jaegher, Ezequiel A. Di Paolo, Andreas K. Engel, Olivier Gapenne, Véronique Havelange, Edwin Hutchins, Michel Le Van Quyen, Rafael E. Núñez, Marieke Rohde, Benny Shanon, Maxine Sheets-Johnstone, Adam Sheya, Linda B. Smith, John Stewart, Evan Thompson.
Logical pluralism is the view that different logics are equally appropriate, or equally correct. Logical relativism is a pluralism according to which validity and logical consequence are relative to something. Stewart Shapiro explores various such views. He argues that the question of meaning shift is itself context-sensitive and interest-relative.
Stewart Shapiro's ambition in Vagueness in Context is to develop a comprehensive account of the meaning, function, and logic of vague terms in an idealized version of a natural language like English. It is a commonplace that the extensions of vague terms vary according to their context: a person can be tall with respect to male accountants and not tall (even short) with respect to professional basketball players. The key feature of Shapiro's account is that the extensions of vague terms also vary in the course of conversations and that, in some cases, a competent speaker can go either way without sinning against the meaning of the words or the non-linguistic facts. As Shapiro sees it, vagueness is a linguistic phenomenon, due to the kinds of languages that humans speak; but vagueness is also due to the world we find ourselves in, as we try to communicate features of it to each other.
It is, perhaps, a propitious time to discuss the economic rights of disabled persons. In recent years, the media in the United States have reported on such notable events as: students at the nation's only college for the deaf stage a successful protest campaign to have a deaf individual appointed president of their institution; a book by a disabled British physicist on the origins of the universe becomes a best seller; a pitcher with only one arm has a successful rookie season in major league baseball; a motion-picture actor wins an Oscar for his portrayal of a wheelchair-bound person, beating out another nominee playing another wheelchair-bound person; a cancer patient wins an Olympic gold medal in wrestling; a paralyzed mother trains her children to accept discipline by inserting their hands in her mouth to be gently bitten when punishment is due; and a paraplegic rock climber scales the sheer four-thousand-foot wall of Yosemite Valley's El Capitan. Most significantly, in 1990, the United States Congress passed an important bill – the Americans with Disabilities Act – extending to disabled people employment and access-related protections afforded to members of other disadvantaged groups by the Civil Rights Act of 1964.
The central contention of this book is that second-order logic has a central role to play in laying the foundations of mathematics. In order to develop the argument fully, the author presents a detailed description of higher-order logic, including a comprehensive discussion of its semantics. He goes on to demonstrate the prevalence of second-order concepts in mathematics and the extent to which mathematical ideas can be formulated in higher-order logic. He also shows how first-order languages are often insufficient to codify many concepts in contemporary mathematics, and thus that both first- and higher-order logics are needed to fully reflect current work. Throughout, the emphasis is on discussing the associated philosophical and historical issues and the implications they have for foundational studies. For the most part, the author assumes little more than a familiarity with logic comparable to that provided in a beginning graduate course which includes the incompleteness of arithmetic and the Löwenheim–Skolem theorems. All those concerned with the foundations of mathematics will find this a thought-provoking discussion of some of the central issues in the field today.
It was, I believe, Thomas Arnold who wrote: ‘Educate men without religion and all you make of them is clever devils’. Thus the Headmaster of one famous school summarized pithily the view of the relationship between religion and ethics which informed educational theory and practice in this country for at least a further century. There is a confusion of two different assumptions usually to be found in this context. The first is that religious belief can provide an intellectual foundation for moral belief; the second is that the effect of religious teaching is to improve behaviour according to the norms of some particular set of moral beliefs.
It is of course true that the articulation of religious and theological views depends upon and often masks philosophical presuppositions. For example, those who quote with approval Anselm's ‘credo ut intelligam’, ‘I believe so that I may understand’, seldom follow the good example set by Anselm, and make explicit, as Anselm does in the following sentence, the fact that this principle rests upon a further principle: ‘For I believe this also, that “unless I believe, I shall not understand”’. This paper is an attempt to track down and expose one very pervasive set of views about the nature of experience which is implicit in a wide range of religious and theological claims.
Stewart Shapiro's aim in Vagueness in Context is to develop both a philosophical and a formal, model-theoretic account of the meaning, function, and logic of vague terms in an idealized version of a natural language like English. It is a commonplace that the extensions of vague terms vary with such contextual factors as the comparison class and paradigm cases. A person can be tall with respect to male accountants and not tall with respect to professional basketball players. The main feature of Shapiro's account is that the extensions of vague terms also vary in the course of a conversation, even after the external contextual features, such as the comparison class, are fixed. A central thesis is that in some cases, a competent speaker of the language can go either way in the borderline area of a vague predicate without sinning against the meaning of the words and the non-linguistic facts. Shapiro calls this open texture, borrowing the term from Friedrich Waismann. The formal model theory has a similar structure to the supervaluationist approach, employing the notion of a sharpening of a base interpretation. In line with the philosophical account, however, the notion of super-truth does not play a central role in the development of validity. The ultimate goal of the technical aspects of the work is to delimit a plausible notion of logical consequence, and to explore what happens with the sorites paradox. Later chapters deal with what passes for higher-order vagueness - vagueness in the notions of 'determinacy' and 'borderline' - and with vague singular terms, or objects. In each case, the philosophical picture is developed by extending and modifying the original account. This is followed by modifications to the model theory and the central meta-theorems. As Shapiro sees it, vagueness is a linguistic phenomenon, due to the kinds of languages that humans speak. But vagueness is also due to the world we find ourselves in, as we try to communicate features of it to each other. Vagueness is also due to the kinds of beings we are. There is no need to blame the phenomenon on any one of those aspects.
In fact, it requires two major social institutions--morality and government--working in a coordinated fashion to do so. This is one of the main themes of Hobbes's philosophy that will be developed in this book.
The dominant response to this problem of the criterion focuses on the alleged requirement that we need to know a belief source is reliable in order for us to acquire knowledge by that source. Let us call this requirement “the KR principle”.
Understanding the Infinite is a loosely connected series of essays on the nature of the infinite in mathematics. The chapters contain much detail, most of which is interesting, but the reader is not given many clues concerning what concepts and ideas are relevant for later developments in the book. There are, however, many technical cross-references, so the reader can expect to spend much time flipping backward and forward.
Jon Stewart's study is a major re-evaluation of the complex relations between the philosophies of Kierkegaard and Hegel. The standard view on the subject is that Kierkegaard defined himself as explicitly anti-Hegelian, indeed that he viewed Hegel's philosophy with disdain. Jon Stewart shows convincingly that Kierkegaard's criticism was not of Hegel but of a number of contemporary Danish Hegelians. Kierkegaard's own view of Hegel was in fact much more positive to the point where he was directly influenced by some of Hegel's work. Any scholar working in the tradition of Continental philosophy will find this an insightful and provocative book with implications for the subsequent history of philosophy in the twentieth century. The book will also appeal to scholars in religious studies and the history of ideas.
In the early twentieth century an apparently obscure philosophical debate took place between F. H. Bradley and Bertrand Russell. The historical outcome was momentous: the demise of the movement known as British Idealism, and its eventual replacement by the various forms of analytic philosophy. Since then, a conception of this debate and its rights and wrongs has become entrenched in English-language philosophy. Stewart Candlish examines afresh the events of this formative period in twentieth-century thought and comes to some surprising conclusions.