The arts of rule cover the exercise of power by princes and popular sovereigns, but they range beyond the domain of government itself, extending to civil associations, political parties, and religious institutions. Making full use of political philosophy from a range of backgrounds, this festschrift for Harvey Mansfield recognizes that although the arts of rule are comprehensive, the best government is a limited one.
As an emerging discipline, neuroeconomics faces considerable methodological and practical challenges. In this paper, I suggest that these challenges can be understood by exploring the similarities and dissimilarities between the emergence of neuroeconomics and the emergence of cognitive and computational neuroscience two decades ago. From these parallels, I suggest the major challenge facing theory formation in the neural and behavioural sciences is that of being under-constrained by data, making a detailed understanding of physical implementation necessary for theory construction in neuroeconomics. Rather than following a top-down strategy, neuroeconomists should be pragmatic in the use of available data from animal models, information regarding neural pathways and projections, computational models of neural function, functional imaging and behavioural data. By providing convergent evidence across multiple levels of organization, neuroeconomics will have its most promising prospects of success.
After more than a decade of reflection on obedience experiments based on a laboratory model of his own design, the social psychologist Stanley Milgram is clearly confident that the experimental results make a substantial and striking contribution towards understanding human nature: Something … dangerous is revealed: the capacity for man to abandon his humanity, indeed, the inevitability that he does so, as he merges his unique personality into larger institutional structures.
The article contests Affeldt's critique of Mulhall's "Stanley Cavell: Philosophy's Recounting of the Ordinary" by asking how deep the conflict between what Affeldt proposes as Cavell's account of Wittgenstein's notion of grammar and that of Baker and Hacker really goes. It argues that Affeldt's critique is successful against one interpretation of the claims that grammar consists of a framework of rules and that criteria function as a basis for judgment, but that other interpretations of these claims are available and appear consistent with both Cavell's and Wittgenstein's positions. It concludes by suggesting that the real issue is how to combine a sense of the normativity of grammar with that of the role of the personal in grounding grammatical remarks.
The prisoner's dilemma game has acquired large literatures in several disciplines. It is surprising, therefore, that a good definition of the game is hard to find. Typically an author relates a story about captured criminals or military rivals, provides a particular payoff matrix and asserts that the PD is characterized, or illustrated, by that matrix. In the few cases in which characterizing conditions are given, the conditions, and the motivations for them, do not always agree with each other or with the paradigm examples elsewhere. In this paper we describe several varieties of PDs. In particular, we suggest there are two distinctions among PDs with philosophical significance, the pure/impure and the utilitarian/nonutilitarian distinctions. In the first section, we explain and characterize the two distinctions. In the second, we discuss an issue of moral philosophy that illustrates the significance of the former.
Steven French articulates and defends the bold claim that there are no objects in the world. He draws on metaphysics and philosophy of science to argue for structural realism--the position that we live in a world of structures--and defends a form of eliminativism about objects that sets laws and symmetry principles at the heart of ontology.
The physics and metaphysics of identity and individuality. Review symposium, Metascience, DOI 10.1007/s11016-010-9463-7. Contributors: Don Howard (University of Notre Dame), Bas C. van Fraassen (San Francisco State University), Otávio Bueno (University of Miami), Elena Castellani (University of Florence), Laura Crosilla (University of Leeds), Steven French (University of Leeds), and Décio Krause (Federal University of Santa Catarina).
Theists believe that God is eternal, but they differ as to just what God's eternality means. The traditional, historic view of most Christian philosophers is that eternality means that God is timeless. He is ‘outside’ of time and not subject to any kind of temporal change. Indeed, God is the creator of time. Let's call this view divine timelessness.
Steven French and Décio Krause examine the metaphysical foundations of quantum physics. They draw together historical, logical, and philosophical perspectives on the fundamental nature of quantum particles and offer new insights on a range of important issues. Focusing on the concepts of identity and individuality, the authors explore two alternative metaphysical views: according to one, quantum particles are no different from books, tables, and people in this respect; according to the other, they most certainly are. Each view comes with certain costs attached, and after describing their origins in the history of quantum theory, the authors carefully consider whether these costs are worth bearing. Recent contributions to these discussions are analyzed in detail and the authors present their own original perspective on the issues. The final chapter suggests how this perspective can be taken forward in the context of quantum field theory.
A provocative assessment of human thought and behavior, reissued with a new afterword, explores a range of conundrums from the ability of the mind to perceive three dimensions to the nature of consciousness, in an account that draws on ...
Should a theory of meaning state what sentences mean, and can a Davidsonian theory of meaning in particular do so? Max Kölbel answers both questions affirmatively. I argue, however, that the phenomena of non-homophony, non-truth-conditional aspects of meaning, semantic mood, and context-sensitivity provide prima facie obstacles for extending Davidsonian truth-theories to yield meaning-stating theorems. Assessing some natural moves in reply requires a more fully developed conception of the task of such theories than Kölbel provides. A more developed conception is also required to defend his positive answer to the first question above. I argue that, however Kölbel might elaborate his position, it can’t be by embracing the sort of cognitivist account of Davidsonian semantics to which he sometimes alludes.
Many people have argued that the evolution of the human language faculty cannot be explained by Darwinian natural selection. Chomsky and Gould have suggested that language may have evolved as the by-product of selection for other abilities or as a consequence of as-yet unknown laws of growth and form. Others have argued that a biological specialization for grammar is incompatible with every tenet of Darwinian theory – that it shows no genetic variation, could not exist in any intermediate forms, confers no selective advantage, and would require more evolutionary time and genomic space than is available. We examine these arguments and show that they depend on inaccurate assumptions about biology or language or both. Evolutionary theory offers clear criteria for when a trait should be attributed to natural selection: complex design for some function, and the absence of alternative processes capable of explaining such complexity. Human language meets these criteria: Grammar is a complex mechanism tailored to the transmission of propositional structures through a serial interface. Autonomous and arbitrary grammatical phenomena have been offered as counterexamples to the position that language is an adaptation, but this reasoning is unsound: Communication protocols depend on arbitrary conventions that are adaptive as long as they are shared. Consequently, language acquisition in the child should systematically differ from language evolution in the species, and attempts to analogize them are misleading. Reviewing other arguments and data, we conclude that there is every reason to believe that a specialization for grammar evolved by a conventional neo-Darwinian process.
Steven Crowell has been for many years a leading voice in debates on twentieth-century European philosophy. This volume presents thirteen recent essays that together provide a systematic account of the relation between meaningful experience and responsiveness to norms. They argue for a new understanding of the philosophical importance of phenomenology, taking the work of Husserl and Heidegger as exemplary, and introducing a conception of phenomenology broad enough to encompass the practices of both philosophers. Crowell discusses Husserl's analyses of first-person authority, the semantics of conscious experience, the structure of perceptual content, and the embodied subject, and shows how Heidegger's interpretation of the self addresses problems in Husserl's approach to the normative structure of meaning. His volume will be valuable for upper-level students and scholars interested in phenomenological approaches to philosophical questions in both the European and the analytic traditions.
John Finnis's powerfully and deservedly influential modern classic, Natural Law and Natural Rights, expounds a theory of law and morality that is based on a picture of “persons” using practical reason to pursue certain “basic goods.” While devoting much attention to practical reason and to the goods, however, Finnis says little about the nature of personhood. This relative inattention to what “persons” are creates a risk—one that Finnis himself notices—of assuming or importing an inadequate anthropology. This essay suggests that the “new natural law” developed by Finnis suffers in places from the inadvertent adoption of a flawed anthropology—an anthropology under the thrall of modern individualistic commitments. To explain this suspicion, this article discusses three difficulties in his natural law theory: difficulties in accounting for the basic good of friendship, for obligations we owe to others, and for legal authority. These difficulties may seem disconnected, but this article suggests that they may all reflect an inadequate anthropology—one that Finnis does not exactly embrace but that is pervasive today and that in places may affect his theorizing.
1. Introduction The policy of deterrence, at least to avert nuclear war between the superpowers, has been a controversial one. The main controversy arises from the threat of each side to visit destruction on the other in response to an initial attack. This threat would seem irrational if carrying it out would lead to a nuclear holocaust – the worst outcome for both sides. Instead, it would seem better for the side attacked to suffer some destruction rather than to retaliate in kind and, in the process of devastating the other side, seal its own doom in an all-out nuclear exchange. Yet, the superpowers persist in their adherence to deterrence, by which we mean a policy of threatening to retaliate to an attack by the other side in order to deter such an attack in the first place. To be sure, nuclear doctrine for implementing deterrence has evolved over the years, with such appellations as “massive retaliation,” “flexible response,” “mutual assured destruction,” and “counterforce” giving some flavor of the changes in United States strategic thinking. All such doctrines, however, entail some kind of response to a Soviet nuclear attack. They are operationalized in terms of preselected targets to be hit, depending on the perceived nature and magnitude of the attack. Thus, whether U.S. strategic policy at any time stresses a retaliatory attack on cities and industrial centers or on weapons systems and armed forces, the certainty of a response of some kind to an attack is not the issue.
A paradox, according to the OED, is ‘a statement seemingly self-contradictory or absurd, though possibly well-founded or essentially true’. In this article I shall try to show that the classical orthodox Marxist view of morality is a paradox. I shall seek to resolve the paradox by trying to show that it is only seemingly self-contradictory or absurd. But I shall not claim the standard Marxist view of morality to be well-founded or essentially true. On the contrary, I shall suggest that, though coherent, it is ill-founded and illusory.
While examining the important role of imagination in making moral judgments, John Dewey and Moral Imagination focuses new attention on the relationship between American pragmatism and ethics. Steven Fesmire takes up threads of Dewey's thought that have been largely unexplored and elaborates pragmatism's distinctive contribution to understandings of moral experience, inquiry, and judgment. Building on two Deweyan notions—that moral character, belief, and reasoning are part of a social and historical context and that moral deliberation is an imaginative, dramatic rehearsal of possibilities—Fesmire shows that moral imagination can be conceived as a process of aesthetic perception and artistic creativity. Fesmire's original readings of Dewey shed new light on the imaginative process, human emotional make-up and expression, and the nature of moral judgment. This original book presents a robust and distinctly pragmatic approach to ethics, politics, moral education, and moral conduct.
The grand and sweeping claims of many relativists might seem to amount to the argument that everything is relative--except the thesis of relativism. In this book, Steven Hales defends relativism, but in a more circumscribed form that applies specifically to philosophical propositions. His claim is that philosophical propositions are relatively true--true in some perspectives and false in others. Hales defends this argument first by examining rational intuition as the method by which philosophers come to have the beliefs they do. Analytic rationalism, he claims, has a foundational reliance on rational intuition as a method of acquiring basic beliefs. He then argues that there are other methods that people use to gain beliefs about philosophical topics that are strikingly analogous to rational intuition and examines two of these: Christian revelation and the ritual use of hallucinogens. Hales argues that rational intuition is not epistemically superior to either of these alternative methods. There are only three possible outcomes: we have no philosophical knowledge; there are no philosophical propositions; or there are knowable philosophical propositions, but our knowledge of them is relative to doxastic perspective. Hales defends relativism against the charge that it is self-refuting and answers a variety of objections to this account of relativism. Finally, he examines the most sweeping objection to relativism: that philosophical propositions are not merely relatively true, because there are no philosophical propositions--all propositions are ultimately empirical, as the naturalists contend. Hales's somewhat disturbing conclusion--that intuition-driven philosophy does produce knowledge, but not absolute knowledge--is sure to inspire debate among philosophers.
Winner of 2002 Edward Goodwin Ballard Prize In a penetrating and lucid discussion of the enigmatic relationship between the work of Edmund Husserl and Martin Heidegger, Steven Galt Crowell proposes that the distinguishing feature of twentieth-century philosophy is not so much its emphasis on language as its concern with meaning. Arguing that transcendental phenomenology is indispensable to the philosophical explanation of the space of meaning, Crowell shows how a proper understanding of both Husserl and Heidegger reveals the distinctive contributions of each to that ongoing phenomenological project.
Spinoza's Ethics is one of the most remarkable, important, and difficult books in the history of philosophy: a treatise simultaneously on metaphysics, knowledge, philosophical psychology, moral philosophy, and political philosophy. It presents, in Spinoza's famous 'geometric method', his radical views on God, Nature, the human being, and happiness. In this wide-ranging 2006 introduction to the work, Steven Nadler explains the doctrines and arguments of the Ethics, and shows why Spinoza's endlessly fascinating ideas may have been so troubling to his contemporaries, as well as why they are still highly relevant today. He also examines the philosophical background to Spinoza's thought and the dialogues in which Spinoza was engaged - with his contemporaries, with ancient thinkers, and with his Jewish rationalist forebears. His book is written for the student reader but will also be of interest to specialists in early modern philosophy.
This book offers a discussion of how people think, talk, learn, and explain things in causal terms, that is, in terms of action and manipulation. Sloman also reviews the role of causality, causal models, and intervention in the basic human cognitive functions: decision making, reasoning, judgement, categorization, inductive inference, language, and learning.
Contemporary philosophers of mind tend to assume that the world of nature can be reduced to basic physics. Yet there are features of the mind (consciousness, intentionality, normativity) that do not seem to be reducible to physics or neuroscience. This explanatory gap between mind and brain has thus been a major cause of concern in recent philosophy of mind. Reductionists hold that, despite all appearances, the mind can be reduced to the brain. Eliminativists hold that it cannot, and that this implies that there is something illegitimate about the mentalistic vocabulary. Dualists hold that the mental is irreducible, and that this implies either a substance or a property dualism. Mysterian non-reductive physicalists hold that the mind is uniquely irreducible, perhaps due to some limitation of our self-understanding. In this book, Steven Horst argues that this whole conversation is based on assumptions left over from an outdated philosophy of science. While reductionism was part of the philosophical orthodoxy fifty years ago, it has been decisively rejected by philosophers of science over the past thirty years, and for good reason. True reductions are in fact exceedingly rare in the sciences, and the conviction that they were there to be found was an artifact of armchair assumptions of 17th-century Rationalists and 20th-century Logical Empiricists. The explanatory gaps between mind and brain are far from unique. In fact, in the sciences it is gaps all the way down. And if reductions are rare in even the physical sciences, there is little reason to expect them in the case of psychology. Horst argues that this calls for a complete re-thinking of the contemporary problematic in philosophy of mind. Reductionism, dualism, eliminativism and non-reductive materialism are each severely compromised by post-reductionist philosophy of science, and philosophy of mind is in need of a new paradigm.
Horst suggests that such a paradigm might be found in Cognitive Pluralism: the view that human cognitive architecture constrains us to understand the world through a plurality of partial, idealized, and pragmatically-constrained models, each employing a particular representational system optimized for its own problem domain. Such an architecture can explain the disunities of knowledge, and is plausible on evolutionary grounds.
Are liberalism and perfectionism compatible? In this study Steven Wall presents and defends a perfectionist account of political morality that takes issue with many currently fashionable liberal ideas but retains the strong liberal commitment to the ideal of personal autonomy. He begins by critically discussing the most influential version of anti-perfectionist liberalism, examining the main arguments that have been offered in its defence. He then clarifies the ideal of personal autonomy, presents an account of its value and shows that a strong commitment to personal autonomy is fully compatible with an endorsement of perfectionist political action designed to promote valuable pursuits and discourage base ones.
In _Without Criteria_, Steven Shaviro proposes and explores a philosophical fantasy: imagine a world in which Alfred North Whitehead takes the place of Martin Heidegger. What if Whitehead, instead of Heidegger, had set the agenda for postmodern thought? Heidegger asks, "Why is there something, rather than nothing?" Whitehead asks, "How is it that there is always something new?" In a world where everything from popular music to DNA is being sampled and recombined, argues Shaviro, Whitehead's question is the truly urgent one. _Without Criteria_ is Shaviro's experiment in rethinking postmodern theory, especially the theory of aesthetics, from a point of view that hearkens back to Whitehead rather than Heidegger. In working through the ideas of Whitehead and Deleuze, Shaviro also appeals to Kant, arguing that certain aspects of Kant's thought pave the way for the philosophical "constructivism" embraced by both Whitehead and Deleuze. Kant, Whitehead, and Deleuze are not commonly grouped together, but the juxtaposition of them in _Without Criteria_ helps to shed light on a variety of issues that are of concern to contemporary art and media practices.
Standard philosophical explanations of the concept of knowledge invoke a personal goal of having true beliefs, and explain the other requirements for knowledge as indicating the best way to achieve that goal. In this highly original book, Steven L. Reynolds argues instead that the concept of knowledge functions to express a naturally developing kind of social control, a complex social norm, and that the main purpose of our practice of saying and thinking that people 'know' is to improve our system for exchanging information, which is testimony. He makes illuminating comparisons of the knowledge norm of testimony with other complex social norms - such as those requiring proper clothing, respectful conversation, and the complementary virtues of tact and frankness - and shows how this account fits with our concept of knowledge as studied in recent analytic epistemology. His book will interest a range of readers in epistemology, psychology, and sociology.