Risk management of nanotechnology is challenged by the enormous uncertainties about the risks, benefits, properties, and future direction of nanotechnology applications. Because of these uncertainties, traditional risk management principles such as acceptable risk, cost–benefit analysis, and feasibility are unworkable, as is the newest risk management principle, the precautionary principle. Yet, simply waiting for these uncertainties to be resolved before undertaking risk management efforts would not be prudent, in part because of the growing public concerns about nanotechnology driven by risk perception heuristics such as affect and availability. A more reflexive, incremental, and cooperative risk management approach is required, which not only will help manage emerging risks from nanotechnology applications, but will also create a new risk management model for managing future emerging technologies.
As policy makers struggle to develop regulatory oversight models for nanotechnologies, there are important lessons that can be drawn from previous attempts to govern other emerging technologies. Five such lessons are the following: public confidence and trust in a technology and its regulatory oversight is probably the most important factor for the commercial success of a technology; regulation should avoid discriminating against particular technologies unless there is a scientifically based rationale for the disparate treatment; regulatory systems need to be flexible and adaptive to rapidly changing technologies; ethical and social concerns of the public about emerging technologies need to be expressly acknowledged and addressed in regulatory oversight; and international harmonization of regulation may be beneficial in a rapidly globalizing world.
Nanotechnology is the latest in a growing list of emerging technologies that includes nuclear technologies, genetics, reproductive biology, biotechnology, information technology, robotics, communication technologies, surveillance technologies, synthetic biology, and neuroscience. As was the case for many of the technologies that came before, a key question facing nanotechnology is what type of regulatory oversight is appropriate for this emerging technology. As two of us wrote several years ago, the question facing nanotechnology is not whether it will be regulated, but when and how. Yet, appropriate regulation of nanotechnology will be challenging. The term “nanotechnology” incorporates a broad, diverse range of materials, technologies, and products, with an even greater spectrum of potential risks and benefits. This technology cuts across the jurisdiction of many existing regulatory statutes and regulatory agencies, and does so across the globe. Nanotechnology is developing at an enormously rapid rate, perhaps surpassing the capability of any potential regulatory framework to keep pace. Finally, the risks of nanotechnology remain largely unknown, both because of the multitude of variations in the technology and because of the limited applicability of traditional toxicological approaches such as structure-activity relationship to nanotechnology products.
In the TACITUS project for using commonsense knowledge in the understanding of texts about mechanical devices and their failures, we have been developing various commonsense theories that are needed to mediate between the way we talk about the behavior of such devices and causal models of their operation. Of central importance in this effort is the axiomatization of what might be called commonsense metaphysics. This includes a number of areas that figure in virtually every domain of discourse, such as granularity, scales, time, space, material, physical objects, shape, causality, functionality, and force. Our effort has been to construct core theories of each of these areas, and then to define, or at least characterize, a large number of lexical items in terms provided by the core theories. In this paper we discuss our methodological principles and describe the key ideas in the various domains we are investigating.
This distinctive collection by scholars from around the world focuses upon the cultural, educational, and political significance of Richard Rorty's thought. The nine essays which comprise the collection examine a variety of related themes: Rorty's neopragmatism, his view of philosophy, his philosophy of education and culture, Rorty's comparison between Dewey and Foucault, his relation to postmodern theory, and his form of political liberalism.
In this dissertation, the author articulates and defends a version of the historically important view that all consciousness involves self-consciousness. In Chapter 1, the author defends a certain conception of the role of phenomenology in the theory of consciousness. The author argues that any theory of consciousness must account for the properties that phenomenology reveals consciousness to have. The most important properties in this regard are structural: temporality, synchronic unity, and self-referentiality. It is argued that these properties can be given a rigorous description and that mathematical models of the structure of consciousness can be developed. In Chapter 2, the author argues that there are no phenomenological data that preclude a theory that identifies consciousness with a brain process. By carefully attending to what consciousness can and cannot reveal about itself in introspection, the author argues, one can see clearly that phenomenology can play its necessary role in the theory of consciousness without at the same time undermining physicalism. In Chapter 3, the author discusses the structural phenomenological data. The discussion is informed by the classic work of William James as well as the work of the Phenomenological philosophers Edmund Husserl, Jean-Paul Sartre, and Aron Gurwitsch. In Chapter 4, the author examines various versions of the view that all consciousness involves self-consciousness. The author argues that any adequate theory of the type must maintain that states or phases of consciousness must be genuinely self-referential. The author then goes on to present a model of the self-referential structure of consciousness using some tools from contemporary set theory. In Chapter 5, the author replies to certain important objections to the self-referentialist theory of consciousness. In Chapter 6, the author examines some connections the theory defended bears to the mathematico-computational theory of consciousness suggested by Douglas Hofstadter and to the outstanding neurobiological theories of consciousness presented by Antonio Damasio and Gerald Edelman. In Chapter 7, the author argues that the supposition that consciousness is the embodiment of a certain self-referential structure implies elegant solutions to some outstanding problems in the philosophy of mind.
Johnstone, H. W., Jr. Rhetoric and communication in philosophy.--Smith, C. R. and Douglas, D. G. Philosophical principles in the traditional and emerging views of rhetoric.--Wallace, K. R. Bacon's conception of rhetoric.--Thonssen, L. W. Thomas Hobbes's philosophy of speech.--Walter, O. M., Jr. Descartes on reasoning.--Douglas, D. G. Spinoza and the methodology of reflective knowledge in persuasion.--Howell, W. S. John Locke and the new rhetoric.--Doering, J. F. David Hume on oratory.--Douglas, D. G. A neo-Kantian approach to the epistemology of judgment in criticism.--Bevilacqua, V. M. Lord Kames's theory of rhetoric.--Brockriede, W. E. Bentham's philosophy of rhetoric.--Anderson, R. E. Kierkegaard's theory of communication.--Macksoud, S. J. Ludwig Wittgenstein, radical operationism and rhetorical stance.--Stewart, J. J. L. Austin's speech act analysis.--Torrence, D. L. A philosophy of rhetoric from Bertrand Russell.--Clark, A. Martin Buber, dialogue, and the philosophy of rhetoric.--Bennett, W. Kenneth Burke--a philosophy in defense of un-reason.--Dearin, R. D. The philosophical basis of Chaim Perelman's theory of rhetoric.
This volume brings together a diverse range of perspectives reflecting the international appeal and multi-disciplinary interest that Oakeshott now attracts. The essays offer a variety of approaches to Oakeshott’s thought — testament to the abiding depth, originality, suggestiveness and complexity of his writings. The essays include contributions from well-known Oakeshott scholars along with ample representation from a new generation. As a collection these essays challenge Oakeshott’s reputation as merely a ‘critic of social planning’. Contributors include Josiah Lee Auspitz, Debra Candreva, Wendell John Coats Jr., Douglas DenUyl, George Feaver, Paul Franco, Richard Friedman, Timothy Fuller, Robert Grant, Eric S. Kos, Leslie Marsh, Kenneth Minogue, Terry Nardin, Keith Sutherland, Martyn Thompson and Gerhard Wolmarans.
ABSTRACT In this wide-ranging interview Professor Douglas V. Porpora discusses a number of issues. First, how he became a Critical Realist through his early work on the concept of structure. Second, drawing on his Reconstructing Sociology, his take on the current state of American sociology. This leads to discussion of the broader range of his work as part of Margaret Archer’s various Centre for Social Ontology projects, and on moral-macro reasoning and the concept of truth in political discourse.
If “perfectionism” in ethics refers to those normative theories that treat the fulfillment or realization of human nature as central to an account of both goodness and moral obligation, in what sense is “human flourishing” a perfectionist notion? How much of what we take “human flourishing” to signify is the result of our understanding of human nature? Is the content of this concept simply read off an examination of our nature? Is there no place for diversity and individuality? Is the belief that the content of such a normative concept can be determined by an appeal to human nature merely the result of epistemological naiveté? What is the exact character of the connection between human flourishing and human nature? These questions are the ultimate concern of this essay, but to appreciate the answers that will be offered it is necessary to understand what is meant by “human flourishing.” “Human flourishing” is a relatively recent term in ethics. It seems to have developed in the last two decades because the traditional translation of the Greek term eudaimonia as “happiness” failed to communicate clearly that eudaimonia was an objective good, not merely a subjective good.
Discussions of Karl Popper's falsificationist philosophy of science appear regularly in the recent literature on economic methodology. In this literature, there seem to be two fundamental points of agreement about Popper. First, most economists take Popper's falsificationist method of bold conjecture and severe test to be the correct characterization of scientific conduct in the physical sciences. Second, most economists admit that economic theory fails miserably when judged by these same falsificationist standards. As Latsis states, “the development of economic analysis would look a dismal affair through falsificationist spectacles.”
In this article, I argue that Brad Hooker's rule-consequentialism implausibly implies that what earthlings are morally required to sacrifice for the sake of helping their less fortunate brethren depends on whether or not other people exist on some distant planet even when these others would be too far away for earthlings to affect.
Faced with the choice between creating a risk of harm and taking a precaution against that risk, should I take the precaution? Does the proper analysis of this trade-off require a maximizing, utilitarian approach? If not, how does one properly analyze the trade-off? These questions are important, for we often are uncertain about the effects of our actions. Accordingly, we often must consider whether our actions create an unreasonable risk of injury — that is, whether our actions are negligent.
In the extended mind literature, one sometimes finds the claim that there is no neural correlate of consciousness. Instead, there is a biological or ecological correlate of consciousness. Consciousness, it is claimed, supervenes on an entire organism in action. Alva Noë is one of the leading proponents of such a view. This paper resists Noë's view. First, it challenges the evidence he offers from neuroplasticity. Second, it presses a problem with paralysis. Third, it draws attention to a challenge from the existence of metamers and visual illusions.
The Oxford English Dictionary says that a rite is ‘a formal procedure or act in a religious or other solemn observance’. The word comes into English through the French rite from the Latin ritus. Its original meaning escapes etymologists; and this is a mixed blessing, for we neither can nor must attempt a retrieval of its hidden roots. We are told by respectable etymologists that the word is associated from earliest times with Latin religious usage, but that even in the early Latin it was already extended to ‘custom, usage, manner or way’ of a non-religious sort. [Lewis and Short, A Latin Dictionary.] So, too, in modern languages the terms ‘rite’ and ‘ritual’ have specifically religious meaning, but they are also used in social and cultural settings that we would not call religious. What first strikes us about the terms ‘rite’ and ‘ritual’ is an emphasis upon a certain formality, upon a regular and stable way in which an action or set of actions is to be performed. A ritual is more than a formalism, however, since there are formalisms that are not rites, such as the logical rules for making a valid argument. Moreover, the term is frequently associated with the terms ‘myth’, ‘symbol’ and ‘faith’. These, too, are primarily religious, but are also extended to non-religious contexts. Indeed, there seems to be a network of such terms whose usage touches upon some extraordinary quality in things. Like them, the term ‘ritual’ shares both a wide variety of meanings and a certain hint of impropriety. The variety of ritual forms is notorious, ranging from the most sacred religious liturgies to the absurdities of a fraternity initiation; and the impropriety of the term breaks out whenever we brand a certain action ‘ritualistic’, just as we sometimes refer slightingly to an assertion, saying it is ‘mythical’, ‘merely symbolic’ or ‘credulous’.
A rational defense of the criminal law must provide a comprehensive theory of culpability. A comprehensive theory of culpability must resolve several difficult issues; in this article I will focus on only one. The general problem arises from the lack of a systematic account of relative culpability. An account of relative culpability would identify and defend a set of considerations to assess whether, why, under what circumstances, and to what extent persons who perform a criminal act with a given culpable state are more or less blameworthy than persons who perform that act with a different culpable state.
I. Beyond Utilitarianism
In the summer of 1982, I published an article called “Missiles and Morals,” in which I argued on utilitarian grounds that nuclear deterrence in its present form is not morally justifiable. The argument of “Missiles and Morals” compared the most likely sort of nuclear war to develop under nuclear deterrence with the most likely sort of nuclear war to develop under American unilateral nuclear disarmament. For a variety of reasons, I claimed that the number of casualties in a two-sided nuclear war developing under DET would be at least fifteen times greater than the number of casualties in a one-sided nuclear attack developing under UND. If one assumes that human lives lost or saved is the principal criterion by which nuclear weapons policies should be measured, it follows that DET is morally superior to UND on utilitarian grounds only if the chance of a two-sided nuclear war under DET is more than fifteen times less than the chance of a one-sided nuclear attack under UND. Since I did not believe that the chance of nuclear war under deterrence is fifteen times less than the chance of nuclear war under unilateral nuclear disarmament, I inferred that utilitarianism failed to justify DET. Indeed, on utilitarian grounds, DET stood condemned.
Commonsense Consequentialism is a book about morality, rationality, and the interconnections between the two. In it, Douglas W. Portmore defends a version of consequentialism that both comports with our commonsense moral intuitions and shares with other consequentialist theories the same compelling teleological conception of practical reasons. Broadly construed, consequentialism is the view that an act's deontic status is determined by how its outcome ranks relative to those of the available alternatives on some evaluative ranking. Portmore argues that outcomes should be ranked, not according to their impersonal value, but according to how much reason the relevant agent has to desire that each outcome obtains and that, when outcomes are ranked in this way, we arrive at a version of consequentialism that can better account for our commonsense moral intuitions than even many forms of deontology can. What's more, Portmore argues that we should accept this version of consequentialism, because we should accept both that an agent can be morally required to do only what she has most reason to do and that what she has most reason to do is to perform the act that would produce the outcome that she has most reason to want to obtain. Although the primary aim of the book is to defend a particular moral theory, Portmore defends this theory as part of a coherent whole concerning our commonsense views about the nature and substance of both morality and rationality. Thus, it will be of interest not only to those working on consequentialism and other areas of normative ethics, but also to those working in metaethics. Beyond offering an account of morality, Portmore offers accounts of practical reasons, practical rationality, and the objective/subjective obligation distinction.
To be truly provocative and outrageous the superior philosophical sophistry will commonly possess four somewhat adventitious features. I shall rate it as classic if it has all four. First, and least adventitiously, the argument will be crisp and initially seductive. Second, by the standard the sophistry sets, direct rebuttal will be laborious and diffuse. Third, the recipe for the latter will prescribe that we pick out some hitherto unarticulated logical principle such that if the principle be true then the sophistical argument must be invalid, and then, on the strength of that consequence assume the principle to be true. Consequently and fourth, as an antidote parody is supreme. With a persuasive absence of fuss and bias we can turn the tables if we show that, if the sophistical argument were really valid, then some structurally similar argument would prove just as consummately far too much. In short, from the rhetorical point of view at least, Gaunilo is more lethal than Kant. Even if the similarity is defective, the sophist will lose some of his adventitious and insufferable poise, if he ventures to show why.
Since something cannot be conscious without being a conscious subject, a complete physicalist explanation of consciousness must resolve an issue first raised by Thomas Nagel, namely to explain why a particular mass of atoms that comprises my body gives rise to me as conscious subject, rather than someone else. In this essay, I describe a thought-experiment that suggests that physicalism lacks the resources to address Nagel's question and seems to pose a counter-example to any form of non-reductive physicalism relying on the mind–body supervenience thesis, which would include William Hasker's emergent dualism. Since the particular thought-experiment does not pose any problems for classical substance dualism and since the problem, as I call it, of explaining subjectivity is the central problem of mind, I conclude that CSD is better supported than any form of non-reductive physicalism.
Old philosophical problems never die, but they can be reinterpreted. In this paper, I offer a reinterpretation of the problem of reconciling divine omniscience and human free will. Classical discussions of this problem concentrate on the nature of God and the concept of free will. The present discussion will focus attention on the concept of knowledge, drawing on developments in epistemology that resulted from the posing of a certain problem by Edmund Gettier in 1963.
One version of the free-will argument relies on the claim that, other things being equal, a world in which free beings exist is morally preferable to a world in which free beings do not exist . I argue that this version of the free-will argument cannot support a theodicy that should alleviate the doubts about God's existence to which the problems of evil give rise. In particular, I argue that the value thesis has no foundation in common intuitions about morality. Without some sort of intuitive support, the value thesis lacks the resources to serve as the foundation for a theodicy that addresses the powerful intuition, which affects believers and non-believers alike, that a perfect God would not allow so much evil.
Ever since the Proslogion was first circulated, critics have been bemused by St Anselm's brazen attempt to establish a matter of fact, namely, God's existence, from the simple analysis of a term or concept. Yet every critic who has proposed to ‘write the obituary’ of the Ontological Argument has found it to be remarkably resilient. At the risk of adding to a record of failures, I want to venture a new method for attacking this durable argument. Neither the common version of Anselm's argument from Chapter II of the Proslogion nor the previously unrecognized modal version uncovered by Norman Malcolm from Pros. III can possibly get under way without Anselm's celebrated assertion that God is that than which no greater can be conceived.
During the Gulf war, CNN correspondent Peter Arnett distinguished himself with his courageous reporting in Iraq while under fire by the U.S.-led coalition which dropped more bombs on Iraq than were unleashed in World War II. Reporting live from Baghdad throughout the war, Arnett provided vivid daily accounts of life in Iraq during one of the most sustained air attacks in history. From his live telephone reporting of the early hours of the U.S. attack on Iraq in January 1991 through his live satellite reports of the effects of the daily bombing of Iraq, Arnett distinguished himself through his attempts to cut through the lies and disinformation of both sides and to provide accurate reporting on the effects of the U.S.-led coalition assault against Iraq.
Fundamentals of Critical Argumentation presents the basic tools for the identification, analysis, and evaluation of common arguments for beginners. The book teaches by using examples of arguments in dialogues, both in the text itself and in the exercises. Examples of controversial legal, political, and ethical arguments are analyzed. Illustrating the most common kinds of arguments, the book also explains how to evaluate each kind by critical questioning. Douglas Walton shows how arguments can be reasonable under the right dialogue conditions by using critical questions to evaluate them. The book teaches by example, both in the text itself and in exercises, but it is based on methods that have been developed through the author's thirty years of research in argumentation studies.
Kenneth F. Schaffner compares the practice of biological and medical research and shows how traditional topics in philosophy of science--such as the nature of theories and of explanation--can illuminate the life sciences. While Schaffner pays some attention to the conceptual questions of evolutionary biology, his chief focus is on the examples that immunology, human genetics, neuroscience, and internal medicine provide for examinations of the way scientists develop, examine, test, and apply theories. Although traditional philosophy of science has regarded scientific discovery--the questions of creativity in science--as a subject for psychological rather than philosophical study, Schaffner argues that recent work in cognitive science and artificial intelligence enables researchers to rationally analyze the nature of discovery. As a philosopher of science who holds an M.D., he has examined biomedical work from the inside and uses detailed examples from the entire range of the life sciences to support the semantic approach to scientific theories, addressing whether there are "laws" in the life sciences as there are in the physical sciences. Schaffner's novel use of philosophical tools to deal with scientific research in all of its complexity provides a distinctive angle on basic questions of scientific evaluation and explanation.
Although fallacies have been common since Aristotle, until recently little attention has been devoted to identifying and defining them. Furthermore, the concept of fallacy itself has lacked a sufficiently clear meaning to make it a useful tool for evaluating arguments. Douglas Walton takes a new analytical look at the concept of fallacy and presents an up-to-date analysis of its usefulness for argumentation studies. Walton uses case studies illustrating familiar arguments and tricky deceptions in everyday conversation where the charge of fallaciousness is at issue. The numerous case studies show in concrete terms many practical aspects of how to use textual evidence to identify and analyze fallacies and to evaluate arguments as fallacious. Walton looks at how an argument is used in the context of conversation. He defines a fallacy as a conversational move, or sequence of moves, that is supposed to be an argument that contributes to the purpose of the conversation but in reality interferes with it. The view is a pragmatic one, based on the assumption that when people argue, they do so in a context of dialogue, a conventionalized normative framework that is goal-directed. Such a contextual framework is shown to be crucial in determining whether an argument has been used correctly. Walton also shows how examples of fallacies given in the logic textbooks characteristically turn out to be variants of reasonable, even if defeasible or questionable arguments, based on presumptive reasoning. This is the essence of the evaluation problem. A key thesis of the book, which must not be taken for granted as previous textbooks have so often done, is that you can spot a fallacy from how it was used in a context of dialogue. This is an innovative and even, as Walton notes, "a radical and controversial" theory of fallacy.
Properties and objects are everywhere, but remain a philosophical mystery. Douglas Ehring argues that the idea of tropes--properties and relations understood as particulars--provides the best foundation for a metaphysical account of properties and objects. He develops and defends a new theory of trope nominalism.