Risk management of nanotechnology is challenged by the enormous uncertainties about the risks, benefits, properties, and future direction of nanotechnology applications. Because of these uncertainties, traditional risk management principles such as acceptable risk, cost–benefit analysis, and feasibility are unworkable, as is the newest risk management principle, the precautionary principle. Yet, simply waiting for these uncertainties to be resolved before undertaking risk management efforts would not be prudent, in part because of the growing public concerns about nanotechnology driven by risk perception heuristics such as affect and availability. A more reflexive, incremental, and cooperative risk management approach is required, which not only will help manage emerging risks from nanotechnology applications, but will also create a new risk management model for managing future emerging technologies.
Like all technologies, nanotechnology will inevitably present risks, whether they result from unintentional effects of otherwise beneficial applications, or from the malevolent misuse of technology. Increasingly, risks from new and emerging technologies are being regulated at the international level, although governments and private experts are only beginning to consider the appropriate international responses to nanotechnology. In this paper, we explore both the potential risks posed by nanotechnology and potential regulatory frameworks that law may impose. In so doing, we also explore the various rationales for international regulation, including the potential for cross-boundary harms, sharing of regulatory expertise and resources, controlling protectionism and trade conflicts, avoiding a “race to the bottom” in which governments seek economic advantage through lax regulation, and limiting the “nano divide” between North and South. Finally, we examine some models for international regulation and offer tentative thoughts on the prospects for each.
There is much we do not know about nanotechnology. Despite its tremendous promise, nanotechnology today is mostly forecast and fervent hope. Predictions that spending on nanotechnology will increase from current levels of $13 billion to more than $1 trillion by 2015 are no more than that – simply predictions. Hopes that nanotechnology will be an essential part of solving the globe's energy, food, and water problems should be tempered by recalling a century of revolutionary technologies that failed to live up to their early promise, such as nuclear energy, supersonic airplanes, and gene therapy. Many other questions continue to nip at nanotechnology's heels, not the least of which are debates about what is and is not technically feasible. Despite these uncertainties, we can have complete confidence in one aspect of nanotechnology's future – it will be subject to a host of regulations.
Scientific research is subject to a number of regulations which impose incidental (time, place), rather than substantive (type of research), restrictions on scientific research and the knowledge created through such research. In recent years, however, the premise that scientific research and knowledge should be free from substantive regulation has increasingly been called into question. Some have suggested that the law should be used as a tool to substantively restrict research which is dual-use in nature or which raises moral objections. There are, however, some problems with using law to restrict or prohibit certain types of scientific research, including (i) the inherent imprecision of law for regulating complex and rapidly evolving scientific research; (ii) the difficulties of enforcing legal restrictions on an activity that is international in scope; (iii) the limited predictability of the consequences of restricting specific branches of scientific research; (iv) inertia in the legislative process; and (v) the susceptibility of legislators and regulators to inappropriate factors and influence. Rather than using law to restrict scientific research, it may be more appropriate and effective to use a combination of non-traditional legal tools including norms, codes of conduct, restrictions on publication, and scientist-developed voluntary standards to regulate problematic scientific research.
As policy makers struggle to develop regulatory oversight models for nanotechnologies, there are important lessons that can be drawn from previous attempts to govern other emerging technologies. Five such lessons are the following: public confidence and trust in a technology and its regulatory oversight are probably the most important factors for the commercial success of a technology; regulation should avoid discriminating against particular technologies unless there is a scientifically based rationale for the disparate treatment; regulatory systems need to be flexible and adaptive to rapidly changing technologies; ethical and social concerns of the public about emerging technologies need to be expressly acknowledged and addressed in regulatory oversight; and international harmonization of regulation may be beneficial in a rapidly globalizing world.
Nanotechnology is the latest in a growing list of emerging technologies that includes nuclear technologies, genetics, reproductive biology, biotechnology, information technology, robotics, communication technologies, surveillance technologies, synthetic biology, and neuroscience. As was the case for many of the technologies that came before, a key question facing nanotechnology is what type of regulatory oversight is appropriate for this emerging technology. As two of us wrote several years ago, the question facing nanotechnology is not whether it will be regulated, but when and how. Yet, appropriate regulation of nanotechnology will be challenging. The term “nanotechnology” incorporates a broad, diverse range of materials, technologies, and products, with an even greater spectrum of potential risks and benefits. This technology cuts across the jurisdiction of many existing regulatory statutes and regulatory agencies, and does so across the globe. Nanotechnology is developing at an enormously rapid rate, perhaps surpassing the capability of any potential regulatory framework to keep pace. Finally, the risks of nanotechnology remain largely unknown, both because of the multitude of variations in the technology and because of the limited applicability of traditional toxicological approaches such as structure–activity relationships to nanotechnology products.
As the health care system transitions to a precision medicine approach that tailors clinical care to the genetic profile of the individual patient, there is a potential tension between the clinical uptake of new technologies by providers and the legal system's expectation of the standard of care in applying such technologies. We examine this tension by comparing the type of evidence that physicians and courts are likely to rely on in determining a duty to recommend pharmacogenetic testing of patients prescribed the oral anticoagulant drug warfarin. There is a large body of inconsistent evidence and competing factors for and against such testing, and physicians and courts are likely to weigh this evidence differently. The potential implications for medical malpractice risk are evaluated and discussed.
Clinical trials of nanotechnology medical products present complex risk management challenges that involve many uncertainties and important risk-risk trade-offs. This paper inquires whether the precautionary principle can help to inform risk management approaches to nanomedicine clinical trials. It concludes that prudent precaution may be appropriate for ensuring the safety of such trials, but that the precautionary principle itself, especially in its more extreme forms, does not provide useful guidance for specific safety measures.
Medical technologies, including nanomedicine products, are intended to improve health but in many cases may also create their own health risks. Medical products that create their own health risks differ from most other risk-creating technologies in that the very purpose of the medical technology is to prevent or treat health risks. This paradox of technologies intended to reduce existing risks that may have the effect of creating new risks has two conflicting implications. On one hand, we may be more tolerant of health risks from medical technologies because these products are intended to, and often do, reduce overall health risks and improve our health. The health benefits of a medical technology may outweigh the unavoidable adverse effects of that same technology in an individual patient or in the overall treated population.
This book offers a powerful response to what Varner calls the "two dogmas of environmental ethics"--the assumptions that animal rights philosophies and anthropocentric views are each antithetical to sound environmental policy. Allowing that every living organism has interests which ought, other things being equal, to be protected, Varner contends that some interests take priority over others. He defends both a sentientist principle giving priority to the lives of organisms with conscious desires and an anthropocentric principle giving priority to certain very inclusive interests which only humans have. He then shows that these principles not only comport with but provide significant support for environmental goals.
Drawing heavily on recent empirical research to update R.M. Hare's two-level utilitarianism and expand Hare's treatment of "intuitive level rules," Gary Varner considers in detail the theory's application to animals while arguing that Hare should have recognized a hierarchy of persons, near-persons, and the merely sentient.
E-Z Reader 7 is a processing model of eye-movement control. One constraint imposed on the model is that high-level cognitive processes do not influence eye movements unless normal reading processes are disturbed. I suggest that this constraint is unnecessary, and that the model provides a sensible architecture for explaining how both low- and high-level processes influence eye movements.
Much of the scientific literature on vegetarian nutrition leaves one with the impression that vegan diets are significantly more risky than omnivorous ones, especially for individuals with high metabolic demands (such as pregnant or lactating women and children). But nutrition researchers have tended to skew their study populations toward new vegetarians, members of religious sects with especially restrictive diets and tendencies to eschew fortified foods and medical care, and these are arguably the last people we would expect to thrive on vegan diets. Researchers also have some tendency to play up weakly confirmed risks of vegan diets vis-à-vis equally weakly confirmed benefits. And, in spite of these methodological and rhetorical biases, for every nutrient which vegans are warned to be cognizant of, there is reason to believe that they are not at significantly greater risk of nutritional deficiency than omnivores.
In his recent essay on moral pluralism in environmental ethics, J. Baird Callicott exaggerates the advantages of monism, ignoring the environmentally unsound implications of Leopold's holism. In addition, he fails to see that Leopold's view requires the same kind of intellectual schizophrenia for which he criticizes the version of moral pluralism advocated by Christopher D. Stone in Earth and Other Ethics. If it is plausible to say that holistic entities like ecosystems are directly morally considerable – and that is a very big if – it must be for a very different reason than is usually given for saying that individual human beings are directly morally considerable.
The standard means of seeking the classical limit in Bohmian mechanics is through the imposition of vanishing quantum force and quantum potential for pure states. We argue that this approach fails, and that the Bohmian classical limit can be realized only by combining narrow wave packets, mixed states, and environmental decoherence.
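For readers unfamiliar with the quantities named in this abstract, the standard Bohmian definitions can be stated briefly (these are textbook background, not claims specific to this paper's argument). Writing the wave function in polar form yields the quantum potential and quantum force:

```latex
% Polar decomposition of the wave function
\psi(\mathbf{x},t) = R(\mathbf{x},t)\, e^{i S(\mathbf{x},t)/\hbar}

% Quantum potential, arising from the amplitude R
Q = -\frac{\hbar^{2}}{2m}\,\frac{\nabla^{2} R}{R}

% Quantum force on the Bohmian particle
\mathbf{F}_{Q} = -\nabla Q
```

The "standard means of seeking the classical limit" criticized above is the requirement that $Q$ and $\mathbf{F}_{Q}$ become negligible, so that the particle trajectory obeys Newton's second law under the classical potential alone.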
In this essay I criticize recent attempts to prove that the concept of lying does not include the intent to deceive. I argue that examples by Isenberg and Carson fail to prove that one can lie without intending to deceive and, furthermore, that untoward consequences would follow if these authors were correct. I conclude that since intending to deceive is indeed a necessary condition of lying, the class of statements that constitute lies is smaller than what Isenberg et al. would suggest. Hence the class of deceptive advertisements is also correspondingly smaller.
Without looking beyond the conditions under which laying hens typically live in the contemporary U.S. egg industry, we can understand why the production and consumption of factory farmed eggs could be judged immoral. However, the question, What (if anything) is wrong with animal by-products? cannot always be adequately answered by looking at the conditions under which animals live out their productive lives. For the dairy industry looks benign in those terms, but if we look beyond the conditions under which milk cows live, we can better understand some animal rights activists' reasons for objecting to dairy products. The contemporary U.S. dairy industry requires a slaughter industry between one-seventh and one-third the size of the contemporary beef industry. Today, beef slaughter is vastly more humane than poultry slaughter, but if today's beef slaughter industry is judged immoral, the contemporary dairy industry should be judged similarly immoral, because the two are wedded. This is the deep reason for moral suspicion of the dairy industry.
In his recent article "Should Trees Have Standing? Revisited," Christopher D. Stone has effectively withdrawn his proposal that natural objects be granted legal rights, in response to criticism from the Feinberg/McCloskey camp. Stone now favors a weaker proposal that natural objects be granted what he calls legal "considerateness". I argue that Stone's retreat is both unnecessary and undesirable. I develop the notion of a "de facto" legal right and argue that species already have de facto legal rights as statutory beneficiaries of the Endangered Species Act of 1973. I conclude that granting certain nonhuman natural entities legal rights is both more important and less costly than Stone and his critics have realized, and that it is not Stone's original proposal which needs rethinking, but the concept of interests at work in the Feinberg/McCloskey position.
Standard economic theory does not capture trust among anonymous Internet traders. But when traders are allowed to have social preferences, uncertainty about a seller's morals opens the door for trust, reward, exploitation and reputation building. We report experiments suggesting that sellers' intrinsic motivations to be trustworthy are not sufficient to sustain trade when not complemented by a feedback system. We demonstrate that it is the interaction of social preferences and cleverly designed reputation mechanisms that solves to a large extent the trust problem on Internet market platforms. However, economic theory and social preference models tend to underestimate the difficulties of promoting trust in anonymous online trading communities.
In "Use and Abuse Revisited: Response to Pluhar and Varner," Kathryn Paxton George misunderstands the point of my essay, "In Defense of the Vegan Ideal: Rhetoric and Bias in the Nutrition Literature." I did not claim that the nutrition literature unambiguously confirms that vegans are not at significantly greater risk of deficiencies than omnivores. Rather than settling any empirical controversy, my aim was to show how the literature can give the casual reader a skewed impression of what is known about the risks of a vegan diet. In this brief rejoinder, I illustrate how two essays by nutritionists in the same volume as George's and my essays, and a referee's report on my manuscript which was authored by a nutritionist, confirm the soundness of this basic insight.
Charles S. Peirce sketches "a nest of three arguments for the Reality of God" in his article "A Neglected Argument for the Reality of God." I provide careful analysis and explication of Peirce's argument, along with consideration of some objections. I argue that there are significant differences between Peirce's neglected argument and the traditional arguments for God's existence; Peirce's analysis of the neglected argument into three arguments is misleading; there are two distinct levels of argument that Peirce does not recognize; and it is doubtful whether the argument meets all the criteria set by Peirce himself.
In this essay it will be argued that if preferential treatment for individuals who have suffered from past discrimination is permissible in any context, it should be extended to the allocation of scarce medical resources. This contention will be based on two facts: first, that health care, in particular certain life-saving operations, constitutes a scarce social good similar to but more important than other social goods such as desirable jobs and positions in desirable professional schools; and second, that a claim can plausibly be made that the greater incidence of death due to heart disease among blacks is a result of the effect of past discrimination. In addition, an argument to the effect that preferential treatment is indeed permissible will be sketched that is based upon a critique of the decision in the Bakke case.
The qigong state of bigu is believed to be supported by the absorption of qi from the universe. Gamma radiation is ubiquitous in the cosmos and, according to some, may be a possible source of energy for cellular functioning. When the concept of energy is integrated with the concept of dynamical systems, the logic leads to the theory—termed systemic memory—that predicts all systems, from the micro to macro, store information and energy to various degrees. New research indicates that the human body absorbs gamma radiation from the environment and emits high-frequency X rays. There are substantial individual differences in these effects that appear to be related, in part, to the psychological state of the person. Future research can determine if bigu is associated with increased absorption of gamma radiation and/or decreased emission of high-frequency X rays. The hypothesis that qi can be viewed as quality information is proposed.
Laboratory studies find a strategic component to moral behaviour that differs in significant ways from common perceptions of how morality works. Models based on a preference for relative payoffs offer an explanation.
This volume presents readings in philosophy from around the world and across history, from the Buddha to bell hooks, organized around traditional Anglo-European philosophical themes such as freedom and the existence of God. An introductory section discusses the nature of philosophy and gives advice on reading philosophical texts, and introductions to selections provide background and questions for thought.
Edmund Husserl's historical importance is marked by a curious conjunction. He is easily among the most influential philosophers of the twentieth century and yet no one has taken up his view. The first of these has received a monumental amount of consideration, the second virtually none. But the second, in its own way, is at least equally remarkable. In this essay I will consider why his view of philosophy found no subscribers and what we might make of this legacy of failure.
This book addresses the question of deconstruction by asking what it is and discussing its alternatives. To what extent does deconstruction derive from a philosophical stance, and to what extent does it depend upon a set of strategies, moves, and rhetorical practices that result in criticism? Special attention is given to the formulations offered by Jacques Derrida and by Paul de Man. And what, in deconstructive terms, does it mean to translate from one textual corpus into another? Is it a matter of different theories of translation or of different practices? And what of difference itself? Does not difference already invoke the possibility of deconstruction's "others"? Althusser, Adorno, and Deleuze are offered as exemplary cases. The essays in this volume examine in detail these differences and alternatives. The Textual Sublime is particularly concerned with how a text sets its own limits, borders, and margins, how it delimits what constitutes the text per se and how it invokes at the same time what is not determinately in the text. The textual sublime is that aspect of a text that deconstruction shows to be both an element of the text and what surpasses the text, what takes it outside itself and what ties it to differing philosophical, rhetorical, historical, and critical practices.