How are we to appraise new technological developments that may bring revolutionary social changes? Currently this is often done by trying to predict or anticipate social consequences and to use these as a basis for moral and regulatory appraisal. Such an approach, however, cannot deal with the uncertainties and unknowns that are inherent in social changes induced by technological development. An alternative approach is proposed that conceives of the introduction of new technologies into society as a social experiment. An ethical framework for the acceptability of such experiments is developed based on the bioethical principles for experiments with human subjects: non-maleficence, beneficence, respect for autonomy, and justice. This provides a handle for the moral and regulatory assessment of new technologies and their impact on society.
Three philosophical perspectives on the relation between technology and society are distinguished and discussed: 1) technology as an autonomous force that determines society; 2) technology as a human construct that can be shaped by human values; and 3) a co-evolutionary perspective on technology and society in which neither determines the other. The historical evolution of the three perspectives is discussed, and it is argued that all three are still present in current debates about technological change and how it may affect society. This is illustrated for the case of Artificial Intelligence (AI). It is argued that each of the three perspectives contributes to the debate about AI, but that the third has the strongest potential to uncover blind spots in the current debate.
Safe-by-design (SbD) aims at addressing safety issues already during the R&D and design phases of new technologies. SbD has become increasingly popular in the last few years for addressing the risks of emerging technologies like nanotechnology and synthetic biology. We ask to what extent SbD approaches can deal with uncertainty, in particular with indeterminacy, i.e., the fact that the actual safety of a technology depends on the behavior of actors in the value chain, such as users and operators. We argue that while indeterminacy may be approached by designing out users as much as possible in attaining safety, this is often not a good strategy. It will not only make it more difficult to deal with unexpected risks; it also misses out on the resources that users can bring for achieving safety, and it is undemocratic. We argue that rather than directly designing for safety, it is better to design for the responsibility for safety, i.e., designers should consider where the responsibility for safety is best situated and design technologies accordingly. We propose some heuristics that can be used in deciding how to share and distribute responsibility for safety through design.
This paper aims at contributing to a research agenda in engineering ethics by exploring the ethical aspects of engineering design processes. A number of ethically relevant topics with respect to design processes are identified; these topics could be a subject for further research in the field of engineering ethics. In addition, it is argued that the way design processes are currently organised, and how they should be organised from a normative point of view, is an important topic for research.
Courses on ethics and technology have become compulsory for many students at the three Dutch technical universities during the past few years. During this time, teachers have faced a number of didactic problems, which are partly due to a growing number of students. In order to deal with these challenges, teachers in ethics at the three technical universities in the Netherlands (in Delft, Eindhoven and Twente) have developed a web-based computer program called Agora (see www.ethicsandtechnology.com). This program enables students to exercise their ethical understanding and skills extensively. The program makes it possible for students to participate actively in moral reflection and reasoning, and to develop the moral competencies that are needed in their later professional practice. The developers of the program have tried to avoid two traps. Firstly, they rejected from the outset a cookbook style of dealing with ethical problems, which applied ethics is often taken to be; secondly, they wanted to design a flexible program that respects the student's as well as the teacher's creativity, and that tries to engage students in moral reflection. Agora meets these requirements. The program offers possibilities that extend beyond the requirements that are usually accepted for case exercises in applied ethics, and that have been realised in several other computer models for teaching ethics. In this article, we describe the main considerations in the development of Agora and the features of the resulting program.
The Safe-by-Design approach in synthetic biology holds the promise of designing the building blocks of life in an organism guided by the value of safety. This paves a new way for using biotechnologies safely. However, the Safe-by-Design approach moves the bulk of the responsibility for safety to the actors in the research and development phase. It also assumes that safety can be defined and understood by all stakeholders in the same way. These assumptions are problematic and might actually undermine safety. This research explores these assumptions through the use of a Group Decision Room. In this setup, anonymous and non-anonymous deliberation methods are used for different stakeholders to exchange views. During the session, a potential synthetic biology application is used as a case for investigation: the Food Warden, a biosensor contained in meat packaging for indicating the freshness of meat. Participants discuss what potential issues might arise, how responsibilities should be distributed in a forward-looking way, and who is to blame if something goes wrong. They are also asked what safety and responsibility mean at different phases, and for different stakeholders. The results of the session are not generalizable, but provide valuable insights. Issues of safety cannot all be taken care of in the R&D phase. Also, when things go wrong, there are proximal and distal causes to consider. In addition, the capacities of actors play an important role in defining their responsibilities. Last but not least, this research provides a new perspective on the role of instruction manuals in achieving safety.
In recent years, informed consent has been suggested as a way to deal with risks posed by engineered nanomaterials. We argue that while we can learn from experiences with informed consent in treatment and research contexts, we should be aware that informed consent traditionally pertains to certain features of the relationships between doctors and patients and between researchers and research participants, rather than those between producers and consumers and between employers and employees, which are more prominent in the case of engineered nanomaterials. To better understand these differences, we identify three major relational factors that influence whether valid informed consent is obtainable, namely dependency, personal proximity, and the existence of shared interests. We show that each type of relationship offers different opportunities for reflection and therefore poses distinct challenges for obtaining valid informed consent. Our analysis offers a systematic understanding of the possibilities for attaining informed consent in the context of nanomaterial risks and makes clear that measures or regulations to improve the obtainment of informed consent should be attuned to the specific interpersonal relations to which it is supposed to apply.
Erratum to: Book Symposium on Peter-Paul Verbeek's Moralizing Technology: Understanding and Designing the Morality of Things (Chicago: University of Chicago Press, 2011). Philosophy & Technology, pp. 1-27, DOI 10.1007/s13347-011-0058-z. Authors: Evan Selinger (Dept. Philosophy, Rochester Institute of Technology, Rochester, NY, USA), Don Ihde (Dept. Philosophy, Stony Brook University, Stony Brook, NY, USA), Ibo van de Poel (Delft University of Technology, Delft, the Netherlands), Martin Peterson (Eindhoven University of Technology, Eindhoven, the Netherlands), and Peter-Paul Verbeek (Dept. Philosophy, Twente University, Enschede, the Netherlands).