In the age of Big Data, companies and governments are increasingly using algorithms to inform hiring decisions, employee management, policing, credit scoring, insurance pricing, and many more aspects of our lives. Artificial intelligence systems can help us make evidence-driven, efficient decisions, but can also confront us with unjustified, discriminatory decisions wrongly assumed to be accurate because they are made automatically and quantitatively. It is becoming evident that these technological developments are consequential to people’s fundamental human rights. Despite increasing attention to these urgent challenges in recent years, technical solutions to these complex socio-ethical problems are often developed without empirical study of societal context and the critical input of societal stakeholders who are impacted by the technology. On the other hand, calls for more ethically and socially aware AI often fail to provide answers for how to proceed beyond stressing the importance of transparency, explainability, and fairness. Bridging these socio-technical gaps and the deep divide between abstract value language and design requirements is essential to facilitate nuanced, context-dependent design choices that will support moral and social values. In this paper, we bridge this divide through the framework of Design for Values, drawing on methodologies of Value Sensitive Design and Participatory Design to present a roadmap for proactively engaging societal stakeholders to translate fundamental human rights into context-dependent design requirements through a structured, inclusive, and transparent process.
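As a purely illustrative aside (the decisions, group labels, and threshold below are invented and do not come from the paper), the following sketch shows one way an abstract value such as non-discrimination can be rendered as a testable design requirement for a hiring system: a bound on the difference in selection rates between groups.

```python
# Purely illustrative sketch: turning "non-discrimination" into a measurable
# design requirement for a hiring system. Data, group labels, and the
# threshold are hypothetical and not taken from the paper.

def selection_rate(decisions, attribute, group):
    """Share of applicants in `group` who received a positive decision."""
    subset = [d for d in decisions if d[attribute] == group]
    return sum(d["hired"] for d in subset) / len(subset)

decisions = [  # toy, invented data
    {"hired": 1, "gender": "f"}, {"hired": 0, "gender": "f"},
    {"hired": 1, "gender": "m"}, {"hired": 1, "gender": "m"},
]

gap = abs(selection_rate(decisions, "gender", "f")
          - selection_rate(decisions, "gender", "m"))
MAX_GAP = 0.2  # illustrative threshold a design team might agree on
print(f"selection-rate gap: {gap:.2f}",
      "requirement met" if gap <= MAX_GAP else "requirement violated")
```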
When thinking about ethics, technology is often only mentioned as the source of our problems, not as a potential solution to our moral dilemmas. When thinking about technology, ethics is often only mentioned as a constraint on developments, not as a source and spring of innovation. In this paper, we argue that ethics can be the source of technological development rather than just a constraint, and that technological progress can create moral progress rather than just moral problems. We show this through an analysis of how technology can contribute to the solution of so-called moral overload or moral dilemmas. Such dilemmas typically create a moral residue that is the basis of a second-order principle that tells us to reshape the world so that we can meet all our moral obligations. We can do so, among other things, through guided technological innovation.
Information technology is an integral part of the practices and institutions of post-industrial society. It is also a source of hard moral questions and thus is both a probing and relevant area for moral theory. In this volume, an international team of philosophers sheds light on many of the ethical issues arising from information technology, including informational privacy, the digital divide and equal access, e-trust and tele-democracy. Collectively, these essays demonstrate how accounts of equality and justice, property and privacy benefit from taking into account how information technology has shaped our social and epistemic practices and our moral experiences. Information technology changes the way that we look at the world and deal with one another. It calls, therefore, for a re-examination of notions such as friendship, care, commitment and trust.
It has been argued that the Internet and social media increase the number of viewpoints, perspectives, ideas and opinions available, leading to a very diverse pool of information. However, critics have argued that algorithms used by search engines, social networking platforms and other large online intermediaries actually decrease information diversity by forming so-called “filter bubbles”. This may form a serious threat to our democracies. In response to this threat, others have developed algorithms and digital tools to combat filter bubbles. This paper first provides examples of different software designs that try to break filter bubbles. Secondly, we show how norms required by two democracy models dominate the tools that are developed to fight filter bubbles, while norms of other models are completely missing from the tools. The paper concludes by arguing that democracy itself is a contested concept and points to a variety of norms. Designers of diversity-enhancing tools must thus be exposed to diverse conceptions of democracy.
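As a purely hypothetical illustration of what a diversity-enhancing tool might do under the hood (this is not any of the software designs examined in the paper; the items, viewpoint labels, and relevance scores are invented), the sketch below re-ranks a feed so that distinct viewpoints surface near the top.

```python
# Hypothetical sketch of a diversity-enhancing re-ranker: greedily prefer
# the most relevant item whose viewpoint has not yet been shown.
# Item data and viewpoint labels are invented.

def diversify(items, k):
    """Select k items, favouring viewpoints that are not yet represented."""
    selected, seen = [], set()
    remaining = sorted(items, key=lambda i: i["relevance"], reverse=True)
    while remaining and len(selected) < k:
        pick = next((i for i in remaining if i["viewpoint"] not in seen),
                    remaining[0])
        selected.append(pick)
        seen.add(pick["viewpoint"])
        remaining.remove(pick)
    return selected

feed = [
    {"title": "A", "viewpoint": "left",   "relevance": 0.9},
    {"title": "B", "viewpoint": "left",   "relevance": 0.8},
    {"title": "C", "viewpoint": "right",  "relevance": 0.7},
    {"title": "D", "viewpoint": "centre", "relevance": 0.6},
]
print([i["title"] for i in diversify(feed, 3)])  # ['A', 'C', 'D']
```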
Many of our interactions in the twenty-first century - both good and bad - take place by means of institutions, technology, and artefacts. We inhabit a world of implements, instruments, devices, systems, gadgets, and infrastructures. Technology is not only something that we make, but is also something that in many ways makes us. The discipline of ethics must take this constitutive feature of institutions and technology into account; thus, ethics must in turn be embedded in our institutions and technology. The contributors to this book argue that the methodology of 'designing in ethics' - addressing and resolving the issues raised by technology through the use of appropriate technological design - is the way to achieve this integration. They apply their original methodology to a wide range of institutions and technologies, using case studies from the fields of healthcare, media and security. Their volume will be important for philosophical practitioners and theorists alike.
Although applications are being developed and have reached the market, nanopharmacy to date is generally still conceived of as an emerging technology. Its concept is ill-defined. Nanopharmacy can also be construed as a converging technology, which combines features of multiple technologies, ranging from nanotechnology to medicine and ICT. It is still debated whether its features give rise to new ethical issues or whether the issues associated with nanopharmacy are merely an extension of existing issues in the underlying fields. We argue here that, regardless of the alleged newness of the ethical issues involved, developments occasioned by technological advances affect the roles played by stakeholders in the field of nanopharmacy to such an extent that this calls for a different approach to responsible innovation in this field. Specific features associated with nanopharmacy itself, and features introduced by the associated converging technologies, bring about a shift in the roles of stakeholders that calls for a different approach to responsibility. We suggest that Value Sensitive Design is a suitable framework to involve stakeholders in addressing moral issues responsibly at an early stage of the development of new nanopharmaceuticals.
In the twenty-first century, the urgent problems the world is facing are increasingly related to vast and intricate ‘systems of systems’, which comprise both socio-technical systems and ecosystems. In order for engineers to respond adequately and responsibly to these problems, they cannot focus on only one technical or other aspect in isolation, but must adopt a wider, multidisciplinary perspective on these systems, including an ethical and social perspective. Engineering curricula should therefore focus on what we call ‘comprehensive engineering’. Comprehensive engineering implies ethical coherence, consilience of scientific disciplines, and cooperation between parties.
In this article, we report on eight grand challenges for value sensitive design, which were developed at a one-week workshop, Value Sensitive Design: Charting the Next Decade, Lorentz Center, Leiden, The Netherlands, November 14–18, 2016. A grand challenge is a substantial problem, opportunity, or question that motivates sustained research and design activity. The eight grand challenges are: Accounting for Power, Evaluating Value Sensitive Design, Framing and Prioritizing Values, Professional and Industry Appropriation, Tech Policy, Values and Human Emotions, Value Sensitive Design and Intelligent Algorithms, and Value Tensions. Each grand challenge consists of a discussion of its importance and a set of tractable key questions.
In this article, we introduce the Special Issue, Value Sensitive Design: Charting the Next Decade, which arose from a week-long workshop hosted by the Lorentz Center, Leiden, The Netherlands, November 14–18, 2016. Forty-one researchers and designers, ranging in seniority from doctoral students to full professors, from Australia, Europe, and North America, and representing a wide range of academic fields, participated in the workshop. The first article in the special issue puts forward eight grand challenges for value sensitive design to help guide and shape the field. It is followed by 16 articles consisting of value sensitive design nuggets—short pieces of writing on a new idea, method, challenge, application, or other concept that engages some aspect of value sensitive design. The nuggets are grouped into three clusters: theory, method, and applications. Taken together, the grand challenges and nuggets point the way forward for value sensitive design into the next decade and beyond.
This paper presents the first bibliometric mapping analysis of the field of computer and information ethics (C&IE). It provides a map of the relations between 400 key terms in the field. This term map can be used to get an overview of concepts and topics in the field and to identify relations between information and communication technology concepts on the one hand and ethical concepts on the other. To produce the term map, a data set of over a thousand articles published in leading journals and conference proceedings in the C&IE field was constructed. With the help of various computer algorithms, key terms were identified in the titles and abstracts of the articles and co-occurrence frequencies of these key terms were calculated. Based on the co-occurrence frequencies, the term map was constructed. This was done using a computer program called VOSviewer. The term map provides a visual representation of the C&IE field and, more specifically, of the organization of the field around three main concepts, namely privacy, ethics, and the Internet.
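The core bibliometric step described here can be sketched in a few lines. The snippet below is a minimal illustration of term co-occurrence counting on invented abstracts and an invented term list, not the authors' actual pipeline; counts of this kind are the raw input a tool such as VOSviewer lays out as a term map.

```python
# Minimal sketch of term co-occurrence counting for a term map.
# The abstracts and key-term list are invented examples.
from collections import Counter
from itertools import combinations

abstracts = [
    "privacy and surveillance on the internet",
    "the ethics of privacy in information technology",
    "internet governance, ethics and the digital divide",
]
key_terms = ["privacy", "ethics", "internet", "surveillance"]

cooccurrence = Counter()
for text in abstracts:
    present = sorted(t for t in key_terms if t in text.lower())
    for pair in combinations(present, 2):
        cooccurrence[pair] += 1

for (a, b), freq in cooccurrence.most_common():
    print(f"{a} -- {b}: {freq}")
```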
Information technology is widely used to fulfill societal goals such as safety and security. These application areas put ever-changing demands on the functionality of the technology. Designing technological appliances to be reconfigurable, thereby keeping them open to functionalities yet to be determined, may allow the technology to fulfill these changing demands in an efficient way. In this paper we present a first exploration of potential societal and moral issues of reconfigurable sensors developed for application in the safety and security domain, in the context of a large-scale R&D project in the Netherlands. We discuss the subtle distinction between the relevant notions of reconfigurability, function creep, and unrestricted or unforeseen technological affordances. We argue that the feature of reconfigurability makes context of use the central issue in the assessment of the societal and moral impact of the technology. It follows that the design of good policies for new application contexts has to be central in a value sensitive design approach to reconfigurable technology.
Ethical issues of information and communication technologies (ICTs) are important because they can have significant effects on human liberty, happiness, and people’s ability to lead a good life. They are also of functional interest because they can determine whether technologies are used and whether their positive potential can unfold. For these reasons, policy makers are interested in finding out what these issues are and how they can be addressed. The best way of creating ICT policy that is sensitive to ethical issues is to be proactive in addressing such issues at an early stage of the technology life cycle. The present paper uses this position as a starting point and discusses how knowledge of ethical aspects of emerging ICTs can be gained. It develops a methodology that goes beyond established futures methodologies to cater for the difficult nature of ethical issues. The authors outline how the description of emerging ICTs can be used for an ethical analysis.
The EDPS Ethics Advisory Group (EAG) has carried out its work against the backdrop of two significant social-political moments: a growing interest in ethical issues, both in the public and in the private spheres, and the imminent entry into force of the General Data Protection Regulation (GDPR) in May 2018. For some, this may nourish a perception that the work of the EAG represents a challenge to data protection professionals, particularly to lawyers in the field, as well as to companies struggling to adapt their processes and routines to the requirements of the GDPR. What is the purpose of a report on digital ethics, if the GDPR already provides all regulatory requirements to protect European citizens with regard to the processing of their personal data? Does the existence of this EAG mean that a new normative ethics of data protection will be expected to fill regulatory gaps in data protection law with more flexible, and thus less easily enforceable, ethical rules? Does the work of the EAG signal a weakening of the foundations of legal doctrine, such as the rule of law, the theory of justice, or the fundamental values supporting human rights, and a strengthening of a more cultural approach to data protection? Not at all. The reflections of the EAG contained in this report are not intended as the continuation of policy by other means. It neither supersedes nor supplements the law or the work of legal practitioners. Its aims and means are different. On the one hand, the report seeks to map and analyse current and future paradigm shifts which are characterised by a general shift from an analogue experience of human life to a digital one. On the other hand, and in light of this shift, it seeks to re-evaluate our understanding of the fundamental values most crucial to the well-being of people, those taken for granted in a data-driven society and those most at risk. The objective of this report is thus not to generate definitive answers, nor to articulate new norms for present and future digital societies, but to identify and describe the most crucial questions for the urgent conversation to come. This requires a conversation between legislators and data protection experts, but also society at large - because the issues identified in this report concern us all, not only as citizens but also as individuals. They concern us in our daily lives, whether at home or at work, and there is no place we could travel to where they would cease to concern us as members of the human species.
We argue that nanotechnology in the form of invisible tags, sensors, and Radio Frequency Identification (RFID) chips will give rise to privacy issues that differ in two ways from the traditional privacy issues of the last decades. One, they will not exclusively revolve around the idea of centralization of surveillance and concentration of power, as the metaphor of the Panopticon suggests, but will be about constant observation at decentralized levels. Two, privacy concerns may not exclusively be about constraining information flows but also about the design of materials and nano-artifacts such as chips and tags. We begin by presenting a framework for structuring the current debates on privacy, and then present our arguments.
Ethical issues of information and communication technologies (ICTs) are important because they can have significant effects on human liberty, happiness, and people’s ability to lead a good life. They are also of functional interest because they can determine whether technologies are used and whether their positive potential can unfold. For these reasons, policy makers are interested in finding out what these issues are and how they can be addressed. The best way of creating ICT policy that is sensitive to ethical issues would be to be proactive and address such issues at early stages of the technology life cycle. The present paper uses this position as a starting point and discusses how knowledge of ethical aspects of emerging ICTs can be gained. It develops a methodology that goes beyond established futures methodologies to cater for the difficult nature of ethical issues. The paper goes on to outline some of the preliminary findings of a European research project that has applied this method.
Our lives are increasingly intertwined with the digital realm, and with new technology, new ethical problems emerge. The academic field that addresses these problems—which we tentatively call ‘digital ethics’—can be an important intellectual resource for policy making and regulation. This is why it is important to understand how the new ethical challenges of a digital society are being met by academic research. We have undertaken a scientometric analysis to arrive at a better understanding of the nature, scope and dynamics of the field of digital ethics. Our approach in this paper shows how the field of digital ethics is distributed over various academic disciplines. By first having experts select a collection of keywords central to digital ethics, we have generated a dataset of articles discussing these issues. This approach allows us to generate a scientometric visualisation of the field of digital ethics without being constrained by any preconceived definitions of academic disciplines. We found, first of all, that the number of publications pertaining to digital ethics is increasing exponentially. We furthermore established that, whereas one might expect digital ethics to be a species of ethics, the various questions pertaining to digital ethics are predominantly being discussed in computer science, law and biomedical science. It is in these fields, more than in the independent field of ethics, that ethical discourse is being developed around concrete and often technical issues. Moreover, it appears that some important ethical values are very prominent in one field while being almost absent in others. We conclude that to get a thorough understanding of, and grip on, all the hard ethical questions of a digital society, ethicists, policy makers and legal scholars will need to familiarize themselves with the concrete and practical work that is being done across a range of different scientific fields to deal with these questions.
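As a rough illustration of the kind of tally underlying such a scientometric analysis (the bibliographic records and keyword set below are hypothetical, not the study's data), the sketch filters a corpus by expert-chosen keywords and counts matching publications per year.

```python
# Illustrative sketch only: filter a bibliographic corpus by expert-chosen
# keywords and tally matching publications per year. Records and keywords
# are invented.
from collections import Counter

keywords = {"privacy", "algorithmic fairness", "data protection"}

records = [  # invented bibliographic records
    {"year": 2014, "abstract": "A study of privacy in mobile apps"},
    {"year": 2016, "abstract": "Data protection under the GDPR"},
    {"year": 2016, "abstract": "Algorithmic fairness in credit scoring"},
    {"year": 2017, "abstract": "Machine translation benchmarks"},
]

def matches(record):
    text = record["abstract"].lower()
    return any(k in text for k in keywords)

per_year = Counter(r["year"] for r in records if matches(r))
for year in sorted(per_year):
    print(year, per_year[year])
```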
It is argued that Pettit’s conception of “contestatory democracy” is superior to deliberative, direct and epistemic democracy. The strong and weak points of these conceptions are discussed, drawing upon the work of, among others, Bruce Bimber. It is further argued that ‘contestation’ and ‘information’ are highly relevant notions in thinking about just, viable and sustainable design for e-democracy.
In this panel, we explore the future of value sensitive design (VSD). The stakes are high. Many in the public and private sectors and in civil society are gradually realizing that taking our values seriously implies that we have to ensure that values effectively inform the design of technology, which, in turn, shapes people’s lives. Value sensitive design offers a highly developed set of theory, tools, and methods for doing so systematically.
In this paper, we consider the meaning, roles, and uses of trust in the economic and public domain, focusing on the task of designing systems for trust in information technology. We analyze this task by means of a survey of what trust means in the economic and public domain, using the model proposed by Lewicki and Bunker, and using the emerging paradigm of value-sensitive design. We explore the difficulties developers face when designing information technology for trust and show how our analysis, in conjunction with existing engineering design methods, provides means to address these difficulties. Our main case concerns a concrete problem in the economic domain, namely the transfer of control from customs agencies to companies. Control of individual items is increasingly untenable and is being replaced by control at the level of companies, aimed at determining whether companies can be trusted to be in control of their business and to be in compliance with applicable regulations. This transfer sets the task for companies to establish this trust by means of information technology systems. We argue that this trust can be achieved by taking into account philosophical analyses of trust and by including both parties in the trust relationship as clients for whom the information technology systems are to be designed.
The impact of the Internet on democracy is a widely discussed subject. Many writers view the Internet, potentially at least, as a boon to democracy and democratic practices. According to one popular theme, both e-mail and web pages give ordinary people powers of communication that have hitherto been the preserve of the relatively wealthy (Graham 1999, p. 79). So the Internet can be expected to close the influence gap between wealthy citizens and ordinary citizens, a gap that is a weakness of many procedural democracies.
In discussions about justice, development, well-being and equality, the capability approach (CA) founded by economist Amartya Sen and philosopher Martha Nussbaum attaches central importance to individual human capabilities. These are the effective freedoms or real opportunities of people to achieve valuable ‘beings and doings’ (also called ‘functionings’ by capability theorists). Resources—including technical artifacts—may contribute to the expansion of one’s capabilities, but there may also be all sorts of ‘conversion factors’ in place that prevent this. The approach highlights the ‘multidimensionality’ of well-being and sees people as active agents shaping their own lives. In 1998 Sen won the Nobel Prize in economics for his work, which has deeply influenced the United Nations Development Programme (UNDP). In the field of development studies the CA has indeed gained popularity, but this is not the only area of application.
This paper introduces the design principle of legibility as a means to examine the epistemic and ethical conditions of sensing technologies. Emerging sensing technologies create new possibilities regarding what to measure, as well as how to analyze, interpret, and communicate said measurements. In doing so, they create ethical challenges for designers to navigate, specifically how the interpretation and communication of complex data affect moral values such as autonomy. Contemporary sensing technologies require layers of mediation and exposition to render what they sense as intelligible and constructive to the end user, which is a value-laden design act. Legibility is positioned as both an evaluative lens and a design criterion, making it complementary to existing frameworks such as value sensitive design. To concretize the notion of legibility, and to understand how it could be utilized in both evaluative and anticipatory contexts, the case study of a vest embedded with sensors and an accompanying app for patients with chronic obstructive pulmonary disease is analyzed.
This Open Access book shows how value sensitive design (VSD), responsible innovation, and comprehensive engineering can guide the rapid development of technological responses to the COVID-19 crisis. Responding to the ethical challenges of data-driven technologies and other tools requires thinking about values in the context of a pandemic as well as in a post-COVID world. Instilling values must be prioritized from the beginning, not only in the emergency response to the pandemic, but in how to proceed with new societal precedents materializing, new norms of health surveillance, and new public health requirements. The contributors, with expertise in VSD, bridge the gap between ethical acceptability and social acceptance. By addressing ethical acceptability and societal acceptance together, VSD guides COVID technologies in a way that strengthens their ability to fight the virus and outlines pathways for the resolution of moral dilemmas. This volume provides diachronic reflections on the crisis response to address long-term moral consequences in light of the post-pandemic future. Both contact-tracing apps and immunity passports must work in a multi-system environment and will be required to succeed alongside institutions, incentive structures, regulatory bodies, and current legislation. This text appeals to students, researchers and, importantly, professionals in the field.
The development of ever smaller integrated circuits at the sub-micron and nanoscale—in accordance with Moore’s Law—drives the production of very small tags, smart cards, smart labels and sensors. Nanoelectronics and submicron technology support surveillance technology that is practically invisible. I argue that one of the most urgent and immediate concerns associated with nanotechnology is privacy. Computing in the twenty-first century will not only be pervasive and ubiquitous, but also inconspicuous. If these features are not counteracted in design, they will facilitate ubiquitous surveillance practices that are widely available, cheap, and intrusive. RFID technology is an instructive example of what nanotechnology has in store for privacy.