
A framework for the ethical impact assessment of information technology


Abstract

This paper proposes a framework for an ethical impact assessment which can be performed in regard to any policy, service, project or programme involving information technology. The framework is structured on the four principles posited by Beauchamp and Childress, together with a separate section on privacy and data protection. The framework identifies key social values and ethical issues and provides brief explanatory contextual information, followed by a set of questions aimed at the technology developer or policy-maker to facilitate consideration, in consultation with stakeholders, of the ethical issues that may arise in their undertaking. In addition, the framework includes a set of ethical tools and procedural practices which can be employed as part of the ethical impact assessment. Although the framework has been developed within a European context, it could be applied equally well beyond European borders.
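
To give a concrete sense of the framework's shape, the following sketch represents its skeleton as data. This is an editorial illustration in Python, not part of the paper: the five section headings follow the paper's description (Beauchamp and Childress's four principles plus a separate section on privacy and data protection), while the sample issue, context and questions are hypothetical placeholders.

```python
# A minimal sketch of the framework's skeleton, assuming a simple
# dictionary representation. The five sections follow the paper's
# description; the sample issue, context and questions below are
# hypothetical placeholders, not taken from the paper.

sections = [
    "Respect for autonomy",
    "Nonmaleficence",
    "Beneficence",
    "Justice",
    "Privacy and data protection",
]

# Each section pairs a value or issue and brief context with questions
# for the technology developer or policy-maker, to be worked through
# in consultation with stakeholders.
example_entry = {
    "section": "Respect for autonomy",
    "issue": "Informed consent",
    "context": "Users should understand what the system does with their data.",
    "questions": [
        "Is consent obtained before personal data are collected?",
        "Can users withdraw without losing access to essential services?",
    ],
}

print("Sections:", ", ".join(sections))
for question in example_entry["questions"]:
    print(f"{example_entry['section']}: {question}")
```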

Notes

  1. Helft (2010).

  2. Hofmann refers specifically to health technology, but his observation may well be applicable to any technology. Hofmann (2005, p. 288).

  3. Moor (1985).

  4. Nissenbaum (2004).

  5. http://cordis.europa.eu/fp7/ethics_en.html#ethics_cl

  6. Treasury Board of Canada Secretariat 2002.

  7. [UK] Information Commissioner’s Office (ICO) 2009.

  8. Marx (1998). Van Gorp also proposed a list of questions “that helps researchers doing research in technological fields to identify ethical aspects of their research.” Van Gorp (2009).

  9. Dekker says ethical reflection in technology assessment requires an engagement of experts from different disciplines for two reasons: “Firstly, the technical, economical, legal and social aspects are deeply cross-correlated with the ethical reflection. And secondly, participating in such interdisciplinary discussions enables an ethical reflection which keeps in touch with the real world.” See Dekker (2004).

  10. http://eurlex.europa.eu/JOHtml.do?uri=OJ:C:2007:306:SOM:EN:HTML

  11. http://www.europarl.europa.eu/charter/pdf/text_en.pdf

  12. European Commission 2007.

  13. For a state-of-the-art review, see Renn (2008).

  14. Technology assessment as an instrument for counselling political decision-makers was given a major impetus with the establishment of the Office of Technology Assessment (OTA) by the US Congress in 1972. Similar organisations were subsequently established in Europe, both at the Member State level (e.g., the Danish Board of Technology) and at the European level (e.g., the European Parliament’s office of Science and Technology Options Assessment (STOA)). STOA is a member of the European Parliamentary Technology Assessment Network (EPTA). Other EPTA members are the national parliamentary technology assessment bodies of Denmark, Finland, France, Germany, Greece, Italy, the Netherlands and the United Kingdom.

  15. For a good overview of developments in this area, see Kirkpatrick and Parker (2007).

  16. Skorupinski and Ott (2002, p. 97).

  17. Skorupinski and Ott, p. 98.

  18. Beekman et al. (2006, p. 13).

  19. Palm and Hansson (2006). An extensive set of criteria, some of which are ethical, for assessing emerging technologies can be found in Kuzma et al. (2008). Kuzma et al. also use a question approach for assessing emerging technologies.

  20. Sollie and Düwell (2009).

  21. Sollie and Düwell, p. 4.

  22. Verbeek (2009, pp. 67, 71).

  23. Hofmann, p. 289. He observes (p. 288) that there appears to be broad agreement among scholars that technology is value-laden.

  24. Orlikowski and Iacono (2001, p. 131).

  25. Orlikowski and Iacono, p. 130.

  26. Orlikowski and Iacono, p. 131.

  27. Orlikowski and Iacono, p. 133.

  28. Beauchamp and Childress (2001).

  29. Beauchamp and Childress, p. 58.

  30. www.un.org/Overview/rights.html

  31. Boddy (2004, p. 39). LOCOMOTION was a project funded by the European Commission’s Fifth Framework Programme (FP5).

  32. Boddy, p. 40.

  33. Boddy, p. 48.

  34. For ethical considerations regarding implants, see the European Group on Ethics in Science and New Technologies (EGE) 2005.

  35. Goldberg et al. (2001).

  36. Beauchamp and Childress 2001, p. 113 and p. 115.

  37. European Council resolution on e-Inclusion 2001.

  38. Palm and Hansson, p. 552.

  39. Social sorting is a process of classifying people and populations according to varying criteria, to determine who should be targeted for special treatment, suspicion, eligibility, inclusion, access and so on. See Lyon (2003, p. 20). An illustrative sketch follows these notes.

  40. For more on profiling and social sorting, see Hildebrandt and Gutwirth (2008) as well as Lyon, op. cit.

  41. Beauchamp and Childress, p. 165.

  42. European Parliament and Council 2002.

  43. On 28 January 2009, the European Commission announced its aim to achieve 100 per cent high-speed Internet coverage for all citizens by 2010. See European Commission 2009. http://europa.eu/rapid/pressReleasesAction.do?reference=MEMO/09/35

  44. Johnson (2009).

  45. European Commission 2007.

  46. See the statement by Oracle: “Oracle Welcomes New EU Policy on e-Inclusion.” http://www.oracle.com/global/eu/public-policy/fs/new-e-inclusion-policy.html

  47. European Commission 2007, p. 4.

  48. Flanagan et al. (2008).

  49. Flanagan et al., p. 335.

  50. Palm and Hansson, p. 553. See also Anke van Gorp, who includes sustainability in this sense in his checklist of ethical issues. Van Gorp, op. cit., p. 41.

  51. Beauchamp and Childress 2001, p. 226.

  52. Maiese (2003).

  53. Marx, p. 174.

  54. Marx, p. 174.

  55. http://conventions.coe.int/treaty/en/Treaties/Html/005.htm

  56. Brey (2000). Earlier, Moor commented that “From the point of view of ethical theory, privacy is a curious value. On the one hand, it seems to be something of very great importance and something vital to defend, and, on the other hand, privacy seems to be a matter of individual preference, culturally relative, and difficult to justify in general.” He goes on to argue that privacy has both instrumental value (that which is good because it leads to something else which is good) and intrinsic value (that which is good in itself). Moor (1997).

  57. Clarke (2007).

  58. The Guidelines don’t specify or define what “where appropriate” means.

  59. European Parliament and Council 2006.

  60. Marx, p. 174.

  61. Marx, p. 174.

  62. Vedder and Custers (2009, p. 25).

  63. Brey, op. cit., p. 126.

  64. International Organization for Standardization 1999.

  65. Article 29 Data Protection Working Party 1997. http://ec.europa.eu/justice_home/fsj/privacy/workinggroup/wpdocs/1997_en.htm

  66. Article 29 Working Party 2008.

  67. Beekman et al. (2006).

  68. Beekman et al., p. 14.

  69. Beekman and Brom (2007, pp. 3–4).

  70. Beekman et al., p. 21.

  71. Beekman et al., p. 6. Although Rowe and Frewer do not focus specifically on ethical tools, they do provide a long list of different mechanisms for engaging stakeholders, including the public, some of which could be used to facilitate an ethical impact assessment. See Rowe and Frewer (2005). Also of interest in this regard are Essays 9 and 10 in Chap. 8 of Renn, op. cit., pp. 273–352. Renn says, “A combination of analytic and deliberative instruments (or stakeholders and the public) is instrumental in reducing complexity, necessary for handling uncertainty and mandatory for dealing with ambiguity. Uncertainty and ambiguity cannot be resolved by expertise only” (p. 350). The two essays are useful guidance for ethical impact assessment as well as risk governance.

  72. Beekman and Brom, p. 6.

  73. Beekman et al., p. 46.

  74. Beekman et al., p. 20.

  75. ENISA is the acronym for the European Network and Information Security Agency. www.enisa.europa.eu.

  76. van Gorp, op. cit.

  77. See Beekman et al., p. 21, pp. 28–29. The ethical matrix concept was developed by Ben Mepham. See Mepham (2005). A sketch of an ethical matrix follows these notes.

  78. Skorupinski and Ott (2002, p. 119).

  79. Stern and Fineberg (1996).

  80. Sollie (2007, p. 302). Moor (2005), op. cit., p. 118, also supports better collaboration among ethicists, scientists, social scientists and technologists.

  81. Palm and Hansson, p. 547.

  82. US National Research Council 1989, p. 9.

  83. Stern and Fineberg, pp. 23–24.

  84. Moor (2005). In his paper, Moor proposes the following hypothesis, which he calls “Moor’s Law: As technological revolutions increase their social impact, ethical problems increase.”

  85. Palm and Hansson, pp. 550–551.

  86. Beekman et al., p. 26.

  87. Renn, op. cit.

  88. Sollie (2007), op. cit., p. 295.

  89. European Commission 2000.

  90. Verbeek, p. 72, uses this example.

  91. Vedder and Custers, p. 30.

  92. Ibid., p. 32.

  93. von Schomberg (2007).

  94. Article 2 of the mandate given to the EGE states: “The task of the EGE shall be to advise the Commission on ethical questions relating to sciences and new technologies, either at the request of the Commission or on its own initiative. The Parliament and the Council may draw the Commission’s attention to questions which they consider to be of major ethical importance. The Commission shall, when seeking the opinion of the EGE, set a time limit within which an opinion shall be given.” http://ec.europa.eu/european_group_ethics/mandate/index_en.htm

  95. http://ec.europa.eu/european_group_ethics/link/index_en.htm#4

  96. Palm and Hansson, p. 550.

  97. Skorupinski and Ott (2002, pp. 117–120).

  98. Verbeek indirectly offers at least two reasons supporting an ethical impact assessment. “Two forms of designer responsibility can be distinguished here. First, designers can anticipate the impact, side-effects and mediating roles of the technology they are designing. On the basis of such anticipations, they could adapt the original design, or refrain from the design at all. Second, designers can also take a more radical step and deliberately design technologies in terms of their mediating roles. In that case, they explicitly design behavior-influencing or ‘moralizing’ technologies: designers then inscribe desirable mediating effects in technologies.” Verbeek, p. 70.

  99. Verbeek, op. cit.

  100. Palm and Hansson, op. cit., pp. 547–548, p. 550. Moor (2005, p. 118), makes a similar point: “We can foresee only so far into the future… We cannot anticipate every ethical issue that will arise from the developing technology… our ethical understanding of developing technology will never be complete. Nevertheless, we can do much to unpack the potential consequences of new technology. We have to do as much as we can while realizing applied ethics is a dynamic enterprise that continually requires reassessment of the situation.” See also Brey, op. cit.
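
As a concrete illustration of the social sorting described in note 39, the minimal sketch below (in Python) shows how automated classification rules might sort records into treatment categories. All field names, thresholds and categories are hypothetical, invented for illustration; they are not drawn from Lyon or from the paper.

```python
# Illustrative sketch of automated "social sorting": records are
# classified into treatment categories by rules that embed value
# judgements. Every field name, threshold and category here is a
# hypothetical placeholder.

from dataclasses import dataclass


@dataclass
class Record:
    age: int
    postcode: str
    credit_score: int


def sort_record(r: Record) -> str:
    """Assign a treatment category to a record."""
    if r.credit_score < 500:
        return "targeted for extra scrutiny"
    if r.postcode.startswith("EC1"):
        return "eligible for premium offers"
    return "standard treatment"


for person in [Record(34, "EC1A", 710), Record(52, "M4", 480)]:
    print(sort_record(person))
```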
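
Similarly, the ethical matrix mentioned in note 77 can be pictured as a table of stakeholder groups against ethical principles. The sketch below assumes a generic three-principle layout in the spirit of Mepham's scheme; the stakeholder groups and cell entries are hypothetical placeholders.

```python
# Sketch of an ethical matrix in the spirit of Mepham's scheme: rows
# are stakeholder groups, columns are ethical principles, and each
# cell records how the technology affects that group under that
# principle. Groups and cell entries are hypothetical placeholders.

principles = ["Wellbeing", "Autonomy", "Fairness"]
matrix = {
    "Users":     ["improved services", "informed consent", "equal access"],
    "Non-users": ["indirect effects", "freedom from profiling", "no exclusion"],
    "Society":   ["public benefit", "democratic oversight", "fair distribution"],
}

# Print the matrix as a plain-text table.
width = 26
print("".ljust(width) + "".join(p.ljust(width) for p in principles))
for group, cells in matrix.items():
    print(group.ljust(width) + "".join(c.ljust(width) for c in cells))
```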

References

  • Article 29 Data Protection Working Party, Recommendation 3/97: Anonymity on the Internet (WP 6), Adopted on 3 December 1997. http://ec.europa.eu/justice_home/fsj/privacy/.

  • Article 29 Working Party, Opinion on data protection issues related to search engines, 00737/EN, WP 148, Adopted on 4 April 2008. http://ec.europa.eu/justice_home/fsj/privacy/workinggroup/wpdocs/2008_en.htm.

  • Beauchamp, T. L., & Childress, J. F. (2001). Principles of biomedical ethics (5th ed.). New York: Oxford University Press.

  • Beekman, V., et al. (2006). Ethical bio-technology assessment tools for agriculture and food production, Final Report of the Ethical Bio-TA Tools project, LEI, The Hague, February. http://www.ethicaltools.info.

  • Beekman, V., & Brom, F. W. A. (2007). Ethical tools to support systematic public deliberations about the ethical aspects of agricultural biotechnologies. Journal of Agricultural and Environmental Ethics, 20(1), 3–12.

  • Boddy, K. (2004). LOCOMOTION Ethical Study Report, Deliverable D 3.3, Final Version, September. http://cordis.europa.eu/search/index.cfm?fuseaction=proj.document&PJ_LANG=EN&PJ_RCN=6099060&pid=37&q=6AF6FCCDA9FE6C99B48B10861AFEBDDA&type=sim.

  • Brey, P. (2000). Method in computer ethics: Towards a multi-level interdisciplinary approach. Ethics and Information Technology, 2(2), 125–129.

  • Clarke, R. (2007). Introduction to dataveillance and information privacy, and definitions of terms, Aug. http://www.rogerclarke.com/DV/Intro.html.

  • Dekker, M. (2004). The role of ethics in interdisciplinary technology assessment. Poiesis & Praxis, 2(2–3), 139–156.

  • European Commission, Ageing well in the Information Society, Action Plan on Information and Communication Technologies and Ageing, An i2010 Initiative, Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, COM (2007) 332 final, Brussels, 14 June 2007.

  • European Commission, Communication on the precautionary principle, COM (2000)1, Brussels, 2 Feb 2000.

  • European Commission, Commission earmarks €1bn for investment in broadband—Frequently Asked Questions, Press release, MEMO/09/35, Brussels, 28 January 2009. http://europa.eu/rapid/pressReleasesAction.do?reference=MEMO/09/35.

  • European Commission, The European Research Area: New Perspectives, Green Paper, COM(2007) 161 final, Brussels, 4 Apr 2007.

  • European Commission, European i2010 initiative on e-Inclusion: “To be part of the information society”, Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, COM (2007) 694 final, Brussels, 8 Nov 2007.

  • European Council resolution on e-Inclusion, exploiting the opportunities of the information society for social inclusion, 2001/C 292/02, OJ 18 Oct 2001.

  • European Group on Ethics in Science and New Technologies (EGE), Opinion No. 20 on Ethical Aspects of ICT Implants in the Human Body, Adopted on 16 March 2005.

  • European Parliament and Council, Directive 2001/20/EC of 4 April 2001 on the approximation of the laws, regulations and administrative provisions of the Member States relating to the implementation of good clinical practice in the conduct of clinical trials on medicinal products for human use, OJ L 121/34, Brussels, 1 May 2001.

  • European Parliament and Council, Directive 2002/22/EC of 7 March 2002 on universal service and users’ rights relating to electronic communications networks and services (Universal Service Directive), Official Journal L 108 of 24 April 2002.

  • European Parliament and Council, Directive 2006/24/EC on the retention of data generated or processed in connection with the provision of publicly available electronic communications services or of public communications networks and amending Directive 2002/58/EC, 15 March 2006.

  • European Parliament and Council, Directive 95/46/EC of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, OJ L281/31 of 23 Nov 1995.

  • Flanagan, M., Howe, D. C., & Nissenbaum, H. (2008). Embodying values in technology: theory and practice. In J. van den Hoven & J. Weckert (Eds.), Information technology and moral philosophy (pp. 322–353). Cambridge: Cambridge University Press.

  • Goldberg, I., Hill, A., & Shostack, A. (2001). Trust, ethics, and privacy. Boston University Law Review, 81, 101–116.

  • Helft, M. (2010). Critics say Google invades privacy with new service. The New York Times, 12 Feb. http://www.nytimes.com/2010/02/13/technology/internet/13google.html.

  • Hildebrandt, M., & Gutwirth, S. (2008). Profiling the European Citizen. Dordrecht: Springer.

  • Hofmann, B. (2005). On value-judgements and ethics in health technology assessment. Poiesis & Praxis, 3(4), 277–295.

  • International Organization for Standardization, ISO/IEC 15408, Information technology—Security techniques—Evaluation criteria for IT security, First edition, International Organization for Standardization, Geneva, 1999.

  • Johnson, B. (2009). Finland makes broadband access a legal right. The Guardian, 14 Oct. http://www.guardian.co.uk/technology/2009/oct/14/finland-broadband.

  • Kirkpatrick, C., & Parker, D. (Eds.). (2007). Regulatory impact assessment: Towards better regulation? Cheltenham, UK: Edward Elgar.

  • Kuzma, J., et al. (2008). An integrated approach to oversight assessment for emerging technologies. Risk Analysis, 28(5), 1197–1219.

  • Lyon, D. (2003). Surveillance as social sorting: privacy, risk, and digital discrimination. London: Routledge.

  • Maiese, M. (2003). Principles of Justice and Fairness, Beyond Intractability.org, July. http://www.beyondintractability.org/essay/principles_of_justice/.

  • Marx, G. T. (1998). Ethics for the new surveillance. The Information Society, 14, 171–185.

  • Mepham, T. B. (2005). Bioethics: An introduction for the biosciences. Oxford: Oxford University Press.

  • Moor, J. H. (1985). What is Computer Ethics? In T. W. Bynum (Ed.), Computers & Ethics (pp. 266–275). Oxford: Blackwell.

  • Moor, J. H. (1997). Towards a theory of privacy in the information age. Computers and Society, 27, 27–32.

  • Moor, J. H. (2005). Why we need better ethics for emerging technologies. Ethics and Information Technology, 7(3), 111–119.

  • Nissenbaum, H. (2004). Privacy as contextual integrity. Washington Law Review, 79(1), 101–139.

  • Organisation for Economic Co-operation and Development (OECD), Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, Paris, 23 Sept 1980. http://www.oecd.org/document/18/0,3343,en_2649_34255_1815186_1_1_1_1,00.html.

  • Orlikowski, W. J., & Iacono, C. S. (2001). Research commentary: Desperately seeking the “IT” in IT research—a call to theorizing the IT artifact. Information Systems Research, 12(2), 121–134.

  • Palm, E., & Hansson, S. O. (2006). The case for ethical technology assessment (eTA). Technological Forecasting & Social Change, 73, 543–558.

  • Renn, O. (2008). Risk governance: coping with uncertainty in a complex world. London: Earthscan.

  • Rowe, G., & Frewer, L. J. (2005). A Typology of Public Engagement Mechanisms. Science, Technology & Human Values, 30(2), 251–290. http://sth.sagepub.com/cgi/content/abstract/30/2/251.

  • Skorupinski, B., & Ott, K. (2002). Technology assessment and ethics. Poiesis & Praxis, 1, 95–122.

  • Sollie, P. (2007). Ethics, technology development and uncertainty: an outline for any future ethics of technology. Journal of Information Communications & Ethics in Society, 5(4), 293–306.

  • Sollie, P., & Düwell, M. (2009). Evaluating new technologies: Methodological problems for the ethical assessment of technology developments. Dordrecht: Springer.

  • Stern, P. C., & Fineberg, H. V. (Eds.). (1996). Understanding risk: Informing decisions in a democratic society. Washington, DC: Committee on Risk Characterization, National Research Council, National Academy Press.

  • Treasury Board of Canada Secretariat, Privacy Impact Assessment Guidelines: A Framework to Manage Privacy Risks, Ottawa, 31 Aug 2002.

  • UK Information Commissioner’s Office (ICO), Privacy Impact Assessment Handbook, Version 2.0, June 2009. http://www.ico.gov.uk/for_organisations/topic_specific_guides/pia_handbook.aspx.

  • US National Research Council, Committee on Risk Perception and Communications, Improving Risk Communication, National Academy Press, Washington, D.C., 1989. http://www.nap.edu/openbook.php?record_id=1189&page=R1.

  • Van Gorp, A. (2009). Ethics in and during technological research; An addition to IT ethics and science ethics. In P. Sollie & M. Düwell (Eds.), Evaluating new technologies (pp. 35–50). Dordrecht: Springer.

  • Vedder, A., & Custers, B. (2009). Whose responsibility is it anyway? Dealing with the consequences of new technologies. In P. Sollie & M. Düwell (Eds.), Evaluating new technologies. Dordrecht: Springer.

  • Verbeek, P.-P. (2009). The moral relevance of technological artifacts. In P. Sollie & M. Düwell (Eds.), Evaluating new technologies: methodological problems for the ethical assessment of technology developments (pp. 63–79). Dordrecht: Springer.

  • von Schomberg, R. (2007). From the ethics of technology towards an ethics of knowledge policy & knowledge assessment. Working document from the European Commission Services, Jan.

Acknowledgments

The author acknowledges with thanks the thoughtful and detailed comments of the three anonymous reviewers as well as those of Guido van Steendam, professor of ethics at the University of Leuven, which have helped improve this paper. This paper is based in part on work undertaken by the author in two projects funded under the European Commission’s Seventh Framework Programme: SENIOR (Social Ethical and Privacy Needs in ICT for older People: a Dialogue Roadmap, grant agreement no. 216820) and PRESCIENT (Privacy and emerging fields of science and technology: Towards a common framework for privacy and ethical assessment, grant agreement no. 244779). The views in this paper are those of the author alone and are in no way intended to reflect those of the European Commission.

Author information

Correspondence to David Wright.


Cite this article

Wright, D. A framework for the ethical impact assessment of information technology. Ethics Inf Technol 13, 199–226 (2011). https://doi.org/10.1007/s10676-010-9242-6

