Abstract
Emotions play a significant role in human relations, in decision-making, and in the motivation to act on those decisions. There are ongoing attempts to use artificial intelligence (AI) to read human emotions and to predict the behavior or actions that may follow them. However, a person’s emotions cannot easily be identified, measured, or evaluated by others, including automated machines and the algorithms behind AI. The ethics of emotional AI remains under-researched, and this study examined emotional variables, as well as perceptions of emotional AI, in two large random groups of college students at an international university in Japan with strong representation of Japanese, Indonesian, Korean, Chinese, Thai, Vietnamese, and other Asian nationalities. Surveys comprising multiple closed-ended questions and an open-ended essay question on emotional AI were administered for quantitative and qualitative analysis, respectively. The results demonstrate how ethically questionable conclusions may be reached through affective computing, by searching for correlations among a variety of factors in the collected data to classify individuals into categories, thereby aggravating bias and discrimination. Nevertheless, the qualitative analysis of the students’ essays shows a rather optimistic view of emotional AI, which underscores the need to raise awareness of the ethical pitfalls of AI technologies in the complex field of human emotions.
Funding
This study received support from the project “Emotional AI in Cities: Cross Cultural Lessons from UK and Japan on Designing for an Ethical Life,” funded by the JST-UKRI Joint Call on Artificial Intelligence and Society (2019).
Ethics declarations
Ethical Approval
The research methodology was presented, with a detailed explanation, to the university’s ethics committee and received approval.
Consent to Participate
All students were free to participate in either the survey or the essay contest and could withdraw from the study at any time. The study was conducted anonymously.
Consent for Publication
The author and the students who anonymously contributed to the study consent to publication of the research results, given the anonymity of the participants and the respect accorded to their privacy.
Conflict of Interest
The authors declare no competing interests.
About this article
Cite this article
Ghotbi, N. The Ethics of Emotional Artificial Intelligence: A Mixed Method Analysis. ABR 15, 417–430 (2023). https://doi.org/10.1007/s41649-022-00237-y