No Need for Alarm: A Critical Analysis of Greene’s Dual-Process Theory of Moral Decision-Making

  • Original Paper
  • Published in: Neuroethics

Abstract

Joshua Greene and his colleagues have proposed a dual-process theory of moral decision-making to account for the effects of emotional responses on our judgments about moral dilemmas that ask us to contemplate causing direct personal harm. Early formulations of the theory contrast emotional and cognitive decision-making, saying that each is the product of a separable neural system. Later formulations emphasize that emotions are also involved in cognitive processing. I argue that, given the acknowledgement that emotions inform cognitive decision-making, a single-process theory can explain all of the data that have been cited as evidence for Greene’s theory. The emotional response to the thought of causing harm may differ in degree, but not in kind, from other emotions influencing moral decision-making.

Notes

  1. They also used a set of nonmoral dilemmas (e.g. scenarios that asked participants to choose whether to take the bus or the train, given prevailing time constraints). Patterns of neural activity were similar for nonmoral and for impersonal moral dilemmas.

  2. The cognition-related areas found to be more active during contemplation of “impersonal” moral dilemmas were the right middle frontal gyrus (Brodmann area [BA] 46) and bilateral parietal areas (BA 7/40). The emotion-related areas active during “personal” moral dilemmas were the medial prefrontal cortex (BA 9 and 10), the posterior cingulate cortex (BA 31), and the bilateral angular gyrus (BA 39). In their second paper, some areas of the posterior cingulate (BA 31) were more active during contemplation of personal moral dilemmas, while others (BA 23, 31) were more active when impersonal moral dilemmas were presented.

  3. Greene et al. originally interpret these reaction time results as further evidence for the operation of two distinct processes, but later retract this claim in response to a reanalysis of their data by McGuire et al. [8]. (See [9] for Greene’s response regarding the implications of the reanalysis; I discuss this exchange further later in this paper.) Berker [10] also provides important criticism of the reaction time data.

  4. Greene et al. refer to “cognitive” versus emotional representations and “cognitive” versus emotional processes. They don’t elaborate on this terminology but, roughly, it seems that “cognitive” processing is carried out over “cognitive” representations (with their attached affective states), while emotional processing occurs over intrinsically emotional representations.

  5. The idea that the emotion system functions like an alarm bell is first introduced in [2]. There, Greene also discusses the idea that “cognitive” representations are inherently neutral and are, or perhaps must be, contingently attached to emotional representations in order to exert any influence on behavior.

  6. Reflecting the main interests of his paper, Kahane characterizes the two versions of the theory as making claims about deontological versus utilitarian judgment, but I will use the distinction without reference to Greene’s conclusions about the relationship between his dual-process theory and normative ethical theories.

  7. Beyond the dual-process theory, the opposition between cognition and emotion has also been criticized. For a recent overview of this issue, see [14].

  8. Whether a dilemma counted as easy or difficult for a particular participant depended on the time they took to judge whether the action was morally appropriate or inappropriate. Greene et al. assumed that a longer reaction time indicated a more difficult decision. The infanticide dilemma was easy for the majority of participants, while the crying baby dilemma was difficult for most people. It should also be noted that there was less agreement among the responses to difficult than to easy dilemmas.

  9. See also Sauer [15] on this point. Cordelia Fine also notes that one neuroscientist of her acquaintance refers to the anterior cingulate as the “on button” because it is active in so many kinds of situations [16, p. 152].

  10. In fact, there is another important difference between this study and that of Koenigs et al.: Ciaramelli et al. used a random selection of dilemmas from Greene’s set, but did not distinguish between high conflict and low conflict dilemmas in their analysis. (Thank you to an anonymous reviewer of this paper for pointing out this difference.) Recall that Greene’s original set of personal moral dilemmas included “low conflict” ones in which participants were asked whether they would cause personal harm in scenarios in which that harm did not prevent greater harm to others (e.g. abandoning a baby you did not want to care for). Since Koenigs et al. found statistically significant differences between the groups only with regard to high conflict dilemmas, it may be that Ciaramelli et al. did not find a significant difference between the groups because they analyzed all of the (high and low conflict) personal moral dilemmas together.

  11. It is easy to see why Cushman et al. were misled, given the way that the study results were reported. Despite failing to find a statistically significant difference in the responses each group gave to personal moral dilemmas, Ciaramelli et al. still conducted post hoc analyses “for completeness.” On the basis of these post hoc results, they concluded that “patients were faster and more inclined than normal controls to authorize moral violations in personal moral dilemmas [emphasis mine]” [20, p. 88]. Actually, their data only demonstrate a statistically significant difference in the speed of responses.

  12. This observation may be difficult to reconcile with the suggestion, above, that these patients retain knowledge of social norms.

  13. Damasio explains his choice of terminology: “Because the feeling is about the body, I gave the phenomena the technical term somatic state…and because it ‘marks’ an image, I called it a marker…I use the term somatic in the most general sense (that which pertains to the body) and I include both visceral and nonvisceral sensations when I refer to somatic markers” [22, p. 173]. Damasio’s proposed mechanism for attaching emotional significance to possible outcomes is therefore based on a general ability to read bodily reactions, whereas Greene’s emotional process is explained in terms of a dedicated neural system.

  14. Damasio’s use of the phrase “alarm signal” should not be confused with Greene’s “alarm bell” terminology. Somatic markers are not, as the next paragraph will show, responses to very specific problems like the thought of causing personal harm; they are attached to all sorts of cognitive representations.

  15. I will return to this point later in the paper.

  16. Bartels follows Greene in identifying emotional/intuitive judgments with deontology and deliberate “cognitive” judgments with utilitarianism. Because I am not concerned here with the extension of Greene’s dual-process theory to normative moral theories, I will not consider this aspect of Bartels’s paper.

  17. Oddly, in a later paper these authors seem much more sympathetic to Greene’s theory, though they do not address the reason for this change [34]. The primary conclusion of this second paper is that the distinction between personal and impersonal moral dilemmas is, contra McGuire et al. [8], reliable. Since Greene’s own discussions of the dual-process theory have moved away from this distinction, I will not discuss the second paper by Moore et al. any further.

  18. Evolutionary psychology is itself controversial. (For criticisms, see [40] and [41], among others.) I will not address this controversy here, but will simply examine whether Greene’s hypothesized module meets the criteria for modularity that have been set out by evolutionary psychologists themselves.

  19. Greene’s suggestion that the system responds to perceived unfairness comes from work by Sanfey et al. [45] that shows that perceived unfairness is associated with activation in brain areas similar to those activated by the footbridge problem. See Crockett et al. [30] for an alternative explanation of the emotional response to unfairness.

  20. Greene describes the emotional response as modular in the context of his “modular myopia” hypothesis. This hypothesis is one of the major new theoretical developments presented in the book. Briefly, the “myopia” of the emotional response module occurs because it is not able to see harmful side effects of contemplated actions (thus explaining why the switch dilemma does not trigger a strong emotional response). Because this new development does not change the dual-process theory itself, I do not consider it further in this paper.

  21. Fodor draws a distinction between informational encapsulation (other cognitive processes do not influence the operation of the module) and inaccessibility to central processes (the operation of the module is not open to introspection). I have suggested here that Greene thinks that both of these are true of the emotional process: the first because the alarm bell is nonnegotiable and the second because of moral dumbfounding.

References

  1. Greene, J.D., R.B. Sommerville, L.E. Nystrom, J.M. Darley, and J.D. Cohen. 2001. An fMRI Investigation of Emotional Engagement in Moral Judgment. Science 293: 2105–2108.

  2. Greene, J.D. 2008. The Secret Joke of Kant’s Soul. In Moral Psychology, vol. 3: The Neuroscience of Morality: Emotion, Disease, and Development, ed. W. Sinnott-Armstrong, 35–79. Cambridge: MIT Press.

  3. Greene, J.D. 2013. Moral Tribes: Emotion, Reason, and the Gap Between Us and Them. New York: Penguin Press.

  4. Greene, J.D., and J.D. Cohen. 2006. For the Law, Neuroscience Changes Nothing and Everything. In Law and the Brain, ed. S. Zeki and O. Goodenough, 207–226. New York: Oxford University Press.

  5. Greene, J.D. 2011. Social Neuroscience and the Soul’s Last Stand. In Social Neuroscience: Toward Understanding the Underpinnings of the Social Mind, ed. A. Todorov, S. Fiske, and D. Prentice, 263–273. New York: Oxford University Press.

  6. Waldmann, M.R., J. Nagel, and A. Wiegmann. 2012. Moral Judgment. In The Oxford Handbook of Thinking and Reasoning, ed. K.J. Holyoak and R.G. Morrison, 274–299. New York: Oxford University Press.

  7. Greene, J.D., L.E. Nystrom, A.D. Engell, J.M. Darley, and J.D. Cohen. 2004. The Neural Bases of Cognitive Conflict and Control in Moral Judgment. Neuron 44: 389–400.

  8. McGuire, J., R. Langdon, M. Coltheart, and C. Mackenzie. 2009. A Reanalysis of the Personal/Impersonal Distinction in Moral Psychology Research. J Exp Soc Psychol 45(3): 577–580.

  9. Greene, J.D. 2009. Dual-Process Theory and the Personal/Impersonal Distinction: A Reply to McGuire, Langdon, Coltheart, and Mackenzie. J Exp Soc Psychol 45(3): 581–584.

  10. Berker, S. 2009. The Normative Insignificance of Neuroscience. Philos Public Aff 37(4): 293–329.

  11. Cushman, F., L. Young, and J.D. Greene. 2010. Multi-System Moral Psychology. In The Moral Psychology Handbook, ed. J. Doris, G. Harman, S. Nichols, J. Prinz, W. Sinnott-Armstrong, and S. Stich, 47–71. New York: Oxford University Press.

  12. Kahane, G. 2012. On the Wrong Track: Process and Content in Moral Psychology. Mind Lang 27(5): 519–545.

  13. Klein, C. 2011. The Dual Track Theory of Moral Decision-Making: A Critique of the Neuroimaging Evidence. Neuroethics 4: 143–162.

  14. Pessoa, L. 2013. The Cognitive-Emotional Brain: From Interactions to Integration. Cambridge: MIT Press.

  15. Sauer, H. 2012. Morally Irrelevant Factors: What’s Left of the Dual Process-Model of Moral Cognition? Philos Psychol 26(5): 783–811.

  16. Fine, C. 2010. Delusions of Gender: How Our Minds, Society, and Neurosexism Create Difference. New York: W.W. Norton & Company.

  17. Shallice, T. 1988. From Neuropsychology to Mental Structure. Cambridge: Cambridge University Press.

  18. Mendez, M.F., E. Anderson, and J.S. Shapira. 2005. An Investigation of Moral Judgment in Frontotemporal Dementia. Cogn Behav Neurol 18(4): 193–197.

  19. Koenigs, M., L. Young, R. Adolphs, D. Tranel, F. Cushman, M. Hauser, and A. Damasio. 2007. Damage to the Prefrontal Cortex Increases Utilitarian Moral Judgments. Nature 446(7138): 908–911.

  20. Ciaramelli, E., M. Muccioli, E. Làdavas, and G. di Pellegrino. 2007. Selective Deficit in Personal Moral Judgment Following Damage to Ventromedial Prefrontal Cortex. Soc Cogn Affect Neurosci 2: 84–92.

  21. Greene, J.D. 2010. Notes on “The Normative Insignificance of Neuroscience” by Selim Berker (draft). Retrieved February 1, 2014, from http://www.wjh.harvard.edu/~jgreene/GreeneWJH/Greene-Notes-on-Berker-Nov10.pdf

  22. Damasio, A. 1994. Descartes’ Error: Reason, Emotion, and the Human Brain. New York: G.P. Putnam’s Sons.

  23. Moll, J., and R. de Oliveira-Souza. 2007. Moral Judgments, Emotions, and the Utilitarian Brain. Trends Cogn Sci 11(8): 319–321.

  24. Moll, J., R. Zahn, R. de Oliveira-Souza, F. Krueger, and J. Grafman. 2005. The Neural Basis of Human Moral Cognition. Nat Rev Neurosci 6: 799–809.

  25. Moll, J., R. de Oliveira-Souza, R. Zahn, and J. Grafman. 2008. The Cognitive Neuroscience of Moral Emotions. In Moral Psychology, vol. 3: The Neuroscience of Morality: Emotion, Disease, and Development, ed. W. Sinnott-Armstrong, 1–18. Cambridge: MIT Press.

  26. Greene, J.D. 2007. Why are VMPFC Patients More Utilitarian? A Dual-Process Theory of Moral Judgment Explains. Trends Cogn Sci 11(8): 322–323.

  27. Moll, J., and R. de Oliveira-Souza. 2007. Response to Greene: Moral Sentiments and Reason: Friends or Foes? Trends Cogn Sci 11(8): 323–324.

  28. Moretto, G., E. Làdavas, F. Mattioli, and G. di Pellegrino. 2009. A Psychophysiological Investigation of Moral Judgment After Ventromedial Prefrontal Damage. J Cogn Neurosci 22(8): 1888–1899.

  29. Valdesolo, P., and D. DeSteno. 2006. Manipulations of Emotional Context Shape Moral Judgment. Psychol Sci 17(6): 476–477.

  30. Crockett, M.J., L. Clark, M. Hauser, and T.W. Robbins. 2010. Serotonin Selectively Influences Moral Judgment and Behavior Through Effects on Harm Aversion. Proc Natl Acad Sci 107(40): 17433–17438.

  31. Greene, J.D., S.A. Morelli, K. Lowenberg, L.E. Nystrom, and J.D. Cohen. 2008. Cognitive Load Selectively Interferes With Utilitarian Moral Judgment. Cognition 107(3): 1144–1154.

  32. Bartels, D.M. 2008. Principled Moral Sentiment and the Flexibility of Moral Judgment and Decision Making. Cognition 108: 381–417.

  33. Moore, A.B., B.A. Clark, and M.J. Kane. 2008. Who Shalt Not Kill? Individual Differences in Working Memory Capacity, Executive Control, and Moral Judgment. Psychol Sci 19(6): 549–557.

  34. Moore, A.B., N.Y.L. Lee, B.A.M. Clark, and A.R.A. Conway. 2011. In Defense of the Personal/Impersonal Distinction in Moral Psychology Research: Cross-Cultural Validation of the Dual Process Model of Moral Judgment. Judgment Decis Making 6(3): 186–195.

  35. Paxton, J.M., L. Ungar, and J.D. Greene. 2012. Reflection and Reasoning in Moral Judgment. Cogn Sci 36: 163–177.

  36. Suter, R.S., and R. Hertwig. 2011. Time and Moral Judgment. Cognition 119: 454–458.

  37. Carruthers, P. 2006. The Architecture of the Mind. Oxford: Oxford University Press.

  38. Fodor, J. 1983. The Modularity of Mind. Cambridge: MIT Press.

  39. Cosmides, L., and J. Tooby. 1997. Evolutionary Psychology: A Primer. http://www.psych.ucsb.edu/research/cep/primer.html

  40. Buller, D.J. 2005. Adapting Minds: Evolutionary Psychology and the Persistent Quest for Human Nature. Cambridge: MIT Press.

  41. Richardson, R.C. 2007. Evolutionary Psychology as Maladapted Psychology. Cambridge: MIT Press.

  42. Robbins, P. 2009. Modularity of Mind. In Stanford Encyclopedia of Philosophy. http://plato.stanford.edu/entries/modularity-mind/

  43. Greene, J.D. 2005. Emotion and Cognition in Moral Judgment: Evidence from Neuroimaging. In Neurobiology of Human Values, ed. J.P. Changeux, A.R. Damasio, W. Singer, and Y. Christen, 57–66. Berlin: Springer.

  44. Greene, J.D. 2005. Cognitive Neuroscience and the Structure of the Moral Mind. In The Innate Mind: Structure and Contents, ed. S. Laurence, P. Carruthers, and S. Stich, 338–352. New York: Oxford University Press.

  45. Sanfey, A., J. Rilling, J. Aronson, L. Nystrom, and J. Cohen. 2003. The Neural Basis of Economic Decision-Making in the Ultimatum Game. Science 300: 1755–1758.

  46. Björklund, F., J. Haidt, and S. Murphy. 2000. Moral Dumbfounding: When Intuition Finds No Reason. Lund Psychol Rep 2: 1–23.

  47. Cushman, F.A., L. Young, and M.D. Hauser. 2006. The Role of Conscious Reasoning and Intuition in Moral Judgment: Testing Three Principles of Harm. Psychol Sci 17(12): 1082–1089.

Acknowledgments

Work on this paper was supported by a National Endowment for the Humanities Summer Stipend #FT-59,871-12. I am grateful for feedback on earlier versions of this paper provided by Ginger Hoffman, Dale Miller, and the participants at the second annual New Scholars in Bioethics (NSiB) Symposium: Kirstin Borgerson, Danielle Bromwich, Michael Garnett, Douglas MacKay, and Joseph Millum. Three anonymous reviewers for this journal also provided extremely valuable feedback.

Author information

Correspondence to Robyn Bluhm.

Cite this article

Bluhm, R. No Need for Alarm: A Critical Analysis of Greene’s Dual-Process Theory of Moral Decision-Making. Neuroethics 7, 299–316 (2014). https://doi.org/10.1007/s12152-014-9209-0
