Conflict monitoring in dual process theories of thinking☆
Introduction
In the spring of 2006, racial tensions in Belgium rose to a boiling point after a white, Belgian high school student was violently stabbed to death by two youths thought to be of African descent. A striking aspect of the sad case was how readily many civilians, politicians, and media outlets were willing to blame the African community on the basis of initial rumors. The violent murder fit people’s stereotypical (but mistaken) beliefs about Africans’ aggressive and criminal nature. What most people disregarded was that, as in most European countries, African immigrants are just a small minority group in Belgium: they are outnumbered by a factor of ten by people with European roots. Logically speaking, in the absence of clear evidence to the contrary, it is therefore far more likely that an assailant will come from another ethnic group. However, many people neglected this information and readily believed the initial reports about the involvement of the African youths. The ungrounded accusations backfired when, two weeks later, the actual culprits were identified as Europeans.
The above case is a regrettable illustration of a common human tendency to base judgments on prior beliefs and intuition rather than on a logical reasoning process. Over the last decades numerous studies have shown that this tendency biases performance in many classic reasoning and decision making tasks (Evans, 2002; Tversky & Kahneman, 1974).
Influential dual process theories of thinking have explained people’s “rational thinking failure” by positing two different human reasoning systems (e.g., Epstein, 1994; Evans, 1984, 2007b; Evans & Over, 1996; Goel, 1995; Kahneman, 2002; Kahneman & Frederick, 2005; Sloman, 1996; Stanovich & West, 2000). Dual process theories come in many flavors, but generally they assume that a first system (often called the heuristic system) will tend to solve a problem by relying on prior knowledge and beliefs, whereas a second system (often called the analytic system) allows reasoning according to logical standards. The heuristic default system is assumed to operate fast and automatically, whereas the operations of the analytic system are slow and heavily demanding of people’s computational resources. Dual process theories state that the heuristic and analytic system will often act in concert: on these occasions the heuristic default system provides us with fast, frugal, and correct conclusions. However, the prepotent heuristics can also bias reasoning in situations that require more elaborate, analytic processing. That is, the two systems will sometimes conflict and cue different responses. In these cases the analytic system will need to override the belief-based response generated by the heuristic system (Stanovich & West, 2000).
Although the dual process framework has been very influential, it has also been criticized. Many researchers have pointed out that the differential processing characteristics of the two systems are not sufficiently specified: Dual process theories nicely describe “what” the two systems do, but it is not clear “how” the systems actually operate (Evans, 2007a; Gigerenzer & Regier, 1996; Osman, 2004; Reyna et al., 2003; Stanovich & West, 2000). The characterization of the conflict detection process is a crucial case in point. Dual process theories generally state that the analytic system monitors the output of the heuristic system. When a conflict with analytic knowledge (e.g., sample size considerations) is detected, the analytic system will attempt to intervene and inhibit the prepotent heuristic response. A look at the literature, however, reveals widely different views on the efficiency of this conflict monitoring component during judgment and decision making, and consequently different characterizations of the nature of the dominant reasoning error. The classic work of Evans (1984) and Kahneman and colleagues (e.g., Kahneman & Frederick, 2002), for example, claims that the monitoring of the heuristic system is quite lax. It is assumed that by default people will tend to rely on the heuristic route without taking analytic considerations into account. In some cases people can detect the conflict and the analytic system will intervene, but typically this will be quite rare. Most of the time people will simply not be aware that their response might be incorrect from a normative point of view. As Kahneman and Frederick (2005, p. 274) put it: “People who make a casual intuitive judgement normally know little about how their judgment came about and know even less about its logical entailments”. Thus, in this view people mainly err because they fail to detect a conflict.
In the work of Epstein (1994) and Sloman (1996) one finds a remarkably different view on conflict monitoring and the nature of reasoning errors. These authors assume that in general the heuristic and analytic routes are simultaneously activated and that people typically do experience a conflict between the two types of reasoning. People would “simultaneously believe two contradictory responses” (Sloman, 1996, p. 11) and therefore “behave against their better judgement” (Denes-Raj & Epstein, 1994, p. 1) when they err. Thus, people would take analytic considerations into account and notice that these conflict with the heuristically cued belief. The problem, however, is that they do not always manage to override the compelling heuristics. In this view there is nothing wrong with the conflict detection process. Errors arise because people fail to inhibit the prepotent heuristic beliefs. Sloman argued that classic reasoning tasks can be thought of as perceptual illusions in this respect. In the Müller–Lyer illusion, for example, perception tells us that one line is longer than the other while logic tells us that it is not. Even though we can measure the lines and know they are of equal length, our perception of them does not change. We simultaneously experience two contradictory beliefs. In order to correctly answer the question about the length of the lines we need to override the erroneous heuristic perception.
In a recent review, Evans (2007a) has pointed to the inconsistencies in the field. Evans’ work indicates that different views on conflict monitoring are not only linked with different views on the nature of reasoning errors (i.e., conflict detection or inhibition failure) but also with a different characterization of the interaction between the analytic and heuristic system (i.e., parallel or serial). Sloman and Epstein assume that whenever people are confronted with a reasoning problem both routes will process it simultaneously. People take analytic considerations into account right from the start and detect possible conflicts with heuristically cued beliefs. Here it is believed that both systems operate in parallel. In Kahneman’s framework and Evans’ own dual process model, however, only the heuristic route is initially activated. The analytic system is assumed to monitor the output of the heuristic system and might intervene in a later stage when a conflict is detected. As Evans noted, here the interplay between the two systems has a more serial nature.
Based on the available data it is hard to decide between the different models and determine which conflict detection view is correct. Sloman (1996) and Epstein (1994), for example, refer to the outcome of perspective-change and instruction experiments in support of their views. It has indeed been shown that simply instructing people to evaluate problems “from the perspective of a statistician” helps boost their performance. In the same vein, Sloman stresses the casual observation that people often have no trouble recognizing their error once it is explained to them. Such observations do suggest that people have ready access to two different modes of reasoning and can easily switch between them. However, they do not show that both routes are activated simultaneously. No matter how easily one takes analytic considerations into account when prompted, one cannot conclude that this knowledge was also activated during reasoning in the absence of these prompts.
More compelling evidence for successful conflict detection during decision making comes from a number of intriguing anecdotes and spontaneous reports. Epstein and colleagues (Epstein, 1994; Denes-Raj & Epstein, 1994; Epstein & Pacini, 1999), for example, repeatedly noted that when picking an erroneous answer participants spontaneously commented that they did “know” that the response was wrong but stated they picked it because it “felt” right. Sloman (1996) cites evolutionary biologist Stephen Jay Gould, who relates experiencing a similar conflict between his logical knowledge and a heuristically cued stereotypical belief when solving Kahneman and Tversky’s infamous “Linda” problem.1 The problem, however, is that spontaneous self-reports and anecdotes are not hard empirical data. This is perhaps best illustrated by the fact that Kahneman (2002, p. 483) also refers to “casual observation” of his participants to suggest that only in “some fraction of cases, a need to correct the intuitive judgements and preferences will be acknowledged”. It is clear that in order to conclude something about the efficiency of conflict detection we need a straightforward empirical test that establishes precisely how frequently people experience this conflict. The present study addresses this issue.
Experiment 1 adopted a thinking-aloud procedure (e.g., Ericsson & Simon, 1980). The thinking-aloud procedure is designed to yield reliable information about the course of cognitive processes. Participants are simply instructed to continually speak aloud the thoughts that are in their head as they solve a task. Thinking-aloud protocols have been shown to have superior validity compared to interpretations based on retrospective questioning or people’s spontaneous remarks (Ericsson & Simon, 1993; Payne, 1994).
Participants were asked to solve problems that were modeled after Kahneman and Tversky’s (1973) classic base rate neglect problems. In these problems people first get information about the composition of a sample (e.g., a sample with 995 females and 5 males). People are told that short personality descriptions were made of all the participants and that they will see one description drawn randomly from the sample. Consider the following example:
In a study 1000 people were tested. Among the participants there were 5 men and 995 women. Jo is a randomly chosen participant of this study.
Jo is 23 years old and is finishing a degree in engineering. On Friday nights, Jo likes to go out cruising with friends while listening to loud music and drinking beer.
What is most likely?
a. Jo is a man
b. Jo is a woman
Given Kahneman and Tversky’s (1973) classic findings one can expect that in the majority of cases people will err and pick the heuristically cued response in this task. The crucial question is whether people’s verbal protocols indicate that they nevertheless take analytic considerations into account. In this task “analytic considerations” can be operationalized as referring to the group size information during the reasoning process (e.g., “ … because Jo’s drinking beer and loud I guess Jo’ll be a guy, although there were more women…”). Such a basic sample size reference during the reasoning process can be considered a minimal indication of successful conflict monitoring: it shows that this information is not simply neglected. If Sloman and Epstein’s idea about the parallel operation of the heuristic and analytic route is correct, such references should be found in the majority of cases. If Kahneman and Evans’ idea about the lax nature of conflict monitoring is correct, people will simply not be aware that the base rates are relevant and should hardly ever mention them during decision making.
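The normative pull of the base rates in such problems can be made concrete with Bayes’ rule. The sketch below is not part of the original study; the diagnosticity values for the description are hypothetical, chosen only to illustrate that even a strongly stereotyped description cannot overturn a 5-in-1000 base rate.

```python
# Hypothetical illustration (not from the original study): Bayes' rule
# applied to the example problem with 5 men and 995 women.

def posterior_man(prior_man, p_desc_given_man, p_desc_given_woman):
    """P(man | description) for the two-group case via Bayes' rule."""
    prior_woman = 1.0 - prior_man
    evidence = (p_desc_given_man * prior_man
                + p_desc_given_woman * prior_woman)
    return p_desc_given_man * prior_man / evidence

# Base rate from the sample: 5 men out of 1000 participants.
prior = 5 / 1000

# Even if the stereotyped description were 20 times more likely to be
# written about a man than about a woman (hypothetical values)...
p = posterior_man(prior, p_desc_given_man=0.20, p_desc_given_woman=0.01)

# ...the posterior probability of "man" remains low, so the normatively
# correct answer is still "woman".
print(round(p, 3))  # prints 0.091
```

The point of the arithmetic is simply that, under any plausible assumption about how diagnostic the description is, the extreme base rates should dominate the judgment; picking "man" therefore conflicts with elementary probabilistic considerations.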
It should be noted that both camps in the conflict monitoring debate, like the reasoning field at large, have conceptualized the conflict between the analytic and heuristic system as a consciously experienced, verbalizable event. Conflict monitoring is considered a controlled process arising from the central executive component of working memory. Since James (1890) there has indeed been a long tradition in psychology of considering such central, controlled (vs. automatic) processing as consciously experienced (Feldman Barrett, Tugade, & Engle, 2004). However, the available evidence from the cognitive literature suggests that this need not always be the case (e.g., Pashler et al., 2001; Shiffrin, 1988). Although controlled processing can occur with a feeling of conscious deliberation and choice, it need not (Feldman Barrett et al., 2004).
While thinking aloud is held to be an excellent method to tap the content of conscious thinking, it cannot provide information about cognitive processes that do not reach the conscious mind (Crutcher, 1994). Consequently, even if participants do not verbalize their experience of the conflict, one cannot exclude the possibility that conflict monitoring was nevertheless successful. To capture such implicit detection, participants in our study were also presented with an unannounced recall test. After a short break following the thinking-aloud phase, participants were asked to answer questions about the group sizes in the previous reasoning task. If people have successfully detected the conflict, this implies that the group size has been taken into account and that people spent some time processing it. Indeed, detection of the conflict should trigger analytic system intervention, which should result in further scrutinizing of the sample information. In sum, successful conflict detection should be accompanied by deeper processing of the base rate information, which should benefit recall. This recall index does not require that the conflict is consciously experienced and verbalizable.2
To validate the recall hypothesis participants were also presented with additional control problems. In the classic base rate problems the description of the person is composed of common stereotypes of the smaller group, so that base rates and description disagree. In addition to these classic problems we also presented problems where base rates and description cued the same response. In these congruent problems the description of the person was composed of stereotypes of the larger group (e.g., Ferreira, Garcia-Marques, Sherman, & Garrido, 2006). Hence, contrary to the classic (i.e., incongruent) problems, base rates and description did not conflict and the response could be correctly based on the salient description without further analytic intervention or processing. For a reasoner who neglects the base rates and does not detect the conflict on the classic problems, the two types of problems will be indistinguishable and base rate recall should not differ. However, if one does detect the conflict, the added analytic processing of the base rates should result in better recall for the classic problems than for the congruent control problems.
In Experiment 2 the conflict monitoring issue is further examined by focusing on participants’ problem processing time. A core characteristic of analytic reasoning is that it is slow and time-consuming (e.g., Evans, 2003, Sloman, 1996). While the analytic base rate scrutinizing associated with conflict detection might benefit subsequent recall, it will also take up some additional processing time. Reasoning latencies thereby provide an additional test of the opposing conflict monitoring views. One may assume that people will be fastest to solve the congruent control items since the response can be fully based on mere heuristic reasoning without any further analytic intervention. Correctly solving the classic problems should be slowest since it requires people to detect the conflict and inhibit the heuristic response which are both conceived as time-demanding processes (e.g., De Neys, 2006a). The crucial question concerns the processing time of erroneously solved incongruent problems (i.e., responses on the classic problems based on the description). If people simply fail to detect the conflict and reason purely heuristically, reasoning latencies for incorrectly solved incongruent and correctly solved congruent problems should not differ. If people do detect the conflict, they should take longer to respond to the incongruent problems. Consequently, reasoning latencies for the incorrectly solved incongruent problems should fall somewhere in between those of correctly solved incongruent problems and congruent control problems.
To validate the idea that upon conflict detection people spend specific time processing the base rates, Experiment 2 also introduces a rudimentary “moving window” procedure (e.g., Just, Carpenter, & Woolley, 1982). In the experiment the group size information and the description are presented separately. First, the base rates are presented on a computer screen. Next, the description and question are presented and the base rates disappear. Participants have the option of visualizing the base rates again by holding a specific button down. Such base rate reviewing can be used as an additional conflict detection index. One might expect that when people detect that the description conflicts with the previously presented base rates, they will spend extra time scrutinizing or “double checking” the base rates. With the present procedure the time spent visualizing the base rates can be used as a measure of this reviewing tendency. Longer overall response latencies after successful conflict detection should thus be accompanied by a stronger tendency to visualize the base rates. If people simply neglect the base rates, there is also no reason to review and visualize them after the initial presentation.
Experiment 1
Participants in Experiment 1 solved a set of base rate problems while thinking aloud. In the classic, incongruent problems base rates and description conflicted whereas in the congruent problems base rates and description were consistent. In addition, participants also received a set of neutral problems where the description only mentioned characteristics that were neutral with respect to group membership (e.g., “the person has black hair and blue eyes”). In these problems the description will
Experiment 2
In Experiment 2 the findings of Experiment 1 are further validated. Participants solved similar base rate problems but were no longer requested to think aloud. Experiment 2 focused on participants’ problem processing time. While the analytic base rate scrutinizing associated with conflict detection might benefit subsequent recall, it will also take up some additional processing time. Reasoning latencies thereby provide an additional test of the opposing conflict monitoring views. One may assume
General discussion
The present study contrasted opposite views on conflict monitoring in dual process theories of reasoning and decision making. According to Kahneman and colleagues (e.g., Kahneman, 2002, Kahneman and Frederick, 2005) and the classic work of Evans (1984) conflict monitoring is typically quite lax. It is assumed that most of the time people rely exclusively on the heuristic route while making decisions without taking analytic considerations into account. In this view, people are typically biased
Acknowledgments
Wim De Neys is a Post Doctoral Fellow of the Flemish Fund for Scientific Research (Post doctoraal Onderzoeker FWO – Vlaanderen). Experiment 1 was conducted during a stay at Vinod Goel’s lab at York University, Toronto (Canada).
References (53)
- Botvinick, M. M., Cohen, J. D., & Carter, C. S. (2004). Conflict monitoring and anterior cingulate cortex: An update. Trends in Cognitive Sciences.
- Evans, J. St. B. T. (2003). In two minds: Dual process accounts of reasoning. Trends in Cognitive Sciences.
- Two storage systems in free recall. (1966). Journal of Verbal Learning and Verbal Behavior.
- Ball, L. J., et al. (2006). Effects of belief and logic on syllogistic reasoning: Eye-movement evidence for selective processing models. Experimental Psychology.
- Burns, B. D., & Wieth, M. (2004). The collider principle in causal reasoning: Why the Monty Hall dilemma is so hard. Journal of Experimental Psychology: General.
- Botvinick, M. M., Braver, T. S., Barch, D. M., Carter, C. S., & Cohen, J. D. (2001). Conflict monitoring and cognitive control. Psychological Review.
- Crutcher, R. J. (1994). Telling what we know. Psychological Science.
- Denes-Raj, V., & Epstein, S. (1994). Conflict between intuitive and rational processing: When people behave against their better judgement. Journal of Personality and Social Psychology.
- De Neys, W. (2006). Automatic-heuristic and executive-analytic processing in reasoning: Chronometric and dual task considerations. Quarterly Journal of Experimental Psychology.
- De Neys, W. (2006). Dual processing in reasoning: Two systems but one reasoner. Psychological Science.
- De Neys, W., et al. Working memory and everyday conditional reasoning: Retrieval and inhibition of stored counterexamples. Thinking & Reasoning.
- Epstein, S. (1994). Integration of the cognitive and psychodynamic unconscious. American Psychologist.
- Epstein, S., & Pacini, R. (1999). Some basic issues regarding dual-process theories from the perspective of cognitive–experiential self-theory.
- Ericsson, K. A., & Simon, H. A. (1980). Verbal reports as data. Psychological Review.
- Ericsson, K. A., & Simon, H. A. (1993). Protocol analysis: Verbal reports as data.
- Evans, J. St. B. T. (1984). Heuristic and analytic processing in reasoning. British Journal of Psychology.
- Evans, J. St. B. T. (2002). Logic and human reasoning: An assessment of the deduction paradigm. Psychological Bulletin.
- Evans, J. St. B. T. (2007). On the resolution of conflict in dual process theories of reasoning. Thinking & Reasoning.
- Evans, J. St. B. T. (2006). The heuristic-analytic theory of reasoning: Extension and evaluation. Psychonomic Bulletin & Review.
- Evans, J. St. B. T., & Over, D. E. (1996). Rationality and reasoning.
- Feldman Barrett, L., Tugade, M. M., & Engle, R. W. (2004). Individual differences in working memory capacity and dual-process theories of the mind. Psychological Bulletin.
- Ferreira, M. B., et al. Automatic and controlled components of judgment under uncertainty. Proceedings of the Cognitive Science Society.
- Gigerenzer, G., Hell, W., & Blank, H. (1988). Presentation and content: The use of base rates as a continuous variable. Journal of Experimental Psychology: Human Perception and Performance.
- Gigerenzer, G., & Regier, T. (1996). How do we tell an association from a rule? Comment on Sloman (1996). Psychological Bulletin.
- Goel, V. (1995). Sketches of thought.
- Handley, S. J., et al. (2004). Working memory, inhibitory control, and the development of children’s reasoning. Thinking & Reasoning.
☆ This manuscript was accepted under the editorship of Jacques Mehler.