
Cognition

Volume 106, Issue 3, March 2008, Pages 1093-1108

The cost of thinking about false beliefs: Evidence from adults’ performance on a non-inferential theory of mind task

https://doi.org/10.1016/j.cognition.2007.05.005

Abstract

Much of what we know about other people’s beliefs comes non-inferentially from what people tell us. Developmental research suggests that 3-year-olds have difficulty processing such information: they suffer interference from their own knowledge of reality when told about someone’s false belief (e.g., [Wellman, H. M., & Bartsch, K. (1988). Young children’s reasoning about beliefs. Cognition, 30, 239–277.]). The current studies examined for the first time whether similar interference occurs in adult participants. In two experiments participants read sentences describing the real colour of an object and a man’s false belief about the colour of the object, then judged the accuracy of a picture probe depicting either reality or the man’s belief. Processing costs for picture probes depicting reality were consistently greater in this false belief condition than in a matched control condition in which the sentences described the real colour of one object and a man’s unrelated belief about the colour of another object. A similar pattern was observed for picture probes depicting the man’s belief in most cases. Processing costs were not sensitive to the time available for encoding the information presented in the sentences: costs were observed when participants read the sentences at their own pace (Experiment 1) or at a faster or a slower pace (Experiment 2). This suggests that adults’ difficulty was not with encoding information about reality and a conflicting false belief, but with holding this information in mind and using it to inform a subsequent judgement.

Introduction

Theory of mind (ToM) describes a set of abilities used to explain or predict behaviour in terms of mental states such as beliefs, desires and intentions. False belief tasks are a very widely used test of these abilities (Wellman et al., 2001, Wimmer and Perner, 1983). In one typical false belief task a story character, Sally, places her marble in a basket, then goes outside to play. In her absence, a second character, Anne, moves the marble from the basket to a box, with the result that Sally has a false belief about the marble’s location. Participants are then asked test questions that require them to infer Sally’s false belief in order to say where Sally thinks the marble is located, or to predict where Sally will first look to find her marble (Baron-Cohen, Leslie, & Frith, 1985). Three-year-old children commonly fail such tasks by judging from their own knowledge of the object’s location, rather than Sally’s false belief. Four-year-olds typically answer correctly, but the cognitive processes responsible for such success in children or in adults are not well-understood. Only quite recently have researchers begun to investigate the cognitive basis of ToM in typical adults, and surprising gaps remain in our understanding. For example, current studies of adults systematically confound the process of inferring a mental state with any processes involved in simply representing this information. This is an important confound for two reasons. First, much of what we know about other people’s mental states comes non-inferentially when they, or some other person, simply tell us what they know, think, want or intend. Thus, it is important to know how such information is encoded and held in mind. Second, in developmental research it is known that ToM inferences are not children’s only problem: even when simply told about someone’s false belief, three-year-olds make errors when asked to judge what the person will think or do, apparently suffering severe interference from their own knowledge of reality (de Villiers and Pyers, 2002, Flavell et al., 1990, Wellman and Bartsch, 1988). The current study is the first investigation of adults’ mental representation of ToM information that de-confounds the need to hold ToM information in mind from the need to make a ToM inference.

Behavioural studies of ToM in adults have tested participants’ ability to make inferences about mental states. Adults often make errors when asked to evaluate statements that require inferences about mental states with multiple embeddings (e.g., “Bob thinks that John knew that Mary wanted to go to the shop”; Kinderman et al., 1998, Rutherford, 2004). Adults typically make few errors when they only have to infer one person’s belief, or one person’s belief about another’s belief (e.g., Fletcher et al., 1995, Stone et al., 1998). However, the likelihood of error is increased if adults perform a concurrent task designed to tax working memory or other components of executive function (e.g., Bull et al., submitted for publication, McKinnon and Moscovitch, 2007). Relatedly, German and Hehman (2006) suggest that adults are slower and more error-prone when inferring combinations of mental states that place relatively high demands on inhibitory control (e.g., false belief plus negative desire) compared with combinations that make lower demands (e.g., true belief plus positive desire). Interestingly, even in the case of very simple false beliefs where adults would be unlikely to infer the belief incorrectly, adults may nonetheless show biases in the probabilities they attach to the likely behaviour of the person with the false belief (Birch & Bloom, 2007). Overcoming such a “curse of knowledge” may require executive control.

The above findings indicate a significant role for memory and executive resources in adults’ performance of ToM tasks. However, the conclusions warranted by these findings are limited for two important reasons. First, it is widely agreed that ToM tasks make significant demands upon executive processes that have nothing to do with ToM per se (e.g., Apperly et al., 2005, Bloom and German, 2000). Increasing task complexity, or placing adults under cognitive load, may lead to errors on ToM tasks because adults struggle to meet these incidental demands. Firm conclusions about a necessary role for executive processes are only warranted if these incidental demands are adequately controlled for, and the kind of check questions most commonly employed in ToM tasks fall short of achieving this (Apperly et al., 2005). Second, even when tasks do include well-matched comparison conditions or control trials (e.g., German & Hehman, 2006) it is unclear whether executive resources are required for inferring mental states, for holding this information in mind during the task, for formulating an answer to the test questions using the relevant ToM information, or for all of these processes. Separating these ToM processes in tasks with appropriate control trials is essential if the role of memory and executive processes in ToM is to be understood. The current paper speaks to this question by investigating how adults hold ToM information in mind.

Young children have difficulty processing ToM information even when they do not have to make a ToM inference. Wellman and Bartsch (1988) simply told children about a story character’s false belief and the corresponding reality (e.g., “Sam thinks the puppy is in the garage/The puppy is really on the porch”). When asked where Sam would look for his puppy, most 3-year-olds and young 4-year-olds judged that he would look on the porch, making the same “reality” error observed in more standard tasks where it is necessary to infer a false belief (e.g., Wellman et al., 2001; see also Flavell et al., 1990). In a task designed to assess children’s processing of embedded complement syntax, de Villiers and Pyers (2002) told children short stories and then summarized the false belief of a story character, e.g., “He thought he found a ring, but really, it was a bottle cap”. The experimenter then pointed to a picture of the character and asked “What did he think?” Many 3-year-olds judged incorrectly that he thought he found a bottle cap. Clearly, children have difficulty processing ToM information even when no inference is necessary. Although children’s non-inferential ToM reasoning has received relatively little theoretical attention, the literature on children’s general ToM development offers a number of potential interpretations for children’s difficulties, and these make different predictions about the pattern that might be observed in adults.

It is commonly suggested that children fail false belief tasks because they lack a concept of belief (e.g., Gopnik and Wellman, 1992, Perner, 1991, Wellman et al., 2001), and without the necessary concept it would be natural for children to have difficulty with non-inferential processing of information about beliefs. de Villiers and Pyers (2002) suggest that younger children make errors because they incorrectly process the embedded complement clauses of belief statements such as “He thought he found a ring” (though see e.g., Lillard and Flavell, 1992, Perner et al., 2003, Smith et al., 2003, for data that seem inconsistent with this view). Importantly, neither the conceptual nor the syntactic explanation for children’s errors would predict that adults would have any difficulty processing reports of false beliefs, since adults have a mature concept of belief and mature processing of the syntax of belief statements.

Many authors have proposed that children fail false belief tasks because they lack the necessary executive control (e.g., Carlson and Moses, 2001, Leslie et al., 2004, Mitchell, 1996, Russell, 1996, Zelazo et al., 2003). One suggestion is that executive control is necessary for the emergence of ToM concepts such as belief, perhaps by enabling children to disengage from the immediate objects of their attention (e.g., Carlson and Moses, 2001, Russell, 1996). If a concept of belief has not yet emerged in children’s thinking then it is unsurprising that they have difficulty on non-inferential false belief tasks. However, emergence accounts could not explain any difficulty that adults might have on non-inferential false belief tasks since the requisite ToM concepts would already have emerged.

Another suggestion is that executive control is necessary for the expression of a belief concept, perhaps by enabling children to overcome default ascription of true beliefs (e.g., Leslie et al., 2004) or to resist any tendency to respond on the basis of their own knowledge rather than what the other person believes (e.g., Carlson and Moses, 2001, Russell, 1996). This role might exist whether the concept of belief is innate or emergent. It has also been suggested that any reasoning about false beliefs necessarily requires a certain level of executive control, without which children lack a proper concept of beliefs (e.g., Russell, 1996). Either of these accounts could explain young children’s difficulties on non-inferential tasks, and on either account it is possible that ToM in adults also makes demands on executive control.

In sum, accounts of ToM development differ in their ability to explain any processing costs observed in adults’ non-inferential ToM processing. Thus, as well as being informative about how adults process ToM information, data from adults can also play a valuable role in constraining interpretation of the development of ToM.

The developmental literature suggests that children find it hard to resist interference from knowledge of reality when they are told about a false belief and must simply hold this information briefly in mind. The current studies tested whether adults would show analogous difficulties on a non-inferential false belief task. To test this we presented adults with information about a situation, e.g., “Really, the ball on the table is yellow”, and information about someone’s false belief about the situation, e.g., “He thinks that the ball on the table is red”. Note that these sentences are in no way contradictory – it can be simultaneously true that the ball on the table is yellow and that someone thinks (falsely) that it is red. However, as with any false belief, the propositional content of the man’s belief is clearly in conflict with reality, raising the possibility of processing costs for such information. Subsequently, we tested adults’ ability to formulate judgements about this information by asking them to judge the accuracy of a picture probe that either depicted reality, or the man’s false belief. We used pictures rather than sentences as probes to ensure that participants could not judge the probes on the basis of superficial similarity with the initial sentences in which information about belief and reality was presented.

Whereas the developmental literature suggests that young children may actually be unable to process such information correctly, adults would generally be expected to succeed. Thus, any processing costs for adults are likely to be evident in the time it takes to formulate a judgement, with only occasional errors. Importantly, a number of factors that are irrelevant to processing false beliefs per se will also contribute to these processing costs. Besides any problem with holding in mind someone’s false belief, the participant must remember two sets of information about objects, locations and colours, and assign one set to the man and one set to reality (Object on the table is yellow/He thinks object on the table is red). Thus, evaluating the specific processing costs of our False Belief/Reality (False B/R) condition requires comparison against a baseline condition that also poses these incidental processing demands but in which there is no conflict between belief and reality.

In the developmental literature it is often noted that children pass true belief tasks before false belief tasks (e.g., Leslie et al., 2004, Wellman et al., 2001), and a true belief condition might, at first sight, appear a suitable baseline in the current study (i.e., “Really the ball on the table is yellow/He thinks the ball on the table is yellow”). However, a true belief condition presents the participant with a simple strategy for reducing processing costs by reducing the information they must hold in mind: all they need remember is a single set of information about object location and colour, and a single fact about the man (i.e., “Ball on the table is yellow/Man is right”). Therefore, a true belief condition might be easier to process than a False B/R condition merely because participants had to remember less information, not because there was no interference between belief and reality information. Thus, although we included true belief trials as filler items (see below) we did not think them a suitable experimental baseline.

Instead of a true belief baseline condition, we presented participants with information about reality and an unrelated belief, e.g., “Really the ball on the table is yellow”/“He thinks the ball on the chair is red”. Like the False B/R condition, the Unrelated B/R condition did not allow participants an obvious shortcut to remembering the necessary information: participants had to remember two sets of information about objects, locations and colours, and assign one set to the man and one set to reality (Ball on the table is yellow/He thinks the ball on the chair is red). Indeed, since there are different locations in the Unrelated B/R condition (table and chair) there are more distinct facts to remember than in the False B/R condition. Critically however, whereas the content of a false belief is in conflict with reality (this, recall, is often thought to be the critical source of difficulty for children), in the Unrelated B/R condition there is no conflict between reality and the content of the man’s belief.
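To make the two critical conditions concrete, the minimal Python sketch below constructs example sentence pairs of the kind described above. It is purely illustrative: the study did not publish stimulus-generation code, and the function and parameter names here are hypothetical.

```python
# Illustrative sketch only: hypothetical construction of sentence pairs for the
# two critical conditions (False B/R and Unrelated B/R). The original study's
# stimuli were not generated with this code.

def make_trial(obj, location, other_location, real_colour, belief_colour, condition):
    """Return (Sentence 1, Sentence 2) for one trial.

    In both conditions the participant must hold two object-location-colour
    facts in mind and assign one to the man and one to reality.
    """
    reality = f"Really, the {obj} on the {location} is {real_colour}."
    if condition == "false_belief":
        # Belief about the SAME object and location, so its content conflicts with reality.
        belief = f"He thinks that the {obj} on the {location} is {belief_colour}."
    elif condition == "unrelated_belief":
        # Belief about an object in a DIFFERENT location, so there is no conflict with reality.
        belief = f"He thinks that the {obj} on the {other_location} is {belief_colour}."
    else:
        raise ValueError(f"unknown condition: {condition}")
    return reality, belief

# Example trials matching the sentences quoted in the text:
print(make_trial("ball", "table", "chair", "yellow", "red", "false_belief"))
print(make_trial("ball", "table", "chair", "yellow", "red", "unrelated_belief"))
```

The only difference between the two conditions is whether the belief sentence refers to the same location as the reality sentence; this is what creates, or removes, the conflict between the content of the man’s belief and reality, while the amount of information to be remembered is closely matched.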

Different accounts of how adults might mentally represent the information that they have been given about belief and reality yield different predictions about the relative processing costs of False B/R and Unrelated B/R conditions. One possibility is that adults do not integrate the information presented in the two sentences, with the result that the “falseness” (and thus conflict between belief and reality) of the False B/R condition is not represented. If this were the case then we would expect the processing costs to be determined solely by the number of distinct pieces of information that participants had to hold in mind and, if anything, we should observe higher costs for the Unrelated B/R condition (where the objects are described in different locations) than for the False B/R condition (where there is only one location). On the other hand, if adults do integrate the information presented in the two sentences into a coherent representation then the “falseness” (and thus conflict between belief and reality) in the False B/R condition would be represented, and we might expect higher processing costs in the False B/R condition than in the Unrelated B/R condition. Moreover, if processing costs in adults correspond directly to the “reality bias” pattern of difficulty observed in young children, or the “curse of knowledge” observed in adults (e.g., Birch & Bloom, 2007) then in the False B/R condition we would expect the processing cost to be highest for judgements about beliefs, and lower for judgements about reality.

Section snippets

Participants

Sixteen undergraduate students (13 female, all right-handed) participated for course credit. Participants ranged in age from 18 to 28 years (mean = 20 years, SD = 2.28 years).

Design and procedure

Each trial consisted of three events: Sentence 1, Sentence 2 and a picture probe (see Fig. 1). N.B., picture probes appeared in colour, so the colour of the object in the box could be directly observed. Participants judged whether the picture probe accurately represented the situation described in the sentences. The critical

Participants

Thirty-two undergraduate students (27 female, 29 right-handed) participated for course credit. Participants ranged in age from 18 to 40 years (mean = 21 years, SD = 3.9 years).

Design and procedure

The method was the same as for Experiment 1 in all respects except that participants no longer determined for themselves the time available for reading Sentences 1 and 2. Instead, half of the participants saw each sentence for just 1500 ms (>300 ms faster than the 1838 ms average reading time in Experiment 1, and judged “fast” by

General discussion

The non-inferential false belief task allows us to investigate how adults mentally represent ToM information independent of any need to make ToM inferences. Adult participants showed greater processing costs (were slower and/or more error-prone) when informed about a false belief, the content of which conflicted with reality, than when informed about a belief whose content did not conflict with reality. This pattern would not be expected if the information in each stimulus sentence was treated

Acknowledgement

This work was supported by a grant from the ESRC: RES-000-23-1419.

References (29)

  • Wimmer, H., & Perner, J. (1983). Beliefs about beliefs: Representation and constraining function of wrong beliefs in young children’s understanding of deception. Cognition.
  • Apperly, I. A., et al. (2006). Is belief reasoning automatic? Psychological Science.
  • Birch, S. A. J., & Bloom, P. (2007). The curse of knowledge in reasoning about false beliefs. Psychological Science.
  • Bull, R., Phillips, L. H., & Conway, C. (submitted for publication). The role of control functions in mentalising: Dual...