We propose a critique of normativism, defined as the idea that human thinking reflects a normative system against which it should be measured and judged. We analyze the methodological problems associated with normativism, proposing that it invites the controversial “is-ought” inference, much contested in the philosophical literature. This problem is triggered when there are competing normative accounts (the arbitration problem), as empirical evidence can help arbitrate between descriptive theories, but not between normative systems. Drawing on linguistics as a model, we propose that a clear distinction between normative systems and competence theories is essential, arguing that equating them invites an “is-ought” inference: to wit, supporting normative “ought” theories with empirical “is” evidence. We analyze in detail two research programmes with normativist features – Oaksford and Chater’s rational analysis and Stanovich and West’s individual differences approach – demonstrating how, in each case, equating norm and competence leads to an is-ought inference. Normativism triggers a host of research biases in the psychology of reasoning and decision making: focusing on untrained participants and novel problems, analyzing psychological processes in terms of their normative correlates, and neglecting philosophically significant paradigms when they do not supply clear standards for normative judgement. For example, in a dual-process framework, normativism can lead to a fallacious “ought-is” inference, in which normative responses are taken as diagnostic of analytic reasoning. We propose that little can be gained from normativism that cannot be achieved by descriptivist computational-level analysis, illustrating our position with Hypothetical Thinking Theory and the theory of the suppositional conditional. We conclude that descriptivism is a viable option, and that theories of higher mental processing would be better off freed from normative considerations.
Normativism, the approach that judges human rationality by comparison against normative standards, has recently come under intensive criticism as unsuitable for psychological enquiry, and it has been suggested that it should be replaced with a descriptivist paradigm. My goal in this paper is to outline and defend a meta-theoretical framework for such a paradigm, grounded rationality, based on the related principles of descriptivism and (moderate) epistemic relativism. Bounded rationality takes into account universal biological and cognitive limitations on human rationality. Grounded rationality accepts universal constraints but adds cognitive variability: within-individual variability (dual processing), and individual as well as cultural differences. I discuss the implications of grounded rationality for dual processing, proposing that investing limited cognitive resources in analytic processing might be less instrumentally rational for individuals with low cognitive ability.
Originally identified by Hume, the validity of is–ought inference is much debated in the meta-ethics literature. Our work shows that inference from is to ought typically proceeds from a contextualised, value-laden causal utility conditional, bridging into a deontic conclusion. Such conditional statements tell us what actions are needed to achieve or avoid consequences that are good or bad. Psychological research has established that people generally reason fluently and easily with utility conditionals. Our own research has also shown that people’s reasoning from is to ought is pragmatically sensitive and adapted to achieving the individual’s goals. But how do we acquire the necessary deontic rules? In this paper, we provide a rationale for this facility linked to Evans’s framework of dual mind rationality. People have an old mind which derives its rationality by repeating what has worked in the past, mostly by experiential learning. New mind rationality, in contrast, is evolutionarily recent, uniquely developed in humans, and draws on our ability to mentally simulate hypothetical events removed in time and place. We contend that the new mind achieves its goals by inducing and applying deontic rules and that a mechanism of deontic introduction evolved for this purpose.
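As a schematic illustration of the inference pattern described here (our paraphrase, not wording from the abstract): a utility conditional such as “If you do action A, then good consequence C follows” supports, by deontic introduction, the conclusion “You ought to do A” (and, with a bad consequence, “You ought not to do A”). In compact form:

\[ \text{If } A \text{ then } C, \quad C \text{ valued as good} \;\Longrightarrow\; A \text{ ought to be done.} \]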
Iterated conditionals of the form If p, then if q, r are an important topic in philosophical logic. In recent years, psychologists have gained much knowledge about how people understand simple conditionals, but there are virtually no published psychological studies of iterated conditionals. This paper presents experimental evidence from a study comparing the iterated form, If p, then if q, r, with the “imported,” noniterated form, If p and q, then r, using a probability evaluation task and a truth-table task, and taking into account qualitative individual differences. This allows us to critically contrast philosophical and psychological approaches that make diverging predictions regarding the interpretation of these forms. The results strongly support the probabilistic Adams conditional and the “new paradigm” that takes this conditional as a starting point.
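To make the contrasting predictions concrete (a minimal sketch on our part, assuming the Adams Equation and the import-export principle rather than quoting the paper): the Adams conditional identifies the probability of a conditional with the corresponding conditional probability, so for the imported form

\[ P(\text{If } p \text{ and } q, \text{ then } r) = P(r \mid p \wedge q), \]

and import-export treats the iterated form as equivalent to the imported one, giving

\[ P(\text{If } p, \text{ then if } q, r) = P(r \mid p \wedge q) \]

as well. On this account, probability evaluations of the two forms should coincide, whereas accounts that reject import-export can predict divergence.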
Our target article identified normativism as the view that rationality should be evaluated against unconditional normative standards. We believe this to be entrenched in the psychological study of reasoning and decision making and argued that it is damaging to this empirical area of study, calling instead for a descriptivist psychology of reasoning and decision making. The views of 29 commentators (from philosophy and cognitive science as well as psychology) were mixed, including some staunch defences of normativism, but also a number that were broadly supportive of our position, although critical of various details. In particular, many defended a position that we call soft normativism, which sees a role for normative evaluation within boundaries alongside more descriptive research goals. In this response, we clarify our use of the term and add discussion of defining both as descriptive and non-normative concepts. We consider the debate with reference to dual-process theory, the psychology of reasoning, and empirical research strategy in these fields. We also discuss cognitive variation by age, intelligence, and culture, and the issue of relative versus absolute definitions of norms. In conclusion, we hope at least to have raised consciousness about the important boundaries between norm and description in the psychology of thinking.
In recent years, the psychology of reasoning has been undergoing a paradigm shift, with general Bayesian, probabilistic approaches replacing the older, much more restricted binary logic paradigm. At the same time, dual processing theories have been gaining influence. We argue that these developments should be integrated and moreover that such integration is already underway. The new reasoning paradigm should be grounded in dual processing for its algorithmic level of analysis just as it uses Bayesian theory for its computational level of analysis. Moreover, we propose that, within the new paradigm, these levels of analysis reflect on each other. Bayesianism suggests a specific theoretical understanding of dual processing. Just as importantly, the duality in processing carries over to duality in function; although both types of processes compute degrees of belief, they generate different functions.
Two experiments demonstrate the existence of a “collapse illusion”, in which reasoners evaluate Truthteller-type propositions (“I am telling the truth”) as if they were simply true, whereas Liar-type propositions (“I am lying”) tend to be evaluated as neither true nor false. The second experiment also demonstrates an individual differences pattern, in which shallow reasoners are more susceptible to the illusion. The collapse illusion is congruent with philosophical semantic truth theories such as Kripke's (1975), and with hypothetical thinking theory's principle of satisficing, but can only be partially accounted for by the model theory principle of truth. Pragmatic effects related to sentence cohesion further reinforce the hypothetical thinking theory interpretation of the data, although the illusion and cohesion data could also be accounted for within a modified mental model theory.
This theoretical note proposes a two-dimensional cognitive architecture for dual-process theories of reasoning and decision making. Evans (2007b, 2008a, 2009) distinguishes between two types of dual-processing models: parallel-competitive, in which both types of processes operate in parallel, and default-interventionist, in which heuristic processes precede the analytic processes. I suggest that this temporal dimension should be enhanced with a functional distinction between interactionist architecture, in which either type of process influences the content and valence of the other, and independent architecture, in which they do not. Override architecture is a special case of the latter, which postulates statistical interaction, but no interaction of valence and content. I show that this added dimensional distinction casts doubt on two assumptions of statistical modelling that Evans makes: independence and linearity. However, Evans's (2007b) point, that statistical modelling is underspecified vis-à-vis the verbal theory, is given further support. The functional dimension is crucial to interpreting the statistical model, as well as to theoretical understanding of the cognitive architecture and its educational applications.
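As a generic illustration of the statistical terms at issue (our sketch, not the specific model in the note or in Evans, 2007b): if H and A stand for the contributions of heuristic and analytic processing to a response R, an additive model embodies the independence and linearity assumptions,

\[ R = \beta_0 + \beta_1 H + \beta_2 A, \]

whereas adding an interaction term,

\[ R = \beta_0 + \beta_1 H + \beta_2 A + \beta_3 (H \times A), \]

allows for statistical interaction; crucially, such a term can hold even when neither type of process influences the content or valence of the other.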
We agree that current evolutionary accounts of base-rate neglect are unparsimonious, but we dispute the authors' account of the effect in terms of parallel associative and rule-based processes. We also question their assumption that cueing of nested set relations facilitates performance due to recruitment of explicit reasoning processes. In our account, such reasoning is always involved, but usually unsuccessful.
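To illustrate the effect under discussion (a standard textbook example, not data from the commentary): suppose a condition has a base rate of 1%, the test's hit rate is 80%, and its false-positive rate is 9.6%. Bayes' theorem gives

\[ P(H \mid E) = \frac{0.8 \times 0.01}{0.8 \times 0.01 + 0.096 \times 0.99} \approx 0.078, \]

yet participants who neglect the base rate typically answer close to the hit rate of 0.8.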