The two main psychological theories of the ordinary conditional were designed to account for inferences made from assumptions, but few premises in everyday life can be simply assumed true. Useful premises usually have a probability that is less than certainty. But what is the probability of the ordinary conditional and how is it determined? We argue that people use a two-stage Ramsey test, which we specify, to make probability judgements about indicative conditionals in natural language, and we describe experiments that support this conclusion. Our account can explain why most people give the conditional probability as the probability of the conditional, but also why some give the conjunctive probability. We discuss how our psychological work is related to the analysis of ordinary indicative conditionals in philosophical logic.
David E. Over is a leading cognitive scientist and, with his firm grounding in philosophical logic, he also exerts a powerful influence on the psychology of reasoning. He is responsible not only for a large body of empirical work and accompanying theory, but also for advancing a major shift in thinking about reasoning, commonly known as the 'new paradigm' in the psychology of human reasoning. Over's signature mix of philosophical logic and experimental psychology has inspired generations of researchers, psychologists, and philosophers alike over more than a quarter of a century. The chapters in this volume, written by a leading group of contributors including a number who helped shape the psychology of reasoning as we know it today, each take their starting point from the key themes of Over's ground-breaking work. The essays in this collection explore a wide range of central topics, such as rationality, bias, dual processes, and dual systems, as well as contemporary psychological and philosophical theories of conditionals. It concludes with an engaging new chapter, authored by David E. Over himself, which details and analyses the new paradigm psychology of reasoning. This book is therefore important reading for scholars, researchers, and advanced students in psychology, philosophy, and the cognitive sciences, including those who are not already familiar with Over's thought.
Oaksford & Chater (O&C) begin in the halfway Bayesian house of assuming that minor premises in conditional inferences are certain. We demonstrate that this assumption is a serious limitation. They additionally suggest that appealing to Jeffrey's rule could make their approach more general. We present evidence that this rule is not limited enough to account for actual probability judgements.
Barbey & Sloman (B&S) relegate the logical rule of the excluded middle to a footnote. But this logical rule is necessary for natural sampling. Making the rule explicit in a logical tree can make a problem easier to solve. Examples are given of uses of the rule that are non-constructive and not reducible to a domain-specific module.
Two experiments using a realistic version of the selection task examined the relationship between participants' probability estimates of finding a counter-example and their selections. Experiment 1 used everyday categories in the context of a scenario to determine whether or not the number of instances in a category affected the estimated probability of a counter-example. Experiment 2 modified the scenario in order to alter participants' estimates of finding a specific counter-example. Unlike Kirby (1994a), but consistent with his proposals, both studies showed that probability estimates significantly predicted selection. Overall, the results point to the value of understanding selections in terms of their subjective expected utility.
The new paradigm in the psychology of reasoning adopts a Bayesian, or probabilistic, model for studying human reasoning. Contrary to the traditional binary approach based on truth-functional logic, with its binary values of truth and falsity, a third value that represents uncertainty can be introduced in the new paradigm. A variety of three-valued truth table systems are available in the formal literature, including one proposed by de Finetti. We examine the descriptive adequacy of these systems for natural language indicative conditionals and bets on conditionals. Within our framework the so-called “defective” truth table, in which participants choose a third value when the antecedent of the indicative conditional is false, becomes a coherent response. We show that only de Finetti’s system has a good descriptive fit when uncertainty is the third value.
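The de Finetti truth table discussed in this abstract can be sketched as follows. This is an illustrative sketch under the standard de Finetti conditions (the conditional "if A then C" takes C's value when A is true, and is void when A is false); it is not code from the paper itself.

```python
# De Finetti's three-valued truth table for the indicative conditional
# "if A then C". Values: True, False, or None (the "void"/uncertain case).

def definetti_conditional(a, c):
    """Return C's value when the antecedent A is true; None (void) otherwise."""
    if a is True:
        return c          # true antecedent: the conditional shares C's value
    return None           # false antecedent: the conditional is void

# The "defective" table pattern: rows with a false antecedent come out void.
table = {(a, c): definetti_conditional(a, c)
         for a in (True, False) for c in (True, False)}
```

On this sketch, choosing the third value when A is false is not an error but the coherent "void" response the abstract describes.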
A study is reported testing two hypotheses about a close parallel relation between indicative conditionals, if A then B, and conditional bets, I bet you that if A then B. The first is that both the indicative conditional and the conditional bet are related to the conditional probability, P(B|A). The second is that de Finetti's three-valued truth table has psychological reality for both types of conditional: true, false, or void for indicative conditionals and win, lose, or void for conditional bets. The participants were presented with an array of chips in two different colours and two different shapes, and an indicative conditional or a conditional bet about a random chip. They had to make judgements in two conditions: either about the chances of making the indicative conditional true or false or about the chances of winning or losing the conditional bet. The observed distributions of responses in the two conditions were generally related to the conditional probability, supporting the first hypothesis. In addition, a majority of participants in further conditions chose the third option, “void”, when the antecedent of the conditional was false, supporting the second hypothesis.
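The first hypothesis can be illustrated by computing P(B|A) over a chip array by counting. The particular chips and numbers below are hypothetical, not the materials from the study:

```python
from fractions import Fraction

# Hypothetical array of (shape, colour) chips; not the paper's actual stimuli.
chips = ([("square", "red")] * 4 + [("square", "blue")] * 2 +
         [("circle", "red")] * 3 + [("circle", "blue")] * 1)

def conditional_probability(chips, antecedent, consequent):
    """P(consequent | antecedent), computed by counting, as an exact Fraction."""
    a_cases = [c for c in chips if antecedent(c)]
    ac_cases = [c for c in a_cases if consequent(c)]
    return Fraction(len(ac_cases), len(a_cases))

# The probability of "if the chip is square then it is red" judged as
# P(red | square) = 4/6 = 2/3 on this array.
p = conditional_probability(chips,
                            lambda c: c[0] == "square",
                            lambda c: c[1] == "red")
```

On the conditional-probability hypothesis, a participant's judgement about the random chip tracks this ratio rather than the proportion of all chips that make the material conditional true.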
In his commentary, Oaksford makes two main claims: (1) that the externalisation method used by Green, Over, and Pyne (1997) enforces the correlation observed between probability estimates and selection, and (2) that these estimates support the prediction of a downward revision of P(p) when P(p) > P(q). In this reply, we rebut claim 1 by describing the instructions more comprehensively, and claim 2 by reiterating the importance of making certain theoretical distinctions which Oaksford does not make. Our interest is the psychological process of reaching a decision: externalisation methods provide a means of exploring this process and of assessing the value of Bayesian approaches.
We investigate how the perceived uncertainty of a conditional affects a person's choice of conclusion. We use a novel procedure to introduce uncertainty by manipulating the conditional probability of the consequent given the antecedent. In Experiment 1, we show first that subjects reduce their choice of valid conclusions when a conditional is followed by an additional premise that makes the major premise uncertain. In this we replicate Byrne. These subjects choose, instead, a qualified conclusion expressing uncertainty. If subjects are given a third statement that qualifies the likelihood of the additional premise, then the uncertainty of the conclusions they choose is systematically related to the suggested uncertainty. Experiment 2 confirms these observations in problems that omit the additional premise and qualify the first premise directly. Experiment 3 shows that the qualifying statement also affects the perceived probability of the consequent given the antecedent of the conditional. Experiment 4 investigates the effect of suggested uncertainty on the fallacies and shows that increases in uncertainty reduce the number of certain conclusions that are chosen while affirming the consequent but have no effect on denying the antecedent. We discuss our results in terms of rule theories and mental models and conclude that the latter give the most natural account of our results.
Four experiments investigated uncertainty about a premise in a deductive argument as a function of the expertise of the speaker and of the conversational context. The procedure mimicked everyday reasoning in that participants were not told that the premises were to be treated as certain. The results showed that the perceived likelihood of a conclusion was greater when the major or the minor premise was uttered by an expert rather than a novice (Experiment 1). The results also showed that uncertainty about the conclusion was higher when the major premise was uttered by a novice and an alternative premise by an expert, compared to when the major premise was uttered by an expert and the alternative by a novice (Experiment 2). Similarly, the believability of a conclusion was considerably lower when the minor premise was uttered by a novice and denied by an expert, as opposed to when an expert uttered the minor premise and a novice denied it (Experiment 3). Experiment 4 showed that the nature of the uncertainty induced by a denial of the minor premise depended on whether or not the context was a conversation. These results pose difficult problems for current theories of reasoning, as current theories are based on the results of experiments in which the premises are treated as certain. Our discussion of the results emphasises the importance of pragmatics in reasoning, namely, the role of general knowledge about the world in assessing the probability of a premise uttered by an expert or a novice and the role of interpretations of the premise based on pragmatic inferences in revising these initial probabilities.
We analyze selected iterated conditionals in the framework of conditional random quantities. We point out that it is instructive to examine Lewis's triviality result, which shows the conditions a conditional must satisfy for its probability to be the conditional probability. In our approach, however, we avoid triviality because the import-export principle is invalid. We then analyze an example of reasoning under partial knowledge where, given a conditional if A then C as information, the probability of A should intuitively increase. We explain this intuition by making some implicit background information explicit. We consider several iterated conditionals, which allow us to formalize different kinds of latent information. We verify that for these iterated conditionals the prevision is greater than or equal to the probability of A. We also investigate the lower and upper bounds of the Affirmation of the Consequent inference. We conclude our study with some remarks on the supposed "independence" of two conditionals, and we interpret this property as uncorrelation between two random quantities.
Introduction: Foundations of faith described -- Christian history : a brief overview -- The Apostolic Age (ca. A.D. 30-100) -- The Patristic Age (ca. A.D. 100-500) -- The Medieval Age (ca. A.D. 500-1500) -- The Reformation/counter-Reformation Age -- The Modern Age (ca. A.D. 1600-1950) -- The Postmodern Age (ca. A.D. 1950-present) -- Mormon and evangelical theology : a comparison -- Scripture and revelation -- God and humanity -- Church and temple -- Salvation and the afterlife -- Moral and social standards -- Mormonism and Christianity -- Sociological foundations of faith -- Question 9: Who was or is the greatest influence on your religious beliefs? -- Question 10: What were the religious beliefs of your family when you were growing up? -- Question 13: How much do you associate with people that hold to other religious beliefs? -- Question 15: What would be the social consequences for you if you converted to another religion? -- Sociological foundations of faith : conclusion -- Spiritual foundations of faith -- Question 4: To what extent has spiritual or religious -- Question 6: Mormons and Evangelicals claim to have the witness of the Holy Ghost/Holy Spirit in their hearts confirming the truth of their faith : how do you know that the assurance you have in your heart is from God? -- Question 8: What do you think of the religious experiences of people outside your religion, especially those which seem to confirm their religious beliefs to them? -- Rational foundations of faith -- Question 16: To what extent do you have faith because you think that Mormonism/Evangelicalism is reasonable? -- Question 18: What do you consider to be the best proof or evidence for Mormonism/Evangelicalism? -- Question 19: Would you believe that Mormonism/Evangelicalism is true even if most of the evidence were against it? -- Question 20: How has your assurance changed over time? -- Question 21: What has caused your faith to become stronger or weaker over time? -- Question 22: To what extent do you ever doubt that Mormonism/Evangelicalism is true? -- Question 23: If you sometimes doubt that your beliefs are true, what causes you to doubt? -- Question 24: How do you respond to or deal -- Conversion stories -- Conclusion: Foundations of faith prescribed.
The current literature on indeterminacy centers around two projects. One concerns the logic of indeterminacy; the other concerns its nature or source. The aim of this paper is to introduce, motivate and go some way toward addressing a new, third project: that of providing what I call a minimal characterization of indeterminacy. An MC, to a first approximation, is a relatively pre-theoretical characterization of indeterminacy that is neutral between the various substantive theories of the nature and logic of indeterminacy. An MC thus captures a generic sense of indeterminacy that, at least in principle, is recognized by all parties to the debate over the phenomenon’s underlying nature and logic. I begin by introducing the concept of an MC and outlining some of the main theoretical virtues of providing an MC. I then establish some desiderata on a suitable MC, and use these desiderata to rule out various initially attractive proposals. In the final part of the paper I sketch the beginnings of my own MC and defend it against objections.
This 1993 book discusses the rise of the marginalist conception of the firm in the context of economic thought over the past two centuries, and explains why economists continue to defend a theory with demonstrable shortcomings. Professor Schrader argues that the marginalist view of the firm retains its support not through any comparative advantage in empirical or predictive power, but by virtue of its being a part of the predominant marginalist economic programme. The clear problems that beset the marginalist approach to the firm signal a general dilemma for economic theory as a whole.
The ninth-century treatises Musica and Scolica Enchiriadis are the first musical writings in the West to present a theory of organum, a mode of plainchant performance that is the earliest known form of Western medieval polyphony. The fundamental principle of this theory is that the intervallic relationship between the simultaneous melodic lines be one of "consonance". Nevertheless, intervals arise between the voice-parts that are not symphoniae; the theory responds to this, not by explicitly invoking the concept of dissonance, but by excluding these non-symphonic or non-consonant intervals from the concept of "organum." Through close analysis of the relevant texts, the dissertation seeks to explicate this concept of organum as symphonia in terms of a "metaphysics of consonance" that the Enchiriadis theorists took over from the most influential of the musical writings of late Antiquity, the De institutione musica of Boethius. The metaphysics of consonance projected therein is a particular development of the Pythagorean-Platonic conception of music, in which the symphoniae, because of their simple numerical ratios, represent the universal, divinely ordained rational principles of cosmic order and harmony. After a section devoted to showing that such "speculative" ideas about music formed the basis of the "practical" theory of the Enchiriadis treatises, the dissertation goes on in Part II to find that the relationship between consonance and dissonance in the Boethius Musica is structured as an unequal, hierarchized opposition, in which consonance holds the dominant position in particular because of its virtual identification with the metaphysical ideal of unity. Conversely, dissonance, the opposed term of the hierarchy, represents the negative values of duality and difference, and hence discord, conflict, and disorder. Consequently, through various textual strategies, consonance is identified with "music" itself, while dissonance is effectively excluded from that domain. The concept of organum in the Enchiriadis treatises is found to be determined by this same metaphysics of consonance, with its consequent tendency to exclude or marginalize dissonance, a conceptual pattern that has resonated throughout the subsequent history of polyphonic theory and practice in the West.
This second edition of Historical Dictionary of Schopenhauer's Philosophy contains a chronology, an introduction, an appendix, and an extensive bibliography. The dictionary section has over 300 cross-referenced entries on all of Schopenhauer's books and on his significant philosophical ideas and concepts.
Civil wars vary greatly in their duration. This book argues that conflicts are longer when they involve more actors who can block agreement and identifies specific problems that arise in multi-party bargaining. Quantitative analysis of over 200 civil wars since World War II reveals that conflicts with more of these actors last much longer than those with fewer. Detailed comparison of negotiations in Rwanda and Burundi demonstrates that multi-party negotiations present additional barriers to peace not found in two-party conflicts. In addition, conflicts with more veto players produce more casualties, are more likely to involve genocide, and are followed by shorter periods of peace. Because they present many barriers to peace, the international community has a poor track record of resolving multi-party conflicts. David Cunningham shows that resolution is possible in these wars if peace processes are designed to address the barriers that emerge in multi-party conflicts.
Studies of categorical induction typically examine how belief in a premise (e.g., Falcons have an ulnar artery) projects on to a conclusion (e.g., Robins have an ulnar artery). We study induction in cases in which the premise is uncertain (e.g., There is an 80% chance that falcons have an ulnar artery). Jeffrey's rule is a normative model for updating beliefs in the face of uncertain evidence. In three studies we tested the descriptive validity of Jeffrey's rule and a related probability theorem, the rule of total probability. Although these rules provided good approximations to mean judgments in some cases, the results from regression and correlation analyses suggest that participants focus on the parts of these rules that are associated with the highest overall probability. We relate our findings to rational models of judgment.
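Jeffrey's rule, the normative model tested here, sets the new probability of a conclusion C after the probability of the evidence E shifts: P_new(C) = P(C|E)·P_new(E) + P(C|¬E)·P_new(¬E). A minimal sketch, with entirely hypothetical numbers rather than values from the studies:

```python
def jeffrey_update(p_c_given_e, p_c_given_not_e, p_new_e):
    """Jeffrey conditionalization: new probability of conclusion C after
    the probability of the evidence E has shifted to p_new_e."""
    return p_c_given_e * p_new_e + p_c_given_not_e * (1.0 - p_new_e)

# Hypothetical values for illustration only:
# P(robins have the property | falcons do) = 0.9,
# P(robins have the property | falcons don't) = 0.3,
# and the premise is judged 80% probable.
p_new = jeffrey_update(0.9, 0.3, 0.8)   # 0.9*0.8 + 0.3*0.2 = 0.78
```

The abstract's finding is that participants' judgments often track only the high-probability term (here 0.9 × 0.8) rather than the full weighted sum.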
The short answer, which will no doubt frustrate those who read to find the short answer, is yes and no. Yes in respect of the fact that all agents are not the same and so what is good for one agent may be different from what is good for another agent. No in respect of the fact that normativity, or standards which range over agents relevantly similar, is still quite present. The point of this paper will be to unpack this position.
Dienes' & Perner's proposals are discussed in relation to the distinction between explicit and implicit systems of thinking. Evans and Over (1996) propose that explicit processing resources are required for hypothetical thinking, in which mental models of possible world states are constructed. Such thinking requires representations in which the individuals' propositional attitudes including relevant beliefs and goals are made fully explicit.
Twenty-five years ago Paul Wilpert called for a thorough re-examination of our knowledge of the content of Aristotle's lost work De Philosophia. Expressing his reservations about the validity of our current reconstruction of the work, he wrote: 'On the basis of attested fragments, we form for ourselves a picture of the content of a lost writing, and this picture in turn serves to interpret new fragments as echoes of that writing. So our joy over the swift growth of our collection of fragments is clouded by the thought that we are not thereby really nearing the original character of the work, but we are entangling ourselves ever more tightly in a picture we ourselves have created.' As a corrective Wilpert called for a critical retracing of our steps since 1830 to establish a more secure reconstruction of this important lost work. Since then there have been numerous, searching analyses of the ideas and fragments of De Philosophia, but at least one venerable old theory has escaped critical reappraisal: namely, the theory that in De Philosophia Aristotle discussed his doctrine of a fifth element, i.e. his belief that the heavenly bodies are composed of an element distinct from the four earthly elements, earth, water, air, and fire. This theory has become so widely accepted that it has virtually become a fact. When support is needed, most modern authors simply cite one or both of the two modern authorities on the early Aristotle, namely W. Jaeger and E. Bignone. The more meticulous restate the traditional evidence with complete confidence that this evidence proves their case. If Wilpert's hope for a firmly grounded reconstruction of the De Philosophia is ever to be achieved, one of the important desiderata today is a critical re-examination of the evidence for the fifth element in this work.
Causal induction in the real world often has to be quick and efficient as well as accurate. We propose that people use two different frames to achieve these goals. The A-frame consists of heuristic processes that presuppose rarity and can detect causally relevant factors quickly. The B-frame consists of analytic processes that can be highly accurate in detecting actual causes. Our dual frame theory implies that several factors affect whether people use the A-frame or the B-frame in causal induction: among these are symmetrical negation, intervention and commitment. This theory is tested and sustained in two experiments. The results also provide broad support for dual process accounts of human thinking in general.
Psychological research on people’s understanding of natural language connectives has traditionally used truth table tasks, in which participants evaluate the truth or falsity of a compound sentence given the truth or falsity of its components in the framework of propositional logic. One perplexing result concerned the indicative conditional if A then C, which was often evaluated as true when A and C are true, false when A is true and C is false, but “irrelevant” (devoid of value) when A is false (whatever the value of C). This was called the “psychological defective table of the conditional.” Here we show that, far from being anomalous, the “defective” table pattern reveals a coherent semantics for the basic connectives of natural language in a trivalent framework. This was done by establishing participants’ truth tables for negation, conjunction, disjunction, conditional, and biconditional, when they were presented with statements that could be certainly true, certainly false, or neither. We review systems of three-valued tables from logic, linguistics, foundations of quantum mechanics, philosophical logic, and artificial intelligence, to see whether one of these systems adequately describes people’s interpretations of natural language connectives. We find that de Finetti’s (1936/1995) three-valued system is the best approximation to participants’ truth tables.
Teases out from assumptions underlying Polybius's constitutional theory an otherwise unknown subjectivist, agent-relative utilitarian theory of well-being. In contrast to other ancient theories, other-concern is assumed to be rooted in nonrational human nature and without moral value. Moral concepts arise within a social community from rational reflection on personal experience and lead to socially constructed moral values and political institutions that promote cooperative over competitive behaviors. The assumptions meet Arcesilaus's skeptical objections to dogmatic ethics. Polybius, some of whose political associates studied under Arcesilaus, may have derived his theory from current antiskeptical justifications of normative ethics and politics.
M. Oaksford and N. Chater presented a Bayesian analysis of the Wason selection task in which they proposed that people choose cards in order to maximize expected information gain as measured by reduction in uncertainty in the Shannon-Weaver information theory sense. It is argued that the EIG measure is both psychologically implausible and normatively inadequate as a measure of epistemic utility. The article is also concerned with the descriptive account of findings in the selection task literature offered by Oaksford and Chater. First, it is shown that their analysis of the data reported in the recent article of K. N. Kirby is unsound; second, an EIG analysis is presented of the experiments of P. Pollard and J. St. B. T. Evans that provides a strong empirical disconfirmation of the theory.
Carruthers’ proposals would seem to implicate language in what is known as System 2 thinking (explicit) rather than System 1 thinking (implicit) in contemporary dual process theories of thinking and reasoning. We provide an outline description of these theories and show that while Carruthers’ characterization of non-verbal processes as domain-specific identifies one critical feature of System 1 thinking, he appears to overlook the fact that much cognition of this type results from domain-general learning processes. We also review cognitive psychological evidence that shows that language and the explicit representations it supports are heavily involved in supporting System 1 thinking, but falls short of supporting his claim that it is the medium in which domain-general thinking occurs.