In this book the author gives a broad overview of different areas of research in nonmonotonic reasoning, and presents some new results and ideas based on his research. The guiding principles are: clarification of the different research activities in the area, which have sometimes been undertaken independently of each other; and appreciation of the fact that these research activities often represent different means to the same ends, namely sound theoretical foundations and efficient computation. The book begins with a discussion of the various types of nonmonotonic reasoning, their applications and their logics. Theorem proving techniques for these logics are also described. The following chapters deal with formulations of nonmonotonic inheritance, and nonmonotonic reasoning based on nonmonotonic rules. The final chapter discusses the achievements in the field in the light of the Yale shooting example. The book will be welcomed by researchers in theoretical computer science and artificial intelligence.
This book by one of the world's foremost philosophers in the fields of epistemology and logic offers an account of suppositional reasoning relevant to practical deliberation, explanation, prediction and hypothesis testing. Suppositions made 'for the sake of argument' sometimes conflict with our beliefs, and when they do, some beliefs are rejected and others retained. Thanks to such belief contravention, adding content to a supposition can undermine conclusions reached without it. Subversion can also arise because suppositional reasoning is ampliative. These two types of nonmonotonic logic are the focus of this book. A detailed comparison of nonmonotonicity appropriate to both belief contravening and ampliative suppositional reasoning reveals important differences that have been overlooked.
Change, Choice and Inference develops logical theories that are necessary both for the understanding of adaptable human reasoning and for the design of intelligent systems. The book shows that reasoning processes - the drawing of inferences and changing one's beliefs - can be viewed as belonging to the realm of practical reason by embedding logical theories into the broader context of the theory of rational choice. The book unifies lively and significant strands of research in logic, philosophy, economics and artificial intelligence. It elaborates on the relevant theories and provides a mathematically precise foundation for the thesis that large parts of theoretical reason can be subsumed under practical reason.
This monograph provides a new account of justified inference as a cognitive process. In contrast to the prevailing tradition in epistemology, the focus is on low-level inferences, i.e., those inferences that we are usually not consciously aware of and that we share with the cat nearby which infers that the bird she sees picking grains from the dirt is able to fly. Presumably, such inferences are not generated by explicit logical reasoning, but logical methods can be used to describe and analyze such inferences. Part 1 gives a purely system-theoretic explication of belief and inference. Part 2 adds a reliabilist theory of justification for inference, with a qualitative notion of reliability being employed. Part 3 recalls and extends various systems of deductive and nonmonotonic logic and thereby explains the semantics of absolute and high reliability. In Part 4 it is proven that qualitative neural networks are able to draw justified deductive and nonmonotonic inferences on the basis of distributed representations. This is derived from a soundness/completeness theorem with regard to cognitive semantics of nonmonotonic reasoning. The appendix extends the theory both logically and ontologically, and relates it to A. Goldman's reliability account of justified belief. This text will be of interest to epistemologists and logicians, to all computer scientists who work on nonmonotonic reasoning and neural networks, and to cognitive scientists.
Conclusions reached using common sense reasoning from a set of premises are often subsequently revised when additional premises are added. Because we do not always accept previous conclusions in light of subsequent information, common sense reasoning is said to be nonmonotonic. But in the standard formal systems usually studied by logicians, if a conclusion follows from a set of premises, that same conclusion still follows no matter how the premise set is augmented; that is, the consequence relations of standard logics are monotonic. Much recent research in AI has been devoted to the attempt to develop nonmonotonic logics. After some motivational material, we give four formal proofs that there can be no nonmonotonic consequence relation that is characterized by universal constraints on rational belief structures. In other words, a nonmonotonic consequence relation that corresponds to universal principles of rational belief is impossible. We show that the nonmonotonicity of common sense reasoning is a function of the way we use logic, not a function of the logic we use. We give several examples of how nonmonotonic reasoning systems may be based on monotonic logics.
This paper has two goals. First, we develop frameworks for logical systems which are able to reflect not only nonmonotonic patterns of reasoning, but also paraconsistent reasoning. Our second goal is to have a better understanding of the conditions that a useful relation for nonmonotonic reasoning should satisfy. For this we consider a sequence of generalizations of the pioneering works of Gabbay, Kraus, Lehmann, Magidor and Makinson. These generalizations allow the use of monotonic nonclassical logics as the underlying logic upon which nonmonotonic reasoning may be based. Our sequence of frameworks culminates in what we call (following Lehmann) plausible, nonmonotonic, multiple-conclusion consequence relations (which are based on a given monotonic one). Our study yields intuitive justifications for conditions that have been proposed in previous frameworks and also clarifies the connections among some of these systems. In addition, we present a general method for constructing plausible nonmonotonic relations. This method is based on a multiple-valued semantics, and on Shoham's idea of preferential models.
Three studies of human nonmonotonic reasoning are described. The results show that people find such reasoning quite difficult, although being given problems with known subclass-superclass relationships is helpful. The results also show that recognizing differences in the logical strengths of arguments is important for the nonmonotonic problems studied. For some of these problems, specificity – which is traditionally considered paramount in drawing appropriate conclusions – was irrelevant and so should have led to a “can’t tell” response; however, people could give rational conclusions based on differences in the logical consequences of arguments. The same strategy also works for problems where specificity is relevant, suggesting that in fact specificity is not paramount. Finally, results showed that subjects’ success at responding appropriately to nonmonotonic problems involving conflict relies heavily on the ability to appreciate differences in the logical strength of simple, non-conflicting, statements.
Nonmonotonic reasoning is often claimed to mimic human common sense reasoning. Only a few studies, though, investigated this claim empirically. In the present paper four psychological experiments are reported that investigate three rules of system p, namely the and, the left logical equivalence, and the or rule. The actual inferences of the subjects are compared with the coherent normative upper and lower probability bounds derived from a non-infinitesimal probability semantics of system p. We found a relatively good agreement of human reasoning and principles of nonmonotonic reasoning according to the coherence interpretation of system p. Contrary to the results reported in the “heuristics and biases” tradition, the subjects committed relatively few upper bound violations (conjunction fallacies). More lower than upper bound violations were observed. When the premises were presented in terms of intervals, higher mean lower bounds were observed than when the premises were presented in terms of point percentages.
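The upper and lower probability bounds mentioned in this abstract can be illustrated with a minimal sketch for the and rule of system p. The sketch assumes point-valued premises; the function name is illustrative, and the interval is the classical Fréchet bound that coherence-based semantics assigns to the conjunction:

```python
# Minimal sketch (illustrative names): coherent bounds for the AND rule of
# system p. Given the premises P(B|A) = x and P(C|A) = y, coherence only
# constrains the conclusion P(B and C | A) to the Frechet interval below.

def and_rule_bounds(x: float, y: float) -> tuple[float, float]:
    """Return (lower, upper) coherent bounds on P(B and C | A)."""
    lower = max(0.0, x + y - 1.0)  # B and C may overlap as little as possible
    upper = min(x, y)              # the conjunction is at most as probable
    return lower, upper

# With both premises at .9 the coherent interval is approximately [.8, .9];
# a response above the upper bound is a conjunction fallacy in the sense above.
print(and_rule_bounds(0.9, 0.9))
```

A lower-bound violation in the experiments' sense would be a response below `max(0, x + y - 1)`, an upper-bound violation (conjunction fallacy) a response above `min(x, y)`.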
Reasoning almost always occurs in the face of incomplete information. Such reasoning is nonmonotonic in the sense that conclusions drawn may later be withdrawn when additional information is obtained. There is an active literature on the problem of modeling such nonmonotonic reasoning, yet no category of method-let alone a single method-has been broadly accepted as the right approach. This paper introduces a new method, called sweeping presumptions, for modeling nonmonotonic reasoning. The main goal of the paper is to provide an example-driven, nontechnical introduction to the method of sweeping presumptions, and thereby to make it plausible that sweeping presumptions can usefully be applied to the problems of nonmonotonic reasoning. The paper discusses a representative sample of examples that have appeared in the literature on nonmonotonic reasoning, and discusses them from the point of view of sweeping presumptions.
Nonmonotonic conditionals (A |∼ B) are formalizations of common sense expressions of the form “if A, normally B”. The nonmonotonic conditional is interpreted by a “high” coherent conditional probability, P(B|A) > .5. Two important properties are closely related to the nonmonotonic conditional: First, A |∼ B allows for exceptions. Second, the rules of the nonmonotonic system p guiding A |∼ B allow for withdrawing conclusions in the light of new premises. This study reports a series of three experiments on reasoning with inference rules about nonmonotonic conditionals in the framework of coherence. We investigated the cut, and the right weakening rule of system p. As a critical condition, we investigated basic monotonic properties of classical (monotone) logic, namely monotonicity, transitivity, and contraposition. The results suggest that people reason nonmonotonically rather than monotonically. We propose nonmonotonic reasoning as a competence model of human reasoning.
We study the problem of embedding Halpern and Moses's modal logic of minimal knowledge states into two families of modal formalism for nonmonotonic reasoning, McDermott and Doyle's nonmonotonic modal logics and ground nonmonotonic modal logics. First, we prove that Halpern and Moses's logic can be embedded into all ground logics; moreover, the translation employed allows for establishing a lower bound (3p) for the problem of skeptical reasoning in all ground logics. Then, we show a translation of Halpern and Moses's logic into a significant subset of McDermott and Doyle's formalisms. Such a translation both indicates the ability of Halpern and Moses's logic of expressing minimal knowledge states in a more compact way than McDermott and Doyle's logics, and allows for a comparison of the epistemological properties of such nonmonotonic modal formalisms.
This paper gives a new, proof-theoretic explanation of partial-order reasoning about time in a nonmonotonic theory of action. The explanation relies on the technique of lifting ground proof systems to compute results using variables and unification. The ground theory uses argumentation in modal logic for sound and complete reasoning about specifications whose semantics follows Gelfond and Lifschitz’s language A. The proof theory of modal logic A represents inertia by rules that can be instantiated by sequences of time steps or events. Lifting such rules introduces string variables and associates each proof with a set of string equations; these equations are equivalent to a set of partial-order tree-constraints that can be solved efficiently. The defeasible occlusion of inertia likewise imposes partial-order constraints in the lifted system. By deriving an auxiliary partial order representation of action from the underlying logic, not the input formulas or proofs found, this paper strengthens the connection between practical planners and formal theories of action. Moreover, the general correctness of the theory of action justifies partial-order representations not only for forward reasoning from a completely specified start state, but also for explanatory reasoning and for reasoning by cases.
A. Tarski proposed the study of infinitary consequence operations as the central topic of mathematical logic. He considered monotonicity to be a property of all such operations. In this paper, we weaken the monotonicity requirement and consider more general operations, inference operations. These operations describe the nonmonotonic logics both humans and machines seem to be using when inferring defeasible information from incomplete knowledge. We single out a number of interesting families of inference operations. This study of infinitary inference operations is inspired by earlier results on nonmonotonic inference relations, and relies on some of the definitions found there.
After a review of situation theory and previous attempts at `computational' situation theory, we present a new programming environment, BABY-SIT, which is based on situation theory. We then demonstrate how problems requiring formal temporal reasoning can be solved in this framework. Specifically, the Yale Shooting Problem, which is commonly regarded as a canonical problem for nonmonotonic temporal reasoning, is implemented in BABY-SIT using Yoav Shoham's causal theories.
Normic laws have the form "if A, then normally B". This paper attempts to show that if a philosophical analysis of normic laws (§1, §4) is combined with certain developments in nonmonotonic logic (§2, §3), the following problems in philosophy of science can be seen in a new perspective which, at least in many cases, allows us to improve their received analysis: explanation and individual case understanding in the humanities (§1, §2), an evolution-theoretic foundation of normic laws which explains their omnipresence and establishes the connection between prototypical and statistical normality (§4), ceteris paribus laws (§5), differences between physical versus non-physical sciences (§6) and finally, theory-protection through auxiliary hypotheses (§7).
We present new probabilistic generalizations of Pearl’s entailment in System Z and Lehmann’s lexicographic entailment, called Zλ- and lexλ-entailment, which are parameterized through a value λ ∈ [0,1] that describes the strength of the inheritance of purely probabilistic knowledge. In the special cases of λ = 0 and λ = 1, the notions of Zλ- and lexλ-entailment coincide with probabilistic generalizations of Pearl’s entailment in System Z and Lehmann’s lexicographic entailment that have been recently introduced by the author. We show that the notions of Zλ- and lexλ-entailment have similar properties as their classical counterparts. In particular, they both satisfy the rationality postulates of System P and the property of Rational Monotonicity. Moreover, Zλ-entailment is weaker than lexλ-entailment, and both Zλ- and lexλ-entailment are proper generalizations of their classical counterparts.
The present chapter describes a probabilistic framework of human reasoning. It is based on probability logic. While there are several approaches to probability logic, we adopt the coherence-based approach.
Nonmonotonic reasoning is often claimed to mimic human common sense reasoning. Only a few studies, though, have investigated this claim empirically. We report four experiments which investigate three rules of SYSTEM P, namely the AND, the LEFT LOGICAL EQUIVALENCE, and the OR rule. The actual inferences of the subjects are compared with the coherent normative upper and lower probability bounds derived from a non-infinitesimal probability semantics of SYSTEM P. We found a relatively good agreement of human reasoning and principles of nonmonotonic reasoning. Contrary to the results reported in the ‘heuristics and biases’ tradition, the subjects committed relatively few upper bound violations (conjunction fallacies).
Nonmonotonic logics allow—contrary to classical (monotone) logics—for withdrawing conclusions in the light of new evidence. Nonmonotonic reasoning is often claimed to mimic human common sense reasoning. Only a few studies, though, have investigated this claim empirically. system p is a central, broadly accepted nonmonotonic reasoning system that proposes basic rationality postulates. We previously investigated empirically a probabilistic interpretation of three selected rules of system p. We found a relatively good agreement of human reasoning and principles of nonmonotonic reasoning according to the coherence interpretation of system p. This study reports an experiment on the cautious monotonicity rule and its “incautious” counterpart that is not contained in system p, namely the monotonicity rule. In accordance with our previous results, the data suggest that people reason nonmonotonically: the subjects in the cautious monotonicity condition infer significantly tighter intervals close to the coherence interpretation of system p compared with the subjects in the incautious monotonicity condition, where rather wide (and hence non-informative) intervals are inferred.
Handling exceptions in a knowledge-based system is an important issue in many application domains, such as the medical domain. Recently, there has been increasing interest in nonmonotonic extensions of description logics to handle exceptions in ontologies. In this paper, we propose three preferential semantics for plausible subsumption to deal with exceptions in description logic-based knowledge bases. Our preferential semantics are defined in the framework of possibility theory, which is an uncertainty theory devoted to handling incomplete information. We consider the properties of these semantics and their relationships. We also discuss the relationship between two of our preferential semantics and two existing preferential semantics. We extend a description logic-based knowledge base by adding preferential subsumptions. Entailment of plausible subsumptions relative to an extended knowledge base is defined. Properties of the preferential subsumption relations relative to an extended description logic-based knowledge base are discussed. Finally, we show that our semantics for plausible subsumption can be reduced to standard semantics of an expressive description logic. Thus, the problem of plausible subsumption checking under our semantics can be reduced to the problem of subsumption checking under the classical semantics.
Peter Gärdenfors has proved (Philosophical Review, 1986) that the Ramsey rule and the methodologically conservative Preservation principle are incompatible given innocuous-looking background assumptions about belief revision. Gärdenfors gives up the Ramsey rule; I argue for preserving the Ramsey rule and interpret Gärdenfors's theorem as showing that no rational belief-reviser can avoid reasoning nonmonotonically. I argue against the Preservation principle and show that counterexamples to it always involve nonmonotonic reasoning. I then construct a new formal model of belief revision that does accommodate nonmonotonic reasoning.
Nonmonotonic consequence is the subject of a vast literature, but the idea of a nonmonotonic counterpart of logical inconsistency—the idea of a defeasible property representing internal conflict of an inductive or evidential nature—has been entirely neglected. After considering and dismissing two possible analyses relating nonmonotonic consequence and a nonmonotonic counterpart of logical inconsistency, this paper offers a set of postulates for nonmonotonic inconsistency, an analysis of nonmonotonic inconsistency in terms of nonmonotonic consequence, and a series of results showing that nonmonotonic inconsistency conforms to these postulates given the analysis of nonmonotonic inconsistency presented here and certain postulates for nonmonotonic consequence. The results presented here establish the interest of certain previously undiscussed postulates of nonmonotonic consequence. These results also show that nonmonotonicity, which has never seemed useful in the formulation of general principles governing nonmonotonic reasoning, is relevant to the positive characterization of nonmonotonic inference after all.
Early attempts at combining multiple inheritance with nonmonotonic reasoning were based on straightforward extensions of tree-structured inheritance systems, and were theoretically unsound. In The Mathematics of Inheritance Systems, or TMOIS, Touretzky described two problems these systems cannot handle: reasoning in the presence of true but redundant assertions, and coping with ambiguity. TMOIS provided a definition and analysis of a theoretically sound multiple inheritance system.
Charles Morgan has argued that nonmonotonic logic is ``impossible''. We show here that those arguments are mistaken, and that Morgan's preferred alternative, the representation of nonmonotonic reasoning by ``presuppositions'', fails to provide a framework in which nonmonotonic reasoning can be constructively criticised. We argue that an inductive logic, based on probabilistic acceptance, offers more than Morgan's approach through presuppositions.
This paper compares two ways of formalising defeasible deontic reasoning, both based on the view that the issues of conflicting obligations and moral dilemmas should be dealt with from the perspective of nonmonotonic reasoning. The first way is developing a special nonmonotonic logic for deontic statements. This method turns out to have some limitations, for which reason another approach is recommended, viz. combining an already existing nonmonotonic logic with a deontic logic. As an example of this method the language of Reiter's default logic is extended to include modal expressions, after which the argumentation framework in default logic of [20, 22] is used to give a plausible logical analysis of moral dilemmas and prima facie obligations.
Default reasoning occurs whenever the truth of the evidence available to the reasoner does not guarantee the truth of the conclusion being drawn. Despite this, one is entitled to draw the conclusion “by default” on the grounds that we have no information which would make us doubt that the inference should be drawn. It is the type of conclusion we draw in the ordinary world and ordinary situations in which we find ourselves. Formally speaking, ‘nonmonotonic reasoning’ refers to argumentation in which one uses certain information to reach a conclusion, but where it is possible that adding some further information to those very same premises could make one want to retract the original conclusion. It is easily seen that the informal notion of default reasoning manifests a type of nonmonotonic reasoning. Generally speaking, default statements are said to be true about the class of objects they describe, despite the acknowledged existence of “exceptional instances” of the class. In the absence of explicit information that an object is one of the exceptions we are enjoined to apply the default statement to the object. But further information may later tell us that the object is in fact one of the exceptions. So this is one of the points where nonmonotonicity resides in default reasoning.
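The retraction pattern described here can be made concrete with a toy sketch of the classic "birds fly" default. The predicate names are illustrative and the sketch is not drawn from any particular default-logic formalism; it only shows how an added premise can block a conclusion drawn earlier:

```python
# Toy sketch (illustrative names): a default conclusion that is withdrawn
# when the knowledge base is extended with an explicit exception.

def concludes_flies(facts: set[str]) -> bool:
    """Apply the default 'birds normally fly' unless an exception is known."""
    if "bird" not in facts:
        return False
    # The default fires only in the absence of contrary information.
    return "penguin" not in facts

print(concludes_flies({"bird"}))             # True: drawn by default
print(concludes_flies({"bird", "penguin"}))  # False: the added premise
                                             # retracts the conclusion
```

Adding "penguin" to the premise set withdraws a conclusion the smaller set supported, which is exactly the failure of monotonicity the abstract describes.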
This article begins with an introduction to defeasible (nonmonotonic) reasoning and a brief description of a computer program, EVID, which can perform such reasoning. I then explain, and illustrate with examples, how this program can be applied in computational representations of ordinary dialogic argumentation. The program represents the beliefs and doubts of the dialoguers, and uses these propositional attitudes, which can include commonsense defeasible inference rules, to infer various changing conclusions as a dialogue progresses. It is proposed that computational representations of this kind are a useful tool in the analysis of dialogic argumentation, and, in particular, demonstrate the important role of defeasible reasoning in everyday arguments using commonsense reasoning.
cuted actions. It has been applied to several challenge problems in the theory of commonsense knowledge. We study the relationship between this formalism and other work on nonmonotonic reasoning and knowl-.
In the research on paraconsistency, preferential systems were used for constructing logics which are paraconsistent but stronger than substructural paraconsistent logics. The preferences in these systems were defined in different ways. Some were based on checking which abnormal formulas hold in models. We show that these natural preferential systems that were originally designed for paraconsistent reasoning fulfill a key condition (stopperedness or smoothness) from the theoretical research of nonmonotonic reasoning. Consequently, the nonmonotonic consequence relations that they in-.
Certain extensions of Nelson's constructive logic N with strong negation have recently become important in artificial intelligence and nonmonotonic reasoning, since they yield a logical foundation for answer set programming (ASP). In this paper we look at some extensions of Nelson's first-order logic as a basis for defining nonmonotonic inference relations that underlie the answer set programming semantics. The extensions we consider are those based on 2-element, here-and-there Kripke frames. In particular, we prove completeness for first-order here-and-there logics, and their minimal strong negation extensions, for both constant and varying domains. We choose the constant domain version, which we denote by QNc5, as a basis for defining a first-order nonmonotonic extension called equilibrium logic. We establish several metatheoretic properties of QNc5, including Skolem forms, Herbrand theorems, and interpolation, and show that the first-order version of equilibrium logic can be used as a foundation for answer set inference.
Claus Oetke, in his "Ancient Indian Logic as a Theory of Non-monotonic Reasoning," presents a sweeping new interpretation of the early history of Indian logic. His main proposal is that Indian logic up until Dharmakirti was nonmonotonic in character, similar to some of the newer logics that have been explored in the field of Artificial Intelligence, such as default logic, which abandon deductive validity as a requirement for formally acceptable arguments; Dharmakirti, he suggests, was the first to consider that a good argument should be one for which it is not possible for the property identified as the "reason" (hetu) to occur without the property to be proved (sadhya), a requirement akin to deductive validity. Oetke's approach is challenged here, arguing that from the very beginning in India something like monotonic, that is, deductively valid, reasoning was the ideal or norm, but that the conception of that ideal was continually refined, in that the criteria for determining when it is realized were progressively sharpened.
This is a reply to a paper by Graham Oppy in the July, 1999 issue of this journal, “Koons’ Cosmological Argument.” Recent work in defeasible or nonmonotonic logic means that the cosmological argument can be cast in such a way that it does not presuppose that every contingent situation, without exception, has a cause. Instead, the burden of proof is shifted to the skeptic, who must produce positive reasons for thinking that the cosmos is an exception to the defeasible law of causality. I show how Oppy’s critique can be turned into a plausible rebuttal of my argument. However, this rebuttal can be set aside when the original argument is supplemented by a plausible account of the nature of causal priority. Several independent lines of argument in support of this account are outlined.
We identify two pragmatic problems in temporal reasoning, the qualification problem and the extended prediction problem, the latter subsuming the infamous frame problem. Solutions to those seem to call for nonmonotonic inferences, and yet naive use of standard nonmonotonic logics turns out to be inappropriate. Looking for an alternative, we first propose a uniform approach to constructing and understanding nonmonotonic logics. This framework subsumes many existing nonmonotonic formalisms, and yet is remarkably simple, adding almost no extra baggage to traditional logic.
The problems we deal with concern reasoning about incomplete knowledge. Knowledge is understood as the ability of an ideal rational agent to make decisions about pieces of information. The formalisms we are particularly interested in are Moore's autoepistemic logic (AEL) and its variant, the logic of acceptance and rejection (AEL2). It is well-known that AEL may be seen as the nonmonotonic KD45 modal logic. The aim is to give an appropriate modal formalization for AEL2.
Regress arguments have convinced many that reasoning cannot require beliefs about what follows from what. In this paper I argue that this is a mistake. Regress arguments rest on dubious (although deeply entrenched) assumptions about the nature of reasoning—most prominently, the assumption that believing p by reasoning is simply a matter of having a belief in p with the right causal ancestry. I propose an alternative account, according to which beliefs about what follows from what play a constitutive role in reasoning.
Do other people's arguments tie you in knots? Do you lack confidence in your ability to reason? Do you assume that everything written in newspapers must be true? We all engage in the process of reasoning, but we don't always pay attention to whether we are doing it well. This book offers the opportunity to practice reasoning in a clear-headed and critical way, with the aims of developing an awareness of the importance of reasoning well, and of improving the reader's skill in analyzing and evaluating arguments. In this second edition of the highly successful Critical Reasoning, Anne Thomson has updated and revised the book to include new and topical examples which will guide students through the processes of critical reasoning in a clear and engaging way.
Following Burge, many anti-individualists suppose that a subject can possess a concept even if she incompletely understands it. While agreeing that this is possible, I argue that there is a limit on the extent to which a subject can incompletely understand the set of concepts she thinks with. This limit derives from our conception of our ability to reflectively evaluate our own thoughts or, as Burge puts it, our ability to engage in critical reasoning. The paper extends Burge's own work on critical reasoning. He argued that critical reasoning imposes a limit on the extent to which we can be mistaken about what thoughts we are having; in general, we can know non-empirically what we are thinking (Burge, "Our Entitlement to Self-Knowledge", Proceedings of the Aristotelian Society XCVI, 1996). He does not explicitly consider whether critical reasoning also imposes a limit on incomplete understanding of thoughts.
This interdisciplinary work is a collection of major essays on reasoning: deductive, inductive, abductive, belief revision, defeasible (non-monotonic), cross-cultural, conversational, and argumentative. They are each oriented toward contemporary empirical studies. The book focuses on foundational issues, including paradoxes, fallacies, and debates about the nature of rationality, the traditional modes of reasoning, as well as counterfactual and causal reasoning. It also includes chapters on the interface between reasoning and other forms of thought. In general, this last set of essays represents growth points in reasoning research, drawing connections to pragmatics, cross-cultural studies, emotion and evolution.
Robert Batterman examines a form of scientific reasoning called asymptotic reasoning, arguing that it has important consequences for our understanding of the scientific process as a whole. He maintains that asymptotic reasoning is essential for explaining what physicists call universal behavior. With clarity and rigor, he simplifies complex questions about universal behavior, demonstrating a profound understanding of the underlying structures that ground them. This book introduces a valuable new method that is certain to fill explanatory gaps across disciplines.
This book is an accessible introduction that will enable students, through practical exercises, to develop their own skills in reasoning about ethical issues, including analyzing and evaluating arguments used in discussions of ethical issues; analyzing and evaluating ethical concepts, such as utilitarianism; making decisions on ethical issues; and learning how to approach ethical issues in a fair-minded way. The issues discussed in the book include abortion, euthanasia, capital punishment, animal rights, the environment and war. The book will be essential reading for students studying all aspects of ethics.
In this important new collection, Gilbert Harman presents a selection of fifteen interconnected essays on fundamental issues at the center of analytic philosophy. The book opens with a group of four essays discussing basic principles of reasoning and rationality. The next three essays argue against the once popular idea that certain claims are true and knowable by virtue of meaning. In the third group of essays Harman presents his own view of meaning and the possibility of thinking in language. The final three essays investigate the nature of mind, developing further the themes already set out. Reasoning, Meaning, and Mind offers an integrated presentation of this rich and influential body of work, which Harman has developed over thirty years.
This chapter examines the extent to which there are continuities between the cognitive processes and epistemic practices engaged in by human hunter-gatherers, on the one hand, and those which are distinctive of science, on the other. It deploys anthropological evidence against any form of 'no-continuity' view, drawing especially on the cognitive skills involved in the art of tracking. It also argues against the 'child-as-scientist' accounts put forward by some developmental psychologists, which imply that scientific thinking is present in early infancy and universal amongst humans who have sufficient time and resources to devote to it. In contrast, a modularist kind of 'continuity' account is proposed, according to which the innately channelled architecture of human cognition provides all the materials necessary for basic forms of scientific reasoning in older children and adults, needing only the appropriate sorts of external support, social context, and background beliefs and skills in order for science to begin its advance.
The importance of situational constraint for moral evaluations is widely accepted in philosophy, psychology, and the law. However, recent work suggests that this relationship is actually bidirectional: moral evaluations can also influence our judgments of situational constraint. For example, if an agent is thought to have acted immorally rather than morally, that agent is often judged to have acted with greater freedom and under less situational constraint. Moreover, when considering interpersonal situations, we judge that an agent who forces another to act immorally (versus morally) uses more force. These two features can result in contradictory response patterns in which participants judge both that (1) a forcer forced a forcee to act and (2) the forcee was not forced by the forcer to act. Here, we characterize potential psychological mechanisms, in particular “moral focus” and counterfactual reasoning, that account for this paradoxical pattern of judgments.
This paper starts from an assumption defended in the author's previous work. This is that distinctively human flexible and creative theoretical thinking can be explained in terms of the interactions of a variety of modular systems, with the addition of just a few amodular components and dispositions. On the basis of that assumption it is argued that distinctively human practical reasoning, too, can be understood in modular terms. The upshot is that there is nothing in the human psyche that requires any significant retreat from a thesis of massively modular mental organization.
The view to be defended in this paper is intended to be a novel and compelling model of instrumental practical reasoning, reasoning aimed at determining how to act in order to achieve a given end in a certain set of circumstances. On standard views of instrumental reasoning, the end in question is the object of a particular desire that the agent has, a desire which, when combined with the agent’s beliefs about what means are available to him or her in order to satisfy that desire, can cause the formation of an independent desire or intention to engage in the relevant means. One of the main goals in what follows is to show that such views provide an inadequate understanding of instrumental practical reasoning when it comes to the practical lives of agents.
In this paper I am concerned with the question of whether degrees of belief can figure in reasoning processes that are executed by humans. It is generally accepted that outright beliefs and intentions can be part of reasoning processes, but the role of degrees of belief remains unclear. The literature on subjective Bayesianism, which seems to be the natural place to look for discussions of the role of degrees of belief in reasoning, does not address the question of whether degrees of belief play a role in real agents’ reasoning processes. On the other hand, the philosophical literature on reasoning, which relies much less heavily on idealizing assumptions about reasoners than Bayesianism, is almost exclusively concerned with outright belief. One possible explanation for why no philosopher has yet developed an account of reasoning with degrees of belief is that reasoning with degrees of belief is not possible for humans. In this paper, I will consider three arguments for this claim. I will show why these arguments are flawed, and conclude that, at least as far as these arguments are concerned, it seems like there is no good reason why the topic of reasoning with degrees of belief has received so little attention.
This paper addresses an argument offered by John Hawthorne against the propriety of an agent’s using propositions she does not know as premises in practical reasoning. I will argue that there are a number of potential structural confounds in Hawthorne’s use of his main example, a case of practical reasoning about a lottery. By drawing these confounds out more explicitly, we can get a better sense of how to make appropriate use of such examples in theorizing about norms, knowledge, and practical reasoning. I will conclude by suggesting a prescription for properly using lottery propositions to do the sort of work that Hawthorne wants from them.
Reasoning Practically deals with a classical philosophical topic, the link between thought and action--how we think about what we do or ought to do, and how we move from thinking to doing. The essays, by such renowned contributors as Donald Davidson, Barry Stroud, Cass R. Sunstein, Seyla Benhabib, and Gerald Dworkin, cover a range of issues raised when we link reason and practice. This collection connects state-of-the-art philosophical work with concrete issues in social life and political practice, making it of interest not only to philosophers, but to political theorists, legal scholars, and any researcher interested in the practical application of reason.
This book is about Relational and Contextual Reasoning (RCR), a new theory of the human mind that addresses key areas of human conflict, such as ideological conflict between nations, conflict in close relationships, and conflict between science and religion. K. Helmut Reich provides a clear and accessible introduction to the RCR way of thinking that encourages an inclusive rather than oppositional approach to conflict and problem-solving.
I describe conventions not of correct reasoning but of giving and taking advice about reasoning. This article is an anticipation of part of the first chapter of my forthcoming *Bounded Thinking*, OUP 2012.
As the eleventh volume in the New Directions in Cognitive Science series (formerly the Vancouver Studies in Cognitive Science series), this work promises superb scholarship and interdisciplinary appeal. It addresses three areas of current and varied interest: common sense, reasoning, and rationality. While common sense and rationality often have been viewed as two distinct features in a unified cognitive map, this volume offers novel, even paradoxical, views of the relationship. Comprised of outstanding essays from distinguished philosophers, it considers what constitutes human rationality, behavior, and intelligence, covering diverse areas of philosophy, psychology, cognitive science, and computer science. Indeed, it is at the forefront of cognitive research and promises to be of unprecedented influence across numerous disciplines.
This study argues that Descartes's conception of rationality presupposes that the order of reasoning essentially obeys his metaphysical categories. It takes to the next level the current trend in de-emphasizing his purported epistemology in favor of his unique metaphysics of cognition.
In this essay we argue that reasoning can sometimes generate epistemic justification, rather than merely transmitting justification that the subject already possesses to new beliefs. We also suggest a way to account for it in terms of the relationship between epistemic normative requirements, justification and cognitive capacities.
Decision theory explains weakness of will as the result of a conflict of incentives between different transient agents. In this framework, self-control can only be achieved by the I-now altering the incentives or choice-sets of future selves. There is no role for an extended agency over time. However, it is possible to extend game theory to allow multiple levels of agency. At the inter-personal level, theories of team reasoning allow teams to be agents, as well as individuals. I apply team reasoning at the intra-personal level, taking the self as a team of transient agents over time. This allows agents to ask, not just “What should I-now do?” but also “What should I, the person over time, do?”, which may enable agents to achieve self-control. The resulting account is Aristotelian in flavour, as it involves reasoning schemata and perception, and it is compatible with some of the psychological findings about self-control.
In our commentary we briefly review the work on the neurological differences between the rational ethical analysis used in professional contexts and the reflexive emotional responses of our daily moral reasoning, and discuss the implications for the claim that our normative arguments should not rely on the emotion of repugnance.
Language pragmatics is applied to analyse problem statements and instructions used in a few influential experimental tasks in the psychology of reasoning. This analysis aims to determine the interpretation of the task which the participant is likely to construct. It is applied to studies of deduction (where the interpretation of quantifiers and connectives is crucial) and to studies of inclusion judgment and probabilistic judgment. It is shown that the interpretation of the problem statements, or even the representation of the task as a whole, often turns out to differ from the experimenter's assumptions. This has serious consequences for the validity of these experimental results and therefore for the claims about human irrationality based on them.
It is increasingly argued that there is a single unified constitutive norm of both assertion and practical reasoning. The most common suggestion is that knowledge is this norm. If this is correct, then we would expect that a diagnosis of problematic assertions should manifest as problematic reasons for acting. Jennifer Lackey has recently argued that assertions epistemically grounded in isolated second-hand knowledge (ISHK) are unwarranted. I argue that decisions epistemically grounded in premises based on ISHK also seem inappropriate. I finish by suggesting that this finding has important implications for the debates regarding the norms of assertion and practical reasoning.
We survey the meta-ethical tools and institutional processes that traditional Islamic ethicists apply when deliberating on bioethical issues. We present a typology of these methodological elements, giving particular attention to the meta-ethical techniques and devices that traditional Islamic ethicists employ in the absence of decisive or univocal authoritative texts or in the absence of established transmitted cases. In describing how traditional Islamic ethicists work, we demonstrate that these experts possess a variety of discursive tools. We find that the ethical responsa—i.e., the products of the application of the tools that we describe—are generally characterized by internal consistency. We also conclude that Islamic ethical reasoning on bioethical issues, while clearly scripture-based, is also characterized by strong consequentialist elements and possesses clear principles-based characteristics. The paper contributes to the study of bioethics by familiarizing non-specialists in Islamic ethics with the role, scope, and applicability of key Islamic ethical concepts, such as “aims” (maqāṣid), “universals” (kulliyyāt), “interest” (maṣlaḥa), “maxims” (qawā`id), “controls” (ḍawābit), “differentiators” (furūq), “preponderization” (tarjīḥ), and “extension” (tafrī`).
Contextual type theories are largely explored in their applications to programming languages, but less investigated for knowledge representation purposes. The combination of a constructive language with a modal extension of contexts appears crucial to explore the attractive idea of a type-theoretical calculus of provability from refutable assumptions for non-monotonic reasoning. This paper introduces such a language: the modal operators are meant to internalize two different modes of correctness, respectively with necessity as the standard notion of constructive verification and possibility as provability up to refutation of contextual conditions.
Does cognition sometimes literally extend into the extra-organismic environment (Clark, 2003), or is it always “merely” environmentally embedded (Rupert, 2004)? Underlying this current border dispute is the question about how to individuate cognitive processes on principled grounds. Based on recent evidence about the active role of representation selection and construction in learning how to reason (Stenning, 2002), I raise the question: what makes two distinct, modality-specific pen-and-paper manipulations of external representations – diagrams versus sentences – cognitive processes of the same kind, e.g. episodes of syllogistic reasoning? In response, I defend a “division of labor” hypothesis, according to which external representations are dependent on perceptually grounded neural representations and mechanisms to guide our behavior; these internal mechanisms, however, are dependent on external representations to have their syllogistic content fixed. Only their joint contributions qualify the extended computational process as an episode of syllogistic reasoning in good standing.
A novel explanation of belief bias in relational reasoning is presented based on the role of working memory and retrieval in deductive reasoning, and the influence of prior knowledge on this process. It is proposed that belief bias is caused by the believability of a conclusion in working memory, which influences its activation level, determining its likelihood of retrieval and therefore its effect on the reasoning process. This theory explores two main influences of belief on the activation levels of these conclusions. First, believable conclusions have higher activation levels and so are more likely to be recalled during the evaluation of reasoning problems than unbelievable conclusions, and therefore they have a greater influence on the reasoning process. Second, prior beliefs about the conclusion have a base level of activation and may be retrieved when logically irrelevant, influencing the evaluation of the problem. The theory of activation and memory is derived from the Atomic Components of Thought-Rational (ACT-R) cognitive architecture, and so this account is formalized in an ACT-R cognitive model. Two experiments were conducted to test predictions of this model. Experiment 1 tested strength of belief and Experiment 2 tested the impact of a concurrent working memory load. Both of these manipulations increased the main effect of belief overall and in particular raised belief-based responding in indeterminately invalid problems. These effects support the idea that the activation level of conclusions formed during reasoning influences belief bias. This theory adds to current explanations of belief bias by providing a detailed specification of the role of working memory and how it is influenced by prior knowledge.
A demanding introduction to logic and critical thinking, this book offers more traditional means of teaching the art of reasoning at a time when the field has become almost mathematical. Francis Dauer has rethought the framework for teaching reasoning in general and formal logic in particular, the desired epistemological context, and the role of the fallacies. The result is a coherent and very readable work, informed by Dauer's extensive experience teaching and writing on the subject.
Qualitative description of the movement of objects can be very important when there are large quantities of data or incomplete information, such as in positioning technologies and movement of robots. We present a first step in the combination of fuzzy qualitative reasoning and quantitative data obtained by human interaction and external devices such as GPS, in order to update and correct the qualitative information. We consider a Propositional Dynamic Logic which deals with qualitative velocity and enables us to represent some reasoning tasks about qualitative properties. The use of logic provides a general framework which improves the capacity of reasoning. In this way, we can infer additional information by using axioms and the logic apparatus. In this paper we present a sound and complete relational dual tableau that can be used for verification of validity of formulas of the logic in question.
Offering an innovative approach to critical thinking, Good Reasoning Matters! identifies the essential structure of good arguments in a variety of contexts and also provides guidelines to help students construct their own effective arguments. In addition to examining the most common features of faulty reasoning--slanting, bias, propaganda, vagueness, ambiguity, and a common failure to consider opposing points of view--the book introduces a variety of argument schemes and rhetorical techniques. This edition adds material on visual arguments and more exercises.
In “Mercier and Sperber’s Argumentative Theory of Reasoning: From Psychology of Reasoning to Argumentation Studies” (2012), Santibáñez Yañez offers constructive comments and criticisms of the argumentative theory of reasoning. The purpose of this reply is twofold. First, it seeks to clarify two points broached by Yañez: (1) the relation between reasoning (in this specific theory) and dual process accounts in general and (2) the benefits that can be derived from reasoning and argumentation (again, in this specific theory). Second, it suggests one domain—the categorization of arguments—in which argumentation studies and the argumentative theory of reasoning could usefully complement each other to yield a better understanding of the processes of argumentation.
In this era of increased polarization of opinion and contentious disagreement, CRITICAL REASONING presents a cooperative approach to critical thinking and the formation of beliefs. CRITICAL REASONING emphasizes the importance of developing and applying analytical skills in real-life contexts. This book is unique in providing multiple, diverse examples of everyday arguments, both textual and visual, including hard-to-find long argument passages from real-life sources. The book provides clear, step-by-step procedures to help you decide for yourself what to believe--to be a consumer of information in our contemporary "world of experts."
Presents the latest research on how reasoning with analogies, metaphors, metonymies, and images can facilitate mathematical understanding. For math education, educational psychology, and cognitive science scholars.
We introduce an Automatic Theorem Prover (ATP) of a dual tableau system for a relational logic for order of magnitude qualitative reasoning, which allows us to deal with relations such as negligibility, non-closeness and distance. Dual tableau systems are validity checkers that can serve as a tool for verification of a variety of tasks in order of magnitude reasoning, such as the use of qualitative sums of some classes of numbers. In the design of our ATP, we have introduced some heuristics, such as the so-called phantom variables, which improve the efficiency of the selection of variables used in the proof.
Qualitative Reasoning (QR) is an area of research within Artificial Intelligence that automates reasoning and problem solving about the physical world. QR research aims to deal with representation and reasoning about continuous aspects of entities without the kind of precise quantitative information needed by conventional numerical analysis techniques. Order-of-magnitude Reasoning (OMR) is an approach in QR concerned with the analysis of physical systems in terms of relative magnitudes. In this paper we consider the logic OMR_N for order-of-magnitude reasoning with the bidirectional negligibility relation. It is a multi-modal logic given by a Hilbert-style axiomatization that reflects properties and interactions of two basic accessibility relations (strict linear order and bidirectional negligibility). Although the logic was studied in many papers, nothing was known about its decidability. In this paper we prove decidability of OMR_N by showing that the logic has the strong finite model property.
Recently a number of variously motivated epistemologists have argued that knowledge is closely tied to practical matters. On the one hand, radical pragmatic encroachment is the view that facts about whether an agent has knowledge depend on practical factors and this is coupled to the view that there is an important connection between knowledge and action. On the other hand, one can argue for the less radical thesis only that there is an important connection between knowledge and practical reasoning. So, defenders of both of these views endorse the view that knowledge is the norm of practical reasoning. This thesis has recently come under heavy fire and a number of weaker proposals have been defended. In this paper counter-examples to the knowledge norm of reasoning will be presented and it will be argued that this view, and a number of related but weaker views, cannot be sustained in the face of these counter-examples. The paper concludes with a novel proposal concerning the norm of practical reasoning that is immune to the counter-examples introduced here.
Semantic studies on diagrammatic notations (Barwise & Etchemendy; Shimojima; Stenning & Lemon) have revealed that the “non-deductive,” “emergent,” or “perceptual” effects of diagrams (Chandrasekaran, Kurup, Banerjee, Josephson, & Winkler; Kulpa; Larkin & Simon; Lindsay) are all rooted in the exploitation of spatial constraints on graphical structures. Thus, theoretically, this process is a key factor in inference with diagrams, explaining the frequently observed reduction of inferential load. The purpose of this study was to examine the empirical basis for this theoretical suggestion, focusing on the reality of the constraint-exploitation strategy in actual practices of diagrammatic reasoning. Eye movements were recorded while participants used simple position diagrams to solve three- or four-term transitive inference problems. Our experiments revealed that the participants could exploit spatial constraints on graphical structures even when (a) they were not in the position of actually manipulating diagrams, (b) the semantic rule for the provided diagrams did not match their preferences, and (c) the constraint-exploitation strategy invited a partly adverse effect. These findings indicate that the hypothesized process is in fact robust, with the potential to broadly account for the inferential advantage of diagrams.
Mercier and Sperber (2011a, 2011b; Mercier, 2011a, 2011b, 2011c, and 2011d) have presented a stimulating and provocative new theory of reasoning: the argumentative theory of reasoning. They maintain that argumentation is a meta-representational module. In their evolutionary view of argumentation, the function of this module would be to regulate the flow of information between interlocutors through persuasiveness on the side of the communicator and epistemic vigilance on the side of the audience. The aim of this paper is to discuss the perspective of the authors in which they conceive this competence as the natural scenario of reflective reasoning.
The debate concerning the proper way of understanding, and hence solving, the “is-ought problem” produced two mutually exclusive positions. One position claims that it is entirely impossible to deduce an imperative statement from a set of factual statements. The other position holds a contrary view to the effect that one can naturally derive an imperative statement from a set of factual statements under certain conditions. Although these two positions have opposing views concerning the problem, it should be evident that they both accept that the “is-ought problem” is concerned with the deducibility of imperative statements from factual statements. Later I will argue that this should not be our concern when we try to make sense of the way we reason about morality.
Probabilistic models have started to replace classical logic as the standard reference paradigm in human deductive reasoning. Mental probability logic emphasizes general principles where human reasoning deviates from classical logic, but agrees with a probabilistic approach (like nonmonotonicity or the conditional event interpretation of conditionals). This contribution consists of two parts. In the first part we discuss general features of reasoning systems including consequence relations, how uncertainty may enter argument forms, probability intervals, and probabilistic informativeness. These concepts are of central importance for the psychological task analysis. In the second part we report new experimental data on the paradoxes of the material conditional, the probabilistic modus ponens, the complement task, and data on the probabilistic truth table task. The results of the experiments provide evidence for the hypothesis that people represent indicative conditionals by conditional probability assertions.
One might ask of two or more texts—what can be inferred from them, taken together? If the texts happen to contradict each other in some respect, then the unadorned answer of standard logic is EVERYTHING. But it seems to be a given that we often successfully reason with inconsistent information from multiple sources. The purpose of this paper is to attempt to develop an adequate approach to accounting for this given.
In 1685, in The Art of Discovery, Leibniz set down an extraordinary idea: "The only way to rectify our reasonings is to make them as tangible as those of the Mathematicians, so that we can find our error at a glance, and when there are disputes among persons, we can simply say: Let us calculate [calculemus], without further ado, to see who is right." Calculemus.
According to Principles of Sufficient Reason, every truth (in some relevant group) has an explanation. One of the most popular defenses of Principles of Sufficient Reason has been the presupposition of reason defense, which takes endorsement of the defended PSR to play a crucial role in our theory selection. According to recent presentations of this defense, our method of theory selection often depends on the assumption that, if a given proposition is true, then it has an explanation, and this will only be justified if we think this holds for all propositions in the relevant group. I argue that this argument fails even when restricted to contingent propositions, and even if we grant that there is no non-arbitrary way to divide true propositions that have explanations from those that lack them. Further, we can give an alternate explanation of what justifies our selecting theories on the basis of explanatory features: the crucial role is not played by an endorsement of a PSR, but rather by our belief that, prima facie, we should prefer theories that exemplify explanatory power to greater degrees than their rivals. This guides our theory selection in a manner similar to ontological parsimony and theoretical simplicity. Unlike a PSR, our belief about explanatory power gives us a prima facie guiding principle, which provides justification in the cases where we think we have it, and not in the cases where we think we don't.
In psychiatry, pharmacological drugs play an important experimental role in attempts to identify the neurobiological causes of mental disorders. Besides being developed in applied contexts as potential treatments for patients with mental disorders, pharmacological drugs play a crucial role in research contexts as experimental instruments that facilitate the formulation and revision of neurobiological theories of psychopathology. This paper examines the various epistemic functions that pharmacological drugs serve in the discovery, refinement, testing, and elaboration of neurobiological theories of mental disorders. I articulate this thesis with reference to the history of antipsychotic drugs and the evolution of the dopamine hypothesis of schizophrenia in the second half of the twentieth century. I argue that interventions with psychiatric patients through the medium of antipsychotic drugs provide researchers with information and evidence about the neurobiological causes of schizophrenia. This analysis highlights the importance of pharmacological drugs as research tools in the generation of psychiatric knowledge and the dynamic relationship between practical and theoretical contexts in psychiatry.
In this paper, I show that the question of how dual process theories of reasoning and judgement account for conflict between System 1 (heuristic) and System 2 (analytic) processes needs to be explicated and addressed in future research work. I demonstrate that a simple additive probability model that describes such conflict can be mapped on to three different cognitive models. The pre-emptive conflict resolution model assumes that a decision is made at the outset as to whether a heuristic or analytic process will control the response. The parallel-competitive model assumes that each system operates in parallel to deliver a putative response, resulting sometimes in conflict that then needs to be resolved. Finally, the default-interventionist model involves the cueing of default responses by the heuristic system that may or may not be altered by subsequent intervention of the analytic system. A second, independent issue also emerges from this discussion. The superior performance of higher-ability participants on reasoning tasks may be due to the fact that they engage in more analytic reasoning (quantity hypothesis) or alternatively to the fact that the analytic reasoning they apply is more effective (quality hypothesis).