In this study of deduction, the authors argue that people reason by imagining the relevant state of affairs, i.e. building an internal model of it, formulating a tentative conclusion based on this model, and then searching for alternative models.
This volume examines the notion of an analytic proof as a natural deduction, suggesting that the proof's value may be understood as its normal form, a concept with significant implications for proof-theoretic semantics.
This important book provides a new unifying methodology for logic. It replaces the traditional view of logic as manipulating sets of formulas with the notion of structured families of labelled formulas with algebraic structures. This approach has far-reaching consequences for the methodology of logics and their semantics, and the book studies the main features of such systems along with their applications. It will interest logicians, computer scientists, philosophers and linguists.
This book stands at the intersection of two topics: the decidability and computational complexity of hybrid logics, and the deductive systems designed for them. Hybrid logics are here divided into two groups: standard hybrid logics, which involve nominals as expressions of a separate sort, and non-standard hybrid logics, which do not involve nominals but whose expressive power matches that of binder-free standard hybrid logics. The original results of this book are split into two parts, reflecting the division of the book itself. The first type of results concerns model-theoretic and complexity properties of hybrid logics. Since the hybrid logics called standard here are already quite well investigated, the effort focuses on the hybrid logics referred to as non-standard, understood as modal logics with global counting operators whose expressive power matches that of binder-free standard hybrid logics. The relevant results comprise: 1. Establishing a sound and complete axiomatization for the modal logic K with global counting operators, which can easily be extended to other frame classes; 2. Establishing tight complexity bounds, namely NExpTime-completeness, for the modal logic with global counting operators defined over the classes of arbitrary, reflexive, symmetric, serial and transitive frames (MT, MD, MB, MK4), with numerical subscripts coded in binary, and establishing the exponential-size model property for this logic defined over the classes of Euclidean and equivalential frames (MS5). Results of the second type consist of designing concrete deductive systems for standard and non-standard hybrid logics. More precisely, they include: 1. Devising prefixed and internalized tableau calculi which are sound, complete and terminating for a rich class of binder-free standard hybrid logics; an interesting feature of these calculi is the non-branching character of the rule; 2.
Devising prefixed and internalized tableau calculi which are sound, complete and terminating for non-standard hybrid logics; the internalization technique applied to a tableau calculus for the modal logic with global counting operators is novel in the literature; 3. Devising the first hybrid algorithm involving an inequality solver for modal logics with global counting operators; transferring the arithmetical part of the reasoning to an inequality solver turned out to be sufficient to ensure termination. The book is directed at philosophers and logicians working with modal and hybrid logics, as well as at computer scientists interested in deductive systems and decision procedures for logics. Extensive fragments of the first part of the book can also serve as an introduction to hybrid logics for a wider audience interested in logic. The content of the book is situated in the areas of formal logic and theoretical computer science, with some elements of the theory of computational complexity.
In this paper, I consider a family of three-valued regular logics: the well-known strong and weak logics of S.C. Kleene and two intermediate logics, one discovered by M. Fitting and the other by E. Komendantskaya. All these systems were originally presented semantically and were based on the theory of recursion. However, their proof theory is still not fully developed: natural deduction systems have so far been built only for strong Kleene logic, both with one designated value (A. Urquhart, G. Priest, A. Tamminga) and with two designated values (G. Priest, B. Kooi, A. Tamminga). The purpose of this paper is to provide natural deduction systems for the weak and intermediate regular logics, both with one and with two designated values.
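The strong/weak contrast the paper starts from can be seen already in the truth table for conjunction. Here is a minimal sketch in Python; the encoding of the three values is mine, not the paper's:

```python
# Three-valued conjunction in strong vs. weak Kleene logic.
# Values: 'T' (true), 'F' (false), 'N' (undefined).
# Strong Kleene: one false conjunct settles the output, so F & N = F.
# Weak Kleene: 'N' is infectious, so any undefined input yields N.

def strong_and(a, b):
    if a == 'F' or b == 'F':
        return 'F'
    if a == 'N' or b == 'N':
        return 'N'
    return 'T'

def weak_and(a, b):
    if a == 'N' or b == 'N':
        return 'N'
    if a == 'F' or b == 'F':
        return 'F'
    return 'T'

# The two logics agree everywhere except where an 'F' meets an 'N':
for a in 'TNF':
    for b in 'TNF':
        print(a, b, '->', strong_and(a, b), weak_and(a, b))
```

Both connectives are regular in Kleene's sense: sharpening an undefined input to a classical value never flips an already-classical output.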
Deductive Cogency holds that the set of propositions towards which one has, or is prepared to have, a given type of propositional attitude should be consistent and closed under logical consequence. While there are many propositional attitudes that are not subject to this requirement, e.g. hoping and imagining, it is at least prima facie plausible that Deductive Cogency applies to the doxastic attitude involved in propositional knowledge, viz. belief. However, this thought is undermined by the well-known preface paradox, leading a number of philosophers to conclude that Deductive Cogency has at best a very limited role to play in our epistemic lives. I argue here that Deductive Cogency is still an important epistemic requirement, albeit not as a requirement on belief. Instead, building on a distinction between belief and acceptance introduced by Jonathan Cohen and on recent developments in the epistemology of understanding, I propose that Deductive Cogency applies to the attitude of treating propositions as given in the context of attempting to understand a given phenomenon. I then argue that this simultaneously accounts for the plausibility of the considerations in favor of Deductive Cogency and avoids the problematic consequences of the preface paradox.
I argue, contrary to Dennis Schulting in Kant's Radical Subjectivism, that the main reasoning of Kant's transcendental deduction of the categories is progressive, not regressive. Schulting is right, however, to emphasize that the deduction takes the object cognized to be constituted in an idealism-entailing way. But his reasoning has gaps and bypasses Kant's most explicit deduction argument for idealism, one independent of the Transcendental Aesthetic. Finally, Schulting's claim that Kantian discursivity itself requires idealism overlooks the fact that Kantian general judgements can be true in a domain of objects without being specifically of or about any particular ones of those objects.
One of the strongest motivations for conceptualist readings of Kant is the belief that the Transcendental Deduction is incompatible with nonconceptualism. In this article, I argue that this belief is simply false: the Deduction and nonconceptualism are compatible at both an exegetical and a philosophical level. Placing particular emphasis on the case of non-human animals, I discuss in detail how and why my reading diverges from those of Ginsborg, Allais, Gomes and others. I suggest ultimately that it is only by embracing nonconceptualism that we can fully recognise the delicate calibration of the trap which the Critique sets for Hume.
The paper explores a deductive-nomological account of metaphysical explanation: some truths metaphysically explain, or ground, another truth just in case the laws of metaphysics determine the latter truth on the basis of the former. I develop and motivate a specific conception of metaphysical laws, on which they are general rules that regulate the existence and features of derivative entities. I propose an analysis of the notion of ‘determination via the laws’, based on a restricted form of logical entailment. I argue that the DN-account of ground can be defended against the well-known objections to the DN-approach to scientific explanation. The goal of the paper is to show that the DN-account of metaphysical explanation is a well-motivated and defensible theory.
This comprehensive account of the concept and practices of deduction is the first to bring together perspectives from philosophy, history, psychology and cognitive science, and mathematical practice. Catarina Dutilh Novaes draws on all of these perspectives to argue for an overarching conceptualization of deduction as a dialogical practice: deduction has dialogical roots, and these roots are still largely present both in theories and in practices of deduction. Dutilh Novaes' account also highlights the deeply human and in fact social nature of deduction, as embedded in actual human practices; as such, it presents a highly innovative account of deduction. The book will be of interest to a wide range of readers, from advanced students to senior scholars, and from philosophers to mathematicians and cognitive scientists.
Mathematicians often speak of conjectures as being confirmed by evidence that falls short of proof. For their own conjectures, evidence justifies further work in looking for a proof. Those conjectures of mathematics that have long resisted proof, such as Fermat's Last Theorem and the Riemann Hypothesis, have had to be considered in terms of the evidence for and against them. It is argued here that it is not adequate to describe the relation of evidence to hypothesis as 'subjective', 'heuristic' or 'pragmatic', but that there must be an element of what it is rational to believe on the evidence, that is, of non-deductive logic.
It has been the dominant view that probabilistic explanations of particular facts must be inductive in character. I argue here that this view is mistaken, and that the aim of probabilistic explanation is not to demonstrate that the explanandum fact was nomically expectable, but to give an account of the chance mechanism(s) responsible for it. To this end, a deductive-nomological model of probabilistic explanation is developed and defended. Such a model has application only when the probabilities occurring in covering laws can be interpreted as measures of objective chance, expressing the strength of physical propensities. Unlike inductive models of probabilistic explanation, this deductive model stands in no need of troublesome requirements of maximal specificity or epistemic relativization.
This is a discussion of L. Jonathan Cohen’s argument against the possibility that empirical psychological research might show that lay deductive competence is inconsistent. I argue that, within the framework Cohen provides, the consistency of lay deductive practice is indeterminate.
This book provides a detailed exposition of one of the most practical and popular methods of proving theorems in logic, called Natural Deduction. It is presented both historically and systematically, and some combinations with other known proof methods are also explored. The initial part of the book deals with Classical Logic, whereas the rest is concerned with systems for several forms of Modal Logic, one of the most important branches of modern logic, which has wide applicability.
This introduction to the basic forms of deductive inference, as evaluated by the methods of modern symbolic logic, is designed for sophomore- to junior-level students ready to specialize in the study of deductive logic. It can also be used for an introductory logic course. The independence of many sections allows the instructor utmost flexibility. The text consists of eight chapters, the first six of which are designed to introduce the student to the basic topics of sentence and predicate logic. The last two chapters extend the procedures of the first six to alethic modal logic, the logic of imperatives, and deontic logic. Throughout the text there is an attempt to relate symbolic techniques to issues in the philosophy of logic.
Henry E. Allison presents an analytical and historical commentary on Kant's transcendental deduction of the pure concepts of the understanding in the Critique of Pure Reason. He argues that, rather than providing a new solution to an old problem, it addresses a new problem, and he traces the line of thought that led Kant to the recognition of the significance of this problem in his 'pre-critical' period. In addition to the developmental nature of the account of Kant's views presented here, two distinctive features of Allison's reading of the deduction are a defense of Kant's oft-criticized claim that the conformity of appearances to the categories must be unconditionally rather than merely conditionally necessary, and an insistence that the argument cannot be separated from Kant's transcendental idealism.
This paper presents an outline of a new theory of relevant deduction, which arose from the purpose of solving paradoxes in various fields of analytic philosophy. Unlike relevance logics, this approach does not replace classical logic with a new one, but distinguishes between relevance and validity. It is argued that irrelevant arguments, although formally valid, are nonsensical and even harmful in practical applications. The basic idea is this: a valid deduction is relevant iff no subformula of the conclusion is replaceable, on some of its occurrences, by any other formula salva validitate of the deduction. The paper first motivates the approach by showing that four paradoxes seemingly very distant from each other have a common source. Then the exact definition of relevant deduction is given and its logical properties are investigated. An extension to relevance of premises is discussed. Finally the paper presents an overview of its applications in philosophy of science, ethics, cognitive psychology and artificial intelligence.
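For the propositional case, the replacement criterion can be checked mechanically with truth tables. The sketch below (my own encoding, not the author's formalism) replaces each atom occurrence in the conclusion by a fresh atom; this suffices because, by uniform substitution, validity with a fresh atom entails validity under any replacement formula:

```python
# Toy check of the relevance criterion: a valid inference prem |- concl is
# relevant iff no single occurrence of an atom in the conclusion can be
# replaced by a fresh atom salva validitate.
from itertools import product

def atoms(f):
    return {f[1]} if f[0] == 'atom' else (
        atoms(f[1]) if f[0] == 'not' else atoms(f[1]) | atoms(f[2]))

def ev(f, v):
    if f[0] == 'atom':
        return v[f[1]]
    if f[0] == 'not':
        return not ev(f[1], v)
    a, b = ev(f[1], v), ev(f[2], v)
    return {'and': a and b, 'or': a or b, 'imp': (not a) or b}[f[0]]

def valid(prem, concl):
    vs = sorted(atoms(prem) | atoms(concl))
    return all(ev(concl, dict(zip(vs, bits)))
               for bits in product([True, False], repeat=len(vs))
               if ev(prem, dict(zip(vs, bits))))

def replace_occ(f, target, fresh):
    """Replace atom occurrence number `target` (left to right) by `fresh`."""
    def go(f, i):
        if f[0] == 'atom':
            return (('atom', fresh) if i == target else f), i + 1
        if f[0] == 'not':
            g, i = go(f[1], i)
            return ('not', g), i
        l, i = go(f[1], i)
        r, i = go(f[2], i)
        return (f[0], l, r), i
    return go(f, 0)[0]

def count_occ(f):
    return 1 if f[0] == 'atom' else (
        count_occ(f[1]) if f[0] == 'not' else count_occ(f[1]) + count_occ(f[2]))

def relevant(prem, concl):
    if not valid(prem, concl):
        return False
    return not any(valid(prem, replace_occ(concl, i, '_z'))
                   for i in range(count_occ(concl)))

p, q = ('atom', 'p'), ('atom', 'q')
print(relevant(p, p))               # p |- p: relevant
print(relevant(p, ('or', p, q)))    # p |- p or q: irrelevant, q is replaceable
```

The second example is the classic paradox of disjunction introduction: p entails p or q, but the q-occurrence can be replaced by anything salva validitate, so the deduction fails the relevance test.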
The new paradigm in the psychology of reasoning redirects the investigation of deduction conceptually and methodologically, because the premises and the conclusion of the inferences are assumed to be uncertain. A probabilistic counterpart of the concept of logical validity, and a method to assess whether individuals comply with it, must be defined. Conceptually, we used de Finetti's coherence as a normative framework to assess individuals' performance. Methodologically, we presented inference schemas whose premises had various levels of probability conveyed by non-numerical expressions and, as a control, sure levels. Depending on the inference schemas, from 60% to 80% of the participants produced coherent conclusions when the premises were uncertain. The data also show that, except for schemas involving conjunction, performance was consistently lower with certain than with uncertain premises, and the rate of the conjunction fallacy was consistently low (not exceeding 20%) ...
The distinction between the syntactic and the semantic approach to scientific theories emerged in formal philosophy of science. The semantic approach is commonly considered more advanced and more successful than the syntactic one, but the transition from the one approach to the other was not brought about without loss. In essence, it is the formal analysis of atomic propositions and the analysis of deductive reasoning that dropped out of consideration in at least some of the elaborated versions of the semantic approach. In structuralist theory of science, as founded by Sneed and Stegmüller, the focus is on global propositions concerning the question of whether or not certain empirical systems satisfy a set-theoretic predicate that encodes the axioms of a scientific theory. Hence, an analysis of deductive reasoning from atomic premisses with the help of a given theory is missing. The objective of the present paper is to develop a deductive system on the basis of the structuralist framework. This system comes with a novel formulation of empirical propositions in structuralism.
The deduction of the categories in the 1781 edition of the Critique of Pure Reason (the A Deduction) has “two sides”—the “objective deduction” and the “subjective deduction”. Kant seems ambivalent about the latter. I treat it as a significant episode of Kant's thinking about the categories, one that extended from the early 1770s to around 1790. It contains his most detailed answer to the question about the origin of the categories that he formulated in the 1772 letter to Marcus Herz. The answer is that the categories are generated a priori through a kind of intellectual “epigenesis”. This account leaves unexplained why precisely such and such categories should be generated. While this observation caused Kant to worry about the hypothetical status of the subjective deduction in 1781, he would come to acquiesce in the recognition that the ground of the possibility of the categories is itself inscrutable. I call this his “methodological skepticism”.
This chapter describes the main accounts of deductive competence, which explain what is computed in carrying out deductions. It argues that people have a modicum of competence, which is useful in daily life and a prerequisite for acquiring logical expertise. It outlines the three main sorts of theory of deductive performance, which explain how people make deductions: they rely on factual knowledge, formal rules, or mental models. It reviews recent experimental studies of deductive reasoning in order to help readers assess these theories of performance.
Methods available for the axiomatization of arbitrary finite-valued logics can be applied to obtain sound and complete intelim rules for all truth-functional connectives of classical logic, including the Sheffer stroke and Peirce's arrow. The restriction to a single conclusion in standard systems of natural deduction requires the introduction of additional rules to make the resulting systems complete; these rules are nevertheless still simple and correspond straightforwardly to the classical absurdity rule. Omitting these rules results in systems for intuitionistic versions of the connectives in question.
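The Sheffer stroke case can be made concrete: NAND is functionally complete, so intelim rules for it alone fix all of classical propositional logic. Below is a quick exhaustive check of the standard definitions; this illustrates the underlying semantic fact, not the paper's rules themselves:

```python
# Exhaustive check that negation, conjunction, disjunction and the
# conditional are all definable from the Sheffer stroke (NAND).
def nand(a, b):
    return not (a and b)

def neg(a):     return nand(a, a)                     # ~a
def conj(a, b): return nand(nand(a, b), nand(a, b))   # a & b
def disj(a, b): return nand(nand(a, a), nand(b, b))   # a or b
def imp(a, b):  return nand(a, nand(b, b))            # a -> b

for a in (True, False):
    for b in (True, False):
        assert neg(a) == (not a)
        assert conj(a, b) == (a and b)
        assert disj(a, b) == (a or b)
        assert imp(a, b) == ((not a) or b)
print("NAND defines all four connectives")
```

An exactly parallel check works for Peirce's arrow (NOR), the other singly functionally complete binary connective.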
The present article illustrates a conflict between the claim that rational belief sets are closed under deductive consequences and a very inclusive claim about the factors that suffice to determine whether it is rational to believe the respective propositions. Inasmuch as it is implausible to hold that the factors listed here are insufficient to determine whether it is rational to believe the respective propositions, we have good reason to deny that rational belief sets are closed under deductive consequences.
How is moral knowledge possible? This paper defends the anti-Humean thesis that we can acquire moral knowledge by deduction from wholly non-moral premises. According to Hume's Law, as it has become known, we cannot deduce an ‘ought’ from an ‘is’, since it is “altogether inconceivable how this new relation can be a deduction from others, which are entirely different from it” (Hume, 1739, 3.1.1). This paper explores the prospects for a deductive theory of moral knowledge that rejects Hume's Law.
Duncan Pritchard recently proposed a Wittgensteinian solution to closure-based skepticism. According to Wittgenstein, all epistemic systems assume certain truths. The notions that we are not disembodied brains, that the Earth has existed for a long time and that one's name is such-and-such all function as “hinge commitments.” Pritchard views a hinge commitment as a positive propositional attitude that is not a belief. Because closure principles concern only knowledge-apt beliefs, they do not apply to hinge commitments. Thus, from the fact that a subject knows that he is sitting in a room, and the fact that the subject's sitting in a room entails his bodily existence, it does not follow that the subject also knows that he is not an envatted brain. This paper rejects Pritchard's non-belief reading of hinge commitments. I start by showing that the non-belief reading fails to solve the skeptical paradox, because the reasons that Pritchard uses to support it do not exempt hinge propositions from closure principles. I then argue that the non-belief reading is false in claiming that hinge commitments, unlike ordinary beliefs, are rationally unresponsive: with the help of a scenario in which a subject's experience is internally chaotic, we can safely conclude that the hinge commitment that one is not systematically mistaken about the world is just as responsive to one's evidential situation as ordinary beliefs are.
Hypothetico-deductive (H-D) confirmation builds on the idea that confirming evidence consists of successful predictions that deductively follow from the hypothesis under test. This article reviews the scope, history and recent development of the venerable H-D account. First, we motivate the approach and clarify its relationship to Bayesian confirmation theory. Second, we explain and discuss the tacking paradoxes, which exploit the fact that H-D confirmation gives no account of evidential relevance. Third, we review several recent proposals that aim at a sounder and more comprehensive formulation of H-D confirmation. Finally, we conclude that the reputation of hypothetico-deductive confirmation as outdated and hopeless is undeserved: not only can the technical problems be addressed satisfactorily, but the hypothetico-deductive method is also highly relevant for scientific practice.
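The tacking problem can be reproduced in miniature: if H entails prediction E, then so does H conjoined with any irrelevant X, so naive H-D confirmation lets E confirm the tacked hypothesis as well. A truth-table check, with formulas of my own choosing for illustration:

```python
# Tacking paradox in miniature: H = p & q entails E = p,
# and so does H & X for an irrelevant conjunct X = r.
from itertools import product

def entails(prem, concl):
    """prem |= concl, checked over all valuations of p, q, r."""
    return all(concl(p, q, r)
               for p, q, r in product([True, False], repeat=3)
               if prem(p, q, r))

H = lambda p, q, r: p and q              # hypothesis under test
E = lambda p, q, r: p                    # its successful prediction
HX = lambda p, q, r: H(p, q, r) and r    # H with an irrelevant conjunct tacked on

print(entails(H, E))    # True: E H-D confirms H
print(entails(HX, E))   # True: E also "confirms" H & X
```

Since entailment is monotonic in the premises, no purely deductive criterion can distinguish H from H & X here; that is the relevance gap the article's third section addresses.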
I discuss three elements of Dennis Schulting's new book on the transcendental deduction of the pure concepts of the understanding, or categories. First, Schulting's detailed account of the role of each individual category. Second, Schulting's insistence that the categories nevertheless apply 'en bloc'. Third, Schulting's defence of Kant's so-called reciprocity thesis, that subjective unity of consciousness and objectivity in the sense of cognition's objective purport are necessary conditions for the possibility of one another. I endorse these fascinating but unfashionable claims and sketch my own version of what they amount to, which is quite different from Schulting's own construal. I point to some fundamental limitations and problems for Schulting's position and argue that his project needs to be reshaped, or at least reconceived, in the face of them. Even if Schulting's argument is sound, it does not provide a deduction, properly speaking, of the categories.
The development of recursion theory motivated Kleene to create regular three-valued logics. Taking his inspiration from computer science, Fitting later continued to investigate regular three-valued logics and defined them as the monotonic ones. Afterwards, Komendantskaya proved that there are four regular three-valued logics and that in the three-valued case the set of regular logics coincides with the set of monotonic logics. Next, Tomova showed that in the four-valued case regularity and monotonicity do not coincide: she calculated that there are 6400 four-valued regular logics, but only six of them are monotonic. The purpose of this paper is to create natural deduction systems for them. We also describe some functional properties of these logics.
In section 1, I develop epistemic communism, my view of the function of epistemically evaluative terms such as ‘rational’. The function is to support the coordination of our belief-forming rules, which in turn supports the reliable acquisition of beliefs through testimony. This view is motivated by the existence of valid inferences that we hesitate to call rational. I defend the view against the worry that it fails to account for a function of evaluations within first-personal deliberation. In the rest of the paper, I then argue, on the basis of epistemic communism, for a view about rationality itself. I set up the argument in section 2 by saying what a theory of rational deduction is supposed to do. I claim that such a theory would provide a necessary, sufficient, and explanatorily unifying condition for being a rational rule for inferring deductive consequences. I argue in section 3 that, given epistemic communism and the conventionality that it entails, there is no such theory. Nothing explains why certain rules for deductive reasoning are rational.
Deductive Irrationality examines and critiques economic rationalism by assessing the work of influential political philosophers and economic theorists such as Thomas Hobbes, John Locke, Adam Smith, Friedrich Hayek, Gunnar Myrdal, and John F. Muth. It is one of the first serious attempts to investigate the dominant sub-fields in economic theory through the lens of political philosophy.
We present a sound and complete Fitch-style natural deduction system for an S5 modal logic containing an actuality operator, a diagonal necessity operator, and a diagonal possibility operator. The logic is two-dimensional: we evaluate sentences with respect to both an actual world (first dimension) and a world of evaluation (second dimension). The diagonal necessity operator behaves as a quantifier over every point on the diagonal between actual worlds and worlds of evaluation, while the diagonal possibility operator quantifies over some such point. Thus, they behave just like the epistemic operators for apriority and its dual. We take this extension of Fitch's familiar derivation system to be a very natural one, since the new rules and labeled lines introduced here preserve the structure of Fitch's own rules for the modal case.
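A toy evaluation function conveys the two-dimensional picture: sentences are evaluated at pairs (actual world, world of evaluation), and the diagonal operators quantify over the diagonal points. The worlds, valuation, and connective names below are hypothetical illustrations, not the paper's formalism:

```python
# Two-dimensional evaluation: holds(f, actual, eval_w).
WORLDS = ['w1', 'w2']
VAL = {('p', 'w1'): True, ('p', 'w2'): False}   # made-up atomic facts

def holds(f, actual, eval_w):
    if f[0] == 'atom':
        return VAL[(f[1], eval_w)]
    if f[0] == 'actually':                 # shift evaluation to the actual world
        return holds(f[1], actual, actual)
    if f[0] == 'box':                      # ordinary necessity: vary eval world only
        return all(holds(f[1], actual, u) for u in WORLDS)
    if f[0] == 'dbox':                     # diagonal necessity: every point (u, u)
        return all(holds(f[1], u, u) for u in WORLDS)
    if f[0] == 'ddia':                     # diagonal possibility: some point (u, u)
        return any(holds(f[1], u, u) for u in WORLDS)
    if f[0] == 'iff':
        return holds(f[1], actual, eval_w) == holds(f[2], actual, eval_w)

p = ('atom', 'p')
fixed = ('iff', p, ('actually', p))        # "p iff actually p"
print(holds(('dbox', fixed), 'w1', 'w1'))  # True: diagonally necessary (a priori)
print(holds(('box', fixed), 'w1', 'w1'))   # False: not necessary in the ordinary sense
```

This is the familiar contrast the diagonal operators capture: "p iff actually p" holds at every diagonal point but fails at off-diagonal pairs, so it is a priori without being necessary.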
The relation between logic and thought has long been controversial, but has recently influenced theorizing about the nature of mental processes in cognitive science. One prominent tradition argues that to explain the systematicity of thought we must posit syntactically structured representations inside the cognitive system which can be operated upon by structure-sensitive rules similar to those employed in systems of natural deduction. I have argued elsewhere that the systematicity of human thought might better be explained as resulting from the fact that we have learned natural languages which are themselves syntactically structured. According to this view, symbols of natural language are external to the cognitive processing system, and what the cognitive system must learn is to produce and comprehend such symbols. In this paper I pursue that idea by arguing that ability in natural deduction itself may rely on pattern-recognition abilities that enable us to operate on external symbols, rather than on encodings of rules that might be applied to internal representations. To support this suggestion, I present a series of experiments with connectionist networks that have been trained to construct simple natural deductions in sentential logic. These networks not only succeed in reconstructing the derivations on which they have been trained, but also construct new derivations that are merely similar to the ones on which they have been trained.
It is often assumed that Fichte's aim in Part I of the System of Ethics is to provide a deduction of the moral law, the very thing that Kant – after years of unsuccessful attempts – deemed impossible. On this familiar reading, what Kant eventually viewed as an underivable 'fact' (Factum), the authority of the moral law, is what Fichte traces to its highest ground in what he calls the principle of the 'I'. However, scholars have largely overlooked a passage in the System of Ethics where Fichte explicitly invokes Kant's doctrine of the fact of reason with approval, claiming that consciousness of the moral law grounds our belief in freedom (GA I/5:65). On the reading I defend, Fichte's invocation of the Factum is consistent with the structure of Part I once we distinguish (a) the feeling of moral compulsion from (b) the moral law itself.
In lively and readable prose, Arthur presents a new approach to the study of logic, one that seeks to integrate methods of argument analysis developed in modern “informal logic” with natural deduction techniques. The dry bones of logic are given flesh by unusual attention to the history of the subject, from Pythagoras, the Stoics, and Indian Buddhist logic, through Lewis Carroll, Venn, and Boole, to Russell, Frege, and Monty Python. A previous edition of this book appeared under the title _Natural Deduction_. This new edition adds clarifications of the notions of explanation, validity and formal validity, a more detailed discussion of derivation strategies, and another rule of inference, Reiteration.
Curry's paradox, sometimes described as a general version of the better-known Russell's paradox, has intrigued logicians for some time. This paper examines the paradox in a natural deduction setting and critically assesses some restrictions to the logic proposed by Fitch and Prawitz. We then offer a tentative counterexample to a conjecture by Tennant proposing a criterion for what is to count as a genuine paradox.
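For readers who have not seen it, the natural-deduction derivation driving Curry's paradox runs roughly as follows. This is a standard reconstruction, with A arbitrary and C a sentence provably equivalent to C → A:

```latex
\begin{align*}
&1.\quad C && \text{assumption}\\
&2.\quad C \to A && \text{from 1, unfolding } C \leftrightarrow (C \to A)\\
&3.\quad A && \text{modus ponens, 1, 2}\\
&4.\quad C \to A && \text{conditional proof, discharging 1 (used twice: contraction)}\\
&5.\quad C && \text{from 4, folding } C \leftrightarrow (C \to A)\\
&6.\quad A && \text{modus ponens, 5, 4}
\end{align*}
```

Since A was arbitrary, everything is derivable; the restrictions examined in the paper amount to blocking one or another step of a derivation of this shape.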
The structure of derivations in natural deduction is analyzed through isomorphism with a suitable sequent calculus, with twelve hidden convertibilities revealed in usual natural deduction. A general formulation of the conjunction and implication elimination rules is given, analogous to disjunction elimination. Normalization through permutative conversions now applies in all cases. Derivations in normal form have all major premisses of elimination rules as assumptions. Conversion in any order terminates. Through the condition that in a cut-free derivation of the sequent Γ⇒C no inactive weakening or contraction formulas remain in Γ, a correspondence with the formal derivability relation of natural deduction is obtained: all formulas of Γ become open assumptions in natural deduction, through an inductively defined translation. Weakenings are interpreted as vacuous discharges, and contractions as multiple discharges. In the other direction, non-normal derivations translate into derivations with cuts having the cut formula principal either in both premisses or in the right premiss only.
The notion of a local deduction theorem (which generalizes the known instances of indeterminate deduction theorems, e.g. for the infinite-valued Łukasiewicz logic) is defined. It is then shown that a given finitary non-pathological logic C admits the local deduction theorem iff the class Matr(C) of all matrices validating C has the C-filter extension property (Theorem II.1).
In this provocative book, Lance Rips describes a unified theory of natural deductive reasoning and fashions a working model of deduction, with strong experimental support, that is capable of playing a central role in mental life. Rips argues that certain inference principles are so central to our notion of intelligence and rationality that they deserve serious psychological investigation to determine their role in individuals' beliefs and conjectures. Asserting that cognitive scientists should consider deductive reasoning as a basis for thinking, Rips develops a theory of natural reasoning abilities and shows how it predicts mental successes and failures in a range of cognitive tasks. In parts I and II of the book, Rips builds insights from cognitive psychology, logic, and artificial intelligence into a unified theoretical structure. He defends the idea that deduction depends on the ability to construct mental proofs - actual memory units that link given information to the conclusions it warrants. From this base Rips develops a computational model of deduction based on two cognitive skills: the ability to make suppositions or assumptions and the ability to posit sub-goals for conclusions. A wide variety of original experiments support this model, including studies of human subjects evaluating logical arguments as well as following and remembering proofs. Unlike previous theories of mental proof, this one handles names and variables in a general way. This capability enables deduction to play a crucial role in other thought processes, such as classifying and problem solving. In part III, Rips compares the theory to earlier approaches in psychology which confined the study of deduction to a small group of tasks, and examines whether the theory is too rational or too irrational in its mode of thought. Lance J. Rips is Professor of Psychology at Northwestern University.
Probabilistic models have started to replace classical logic as the standard reference paradigm in human deductive reasoning. Mental probability logic emphasizes general principles where human reasoning deviates from classical logic but agrees with a probabilistic approach (such as nonmonotonicity or the conditional-event interpretation of conditionals). This contribution consists of two parts. In the first part we discuss general features of reasoning systems, including consequence relations, how uncertainty may enter argument forms, probability intervals, and probabilistic informativeness. These concepts are of central importance for the psychological task analysis. In the second part we report new experimental data on the paradoxes of the material conditional, the probabilistic modus ponens, the complement task, and the probabilistic truth table task. The results of the experiments provide evidence for the hypothesis that people represent indicative conditionals by conditional probability assertions.
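Probabilistic modus ponens has a simple coherence-based form: given P(A) = x and P(B|A) = y, the coherent values of P(B) form the interval [xy, xy + 1 − x]. The sketch below checks the bounds by enumerating distributions over the four state descriptions; the particular numbers are illustrative:

```python
# Coherent probability interval for modus ponens: from P(A) = x and
# P(B|A) = y, any coherent P(B) lies in [x*y, x*y + 1 - x].
def mp_interval(x, y):
    return (x * y, x * y + 1 - x)

def coherent_pb_values(x, y, steps=100):
    """P(B) for every distribution on {A&B, A&~B, ~A&B, ~A&~B} compatible
    with the premises: P(A&B) = x*y is fixed by the two constraints,
    while P(~A&B) may range freely over [0, 1-x]."""
    pab = x * y
    return [pab + (1 - x) * k / steps for k in range(steps + 1)]

x, y = 0.8, 0.9
lo, hi = mp_interval(x, y)
vals = coherent_pb_values(x, y)
assert all(lo - 1e-9 <= v <= hi + 1e-9 for v in vals)
print(round(lo, 2), round(hi, 2))   # 0.72 0.92
```

With certain premises (x = y = 1) the interval collapses to the single point 1, recovering classical modus ponens; widening intervals for longer inference chains is one source of the probabilistic informativeness measures mentioned above.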
Various sources in the literature claim that the deduction theorem does not hold for normal modal or epistemic logic, whereas others present versions of the deduction theorem for several normal modal systems. It is shown here that the apparent problem arises from an objectionable notion of derivability from assumptions in an axiomatic system. When a traditional Hilbert-type system of axiomatic logic is generalized into a system for derivations from assumptions, the necessitation rule has to be modified in a way that restricts its use to cases in which the premiss does not depend on assumptions. This restriction is entirely analogous to the restriction on the rule of universal generalization in first-order logic. A necessitation rule with this restriction permits a proof of the deduction theorem in its usual formulation. Other suggestions presented in the literature to deal with the problem are reviewed, and the present solution is argued to be preferable to the other alternatives. A contraction- and cut-free sequent calculus equivalent to the Hilbert system for basic modal logic shows the standard failure argument untenable by proving the underivability of □A from A.
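The closing underivability claim has a familiar semantic counterpart: under the local (assumption-sensitive) consequence relation, A does not entail □A, as a two-world countermodel shows. This is a standard illustration, not an example taken from the paper:

```python
# Two-world Kripke countermodel: A holds at w1, but Box A fails at w1
# because w1 sees a world where A is false.
R = {('w1', 'w2')}            # accessibility: w1 sees w2 (and nothing else)
V = {'A': {'w1'}}             # A is true at w1 only

def holds(f, w):
    if f[0] == 'atom':
        return w in V[f[1]]
    if f[0] == 'box':         # Box f: f holds at every accessible world
        return all(holds(f[1], v) for (u, v) in R if u == w)

A = ('atom', 'A')
print(holds(A, 'w1'))           # True:  the assumption A holds at w1
print(holds(('box', A), 'w1'))  # False: Box A fails at w1
```

Note that □A does hold vacuously at the dead-end world w2; necessitation remains sound for assumption-free theorems, which is exactly the restriction on the rule that the paper defends.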