This paper describes a formal measure of epistemic justification motivated by the dual goal of cognition, which is to increase true beliefs and reduce false beliefs. From this perspective the degree of epistemic justification should not be the conditional probability of the proposition given the evidence, as is commonly thought. It should be determined instead by the combination of the conditional probability and the prior probability. This is also true of the degree of incremental confirmation, and I argue that any measure of epistemic justification is also a measure of incremental confirmation. However, the degree of epistemic justification must meet an additional condition, and all known measures of incremental confirmation fail to meet it. I describe this additional condition as well as a measure that meets it. The paper then applies the measure to the conjunction fallacy and proposes an explanation of the fallacy.
It is well known that the probabilistic relation of confirmation is not transitive in that even if E confirms H1 and H1 confirms H2, E may not confirm H2. In this paper we distinguish four senses of confirmation and examine additional conditions under which confirmation in different senses becomes transitive. We conduct this examination both in the general case where H1 confirms H2 and in the special case where H1 also logically entails H2. Based on these analyses, we argue that the Screening-Off Condition is the most important condition for transitivity in confirmation because of its generality and ease of application. We illustrate our point with the example of Moore’s “proof” of the existence of a material world, where H1 logically entails H2, the Screening-Off Condition holds, and confirmation in all four senses turns out to be transitive.
We show that as a chain of confirmation becomes longer, confirmation dwindles under screening-off. For example, if E confirms H1, H1 confirms H2, and H1 screens off E from H2, then the degree to which E confirms H2 is less than the degree to which E confirms H1. Although there are many measures of confirmation, our result holds on any measure that satisfies the Weak Law of Likelihood. We apply our result to testimony cases, relate it to the Data-Processing Inequality in information theory, and extend it in two respects so that it covers a broader range of cases.
It is well known that probabilistic support is not transitive. But it can be shown that probabilistic support is transitive provided the intermediary proposition screens off the original evidence with respect to the hypothesis in question. This has the consequence that probabilistic support is transitive when the original evidence is testimonial, memorial or perceptual (i.e., to the effect that such and such was testified to, remembered, or perceived), and the intermediary proposition is its representational content (i.e., to the effect that the such and such occurred).
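The screening-off results in the abstracts above can be checked numerically. The following is a minimal Python sketch with invented, purely illustrative probabilities: E and H2 are made conditionally independent given H1 (screening-off by construction), and the computation confirms both that probabilistic support becomes transitive and that, on the difference measure of confirmation, E supports H2 less strongly than it supports H1.

```python
# Hypothetical numbers (illustrative only). H1 screens off E from H2:
# E and H2 are conditionally independent given H1 and given not-H1.
p_h1 = 0.3                                      # prior P(H1)
p_e_given_h1, p_e_given_not_h1 = 0.8, 0.2       # P(E | H1), P(E | ~H1)
p_h2_given_h1, p_h2_given_not_h1 = 0.9, 0.4     # P(H2 | H1), P(H2 | ~H1)

# Marginals and posteriors implied by the joint distribution.
p_e = p_h1 * p_e_given_h1 + (1 - p_h1) * p_e_given_not_h1
p_h1_given_e = p_h1 * p_e_given_h1 / p_e
p_h2 = p_h1 * p_h2_given_h1 + (1 - p_h1) * p_h2_given_not_h1
# Screening-off lets us average P(H2 | H1) by the posterior of H1 given E.
p_h2_given_e = (p_h1_given_e * p_h2_given_h1
                + (1 - p_h1_given_e) * p_h2_given_not_h1)

assert p_h1_given_e > p_h1      # E confirms H1
assert p_h2_given_h1 > p_h2     # H1 confirms H2
assert p_h2_given_e > p_h2      # transitivity: E confirms H2

# Dwindling on the difference measure d(X, Y) = P(Y | X) - P(Y):
# E supports H2 less strongly than it supports H1.
assert p_h2_given_e - p_h2 < p_h1_given_e - p_h1
```

With these numbers P(H1 | E) ≈ 0.63 and P(H2 | E) ≈ 0.72 against priors 0.3 and 0.55, so both confirmation relations carry over but the increment shrinks along the chain.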
This article proposes a new interpretation of mutual information (MI). We examine three extant interpretations of MI: by reduction in doubt, by reduction in uncertainty, and by divergence. We argue that the first two are inconsistent with the epistemic value of information (EVI) assumed in many applications of MI: the more information we acquire, the better our epistemic position, other things being equal. The third interpretation is consistent with EVI, but it faces the problem of measure sensitivity and fails to justify the use of MI in giving definitive answers to questions of information. We propose a fourth interpretation of MI by reduction in expected inaccuracy, where inaccuracy is measured by a strictly proper monotonic scoring rule. It is shown that the answers to questions of information given by MI are definitive whenever this interpretation is appropriate, and that it is appropriate in a wide range of applications with epistemic implications.
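The fourth interpretation can be illustrated with the logarithmic score, which is strictly proper: for a discrete joint distribution, the expected reduction in log-loss inaccuracy from replacing the prior over X with the posterior given Y equals the mutual information I(X;Y) in nats. A small Python sketch with an arbitrary made-up joint distribution (the distribution and variable names are assumptions for illustration, not from the article):

```python
import math

# Arbitrary illustrative joint distribution P(X = x, Y = y).
joint = {('x0', 'y0'): 0.30, ('x0', 'y1'): 0.10,
         ('x1', 'y0'): 0.15, ('x1', 'y1'): 0.45}

xs = {'x0', 'x1'}
ys = {'y0', 'y1'}
p_x = {x: sum(joint[(x, y)] for y in ys) for x in xs}
p_y = {y: sum(joint[(x, y)] for x in xs) for y in ys}

# Mutual information: I(X;Y) = sum p(x,y) * log( p(x,y) / (p(x) p(y)) ).
mi = sum(p * math.log(p / (p_x[x] * p_y[y]))
         for (x, y), p in joint.items() if p > 0)

# Expected inaccuracy under the log score: the prior's expected
# inaccuracy is H(X); the posterior's, averaged over Y, is H(X|Y).
h_x = -sum(p * math.log(p) for p in p_x.values())
h_x_given_y = -sum(p * math.log(p / p_y[y]) for (x, y), p in joint.items())
reduction = h_x - h_x_given_y

assert abs(mi - reduction) < 1e-9   # I(X;Y) = expected drop in log loss
```

The identity I(X;Y) = H(X) − H(X|Y) holds exactly, so the assertion passes up to floating-point error for any valid joint distribution, not just this one.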
In this paper we examine C. I. Lewis's view on the role of coherence – what he calls “congruence” – in the justification of beliefs based on memory or testimony. Lewis has two main theses on the subject. His negative thesis states that coherence of independent items of evidence has no impact on the probability of a conclusion unless each item has some credibility of its own. The positive thesis says, roughly speaking, that coherence of independently obtained items of evidence – such as converging memories or testimonies – raises the probability of a conclusion to the extent sufficient for epistemic justification, or, to use Lewis's expression, “rational and practical reliance”. It turns out that, while the negative thesis is essentially correct, a strong positive connection between congruence and probability – a connection of the kind Lewis ultimately needs in his validation of memory – is contingent on the Principle of Indifference. In the final section we assess the repercussions of the latter fact for Lewis's theory in particular and for coherence justification in general.
This paper defends reductionism about testimonial justification of beliefs against two influential arguments. One is the empirical argument to the effect that the reductionist justification of our trust in testimony is either circular since it relies on testimonial evidence or else there is scarce evidence in support of our trust in testimony. The other is the transcendental argument to the effect that trust in testimony is a prerequisite for the very existence of testimonial evidence since without the presumption of people’s truthfulness we cannot interpret their utterances as testimony with propositional contents. This paper contends that the epistemic subject can interpret utterances as testimony with propositional contents without presupposing the credibility of testimony, and that evidence available to the normal epistemic subject can justify her trust in testimony.
In this paper we make three points about justification of propositions by coherence “from scratch”, where pieces of evidence that are coherent have no individual credibility. First, we argue that no matter how many pieces of evidence are coherent, and no matter what relation we take coherence to be, coherence does not make independent pieces of evidence with no individual credibility credible. Second, we show that an intuitively plausible informal reasoning for justification by coherence from scratch is deficient since it relies on an understanding of “individual credibility” inappropriate for justification from scratch. Third, we show that coherence, when it is recurrent, can make independent sources of evidence with no individual credibility credible. We describe specifically a case in which the same group of independent witnesses with no individual credibility repeatedly produce reports that are in agreement with each other, and their reports become credible as a result.
This paper considers two novel Bayesian responses to a well-known skeptical paradox. The paradox consists of three intuitions: first, given appropriate sense experience, we have justification for accepting the relevant proposition about the external world; second, we have justification for expanding the body of accepted propositions through known entailment; third, we do not have justification for accepting that we are not disembodied souls in an immaterial world deceived by an evil demon. The first response we consider rejects the third intuition and proposes an explanation of why we have a faulty intuition. The second response, which we favor, accommodates all three intuitions; it reconciles the first and the third intuition by the dual component model of justification, and defends the second intuition by distinguishing two principles of epistemic closure.
This paper examines how coherence of the contents of evidence affects the transmission of probabilistic support from the evidence to the hypothesis. It is argued that coherence of the contents in the sense of the ratio of the positive intersection reduces the transmission of probabilistic support, though this negative impact of coherence may be offset by other aspects of the relations among the contents. It is argued further that there is no broader conception of coherence whose impact on the transmission of probabilistic support is never offset by other aspects of the relations among the contents. The paper also examines reasons for the contrary impression that coherence of the contents increases the transmission of probabilistic support, especially in the special case where the hypothesis to evaluate is the conjunction of the contents of evidence.
This paper disputes the widely held view that one cannot establish the reliability of a belief-forming process with the use of beliefs that are obtained by that very process since such self-dependent justification is circular. Harold Brown ([1993]) argued in this journal that some cases of self-dependent justification are legitimate despite their circularity. I argue instead that under appropriate construal many cases of self-dependent justification are not truly circular but are instances of ordinary Bayesian confirmation, and hence they can raise the probability of the hypothesis as legitimately as any such confirmation does. I shall argue in particular that despite its dependence on perception we can use naturalized epistemology to confirm the reliability of a perceptual process without circularity.
This article aims to achieve two things: to identify the conditions for transitivity in probabilistic support in various settings, and to uncover the components and structure of the mediated probabilistic relation. It is shown that when the probabilistic relation between the two propositions, x and z, is mediated by multiple layers of partitions of propositions, the impact x has on z consists of the purely indirect impact, the purely bypass impact, and the mixed impact. It is also shown that although mediated confirmation as a whole is not transitive, the indirect part of mediated confirmation is transitive.
This book develops new techniques in formal epistemology and applies them to the challenge of Cartesian skepticism. It introduces two formats of epistemic evaluation that should be of interest to epistemologists and philosophers of science: the dual-component format, which evaluates a statement on the basis of its safety and informativeness, and the relative-divergence format, which evaluates a probabilistic model on the basis of its complexity and goodness of fit with data. Tomoji Shogenji shows that the former lends support to Cartesian skepticism, but the latter allows us to defeat Cartesian skepticism. Along the way, Shogenji addresses a number of related issues in epistemology and philosophy of science, including epistemic circularity, epistemic closure, and inductive skepticism.
This paper aims to reconcile (i) the intuitively plausible view that a higher degree of coherence among independent pieces of evidence makes the hypothesis they support more probable, and (ii) the negative results in Bayesian epistemology to the effect that there is no probabilistic measure of coherence such that a higher degree of coherence among independent pieces of evidence makes the hypothesis they support more probable. I consider a simple model in which the negative result appears in a stark form: the prior probability of the hypothesis and the individual vertical relations between each piece of evidence and the hypothesis completely determine the conditional probability of the hypothesis given the total evidence, leaving no room for the lateral relation (such as coherence) among the pieces of evidence to play any role. Despite this negative result, the model also reveals that a higher degree of coherence is indirectly associated with a higher conditional probability of the hypothesis because a higher degree of coherence indicates stronger individual supports. This analysis explains why coherence appears truth-conducive but in such a way that it defeats the idea of coherentism since the lateral relation (such as coherence) plays no independent role in the confirmation of the hypothesis.
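The "no room for coherence" result holds in the standard model where the pieces of evidence are conditionally independent given the hypothesis and given its negation. A Python sketch with made-up numbers (all values are illustrative assumptions): the posterior given both pieces of evidence is recovered exactly from the prior and the two individual posteriors, with no further input from any lateral relation between the pieces of evidence.

```python
# Illustrative numbers. E1 and E2 are conditionally independent
# given H and given ~H.
p_h = 0.3
p_e1_h, p_e1_not_h = 0.8, 0.4   # P(E1 | H), P(E1 | ~H)
p_e2_h, p_e2_not_h = 0.7, 0.2   # P(E2 | H), P(E2 | ~H)

# Direct computation of P(H | E1, E2) from the full model.
num = p_h * p_e1_h * p_e2_h
den = num + (1 - p_h) * p_e1_not_h * p_e2_not_h
posterior_direct = num / den

# Reconstruction from the prior and the individual posteriors alone.
def posterior(prior, p_e_h, p_e_not_h):
    return prior * p_e_h / (prior * p_e_h + (1 - prior) * p_e_not_h)

def odds(p):
    return p / (1 - p)

p_h_e1 = posterior(p_h, p_e1_h, p_e1_not_h)
p_h_e2 = posterior(p_h, p_e2_h, p_e2_not_h)
# Each likelihood ratio is recoverable as odds(P(H|Ei)) / odds(P(H)),
# so the prior plus the two vertical relations fix the joint posterior.
post_odds = odds(p_h) * (odds(p_h_e1) / odds(p_h)) * (odds(p_h_e2) / odds(p_h))
posterior_from_parts = post_odds / (1 + post_odds)

assert abs(posterior_direct - posterior_from_parts) < 1e-9
```

With these numbers both routes yield P(H | E1, E2) = 0.75, confirming that under conditional independence the lateral relation among E1 and E2 has no independent role to play.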
Among many reasons for which contemporary philosophers take coherentism in epistemology seriously, the most important is probably the perceived inadequacy of alternative accounts, most notably misgivings about foundationalism. But coherentism also receives straightforward support from cases in which beliefs are apparently justified by their coherence. From the perspective of those against coherentism, this means that an explanation is needed as to why in these cases coherence apparently justifies beliefs. Curiously, this task has not been carried out in a serious way in the anti-coherentist literature although there is no scarcity of objections to coherentism. The traditional charge has been that justification by coherence is circular. More recently the isolation problem allegedly reveals that coherentism justifies beliefs that should not be justified. Questions have also been raised with respect to the basing relation and feasibility. However, these objections do not explain why some beliefs appear to be justified by their coherence. This paper fills this gap in the anti-coherentist literature by offering a noncoherentist account of justification by coherence. The paper proceeds as follows. Section I delineates the framework of discussion and develops some conceptual tools needed in later analyses. Section II argues that there are genuine cases of an increase in existing empirical justification by coherence, but that it does not require coherence to generate additional justification: coherence serves as a channel of justification among beliefs, which is no more problematic than channeling of justification from basic to derived beliefs in foundationalism. Section III makes a stronger case for justification by coherence, where each of the coherent beliefs has no independent empirical justification; but Section IV argues that even in these cases coherence need not generate justification, since coherence licenses the channeling of justification from outside sources.
This paper examines the role of coherence of evidence in what I call the non-dynamic model of confirmation. It appears that other things being equal, a higher degree of coherence among pieces of evidence raises to a higher degree the probability of the proposition they support. I argue against this view on the basis of three related observations. First, we should be able to assess the impact of coherence on any hypothesis of interest the evidence supports. Second, the impact of coherence among the pieces of evidence can be different on different hypotheses of interest they support. Third, when we assess the impact of coherence on a hypothesis of interest, other conditions that should be held equal for a fair assessment include the degrees of individual support which the propositions directly supported by the respective pieces of evidence provide for the hypothesis. Once we take these points into consideration, the impression that coherence of evidence plays a positive role in confirmation dissipates. In some cases it can be shown that other things being equal, a higher degree of coherence among the pieces of evidence reduces the degree of confirmation for the hypothesis they support.
This paper addresses the meta-epistemological dispute over the basis of epistemic evaluation from the standpoint of meliorative epistemology. Meliorative epistemology aims at guiding our epistemic practice to better results, and it comprises two levels of epistemic evaluation. At the social level (meliorative social epistemology) appropriate experts conduct evaluation for the community, so that epistemic evaluation is externalist since each epistemic subject in the community need not have access to the basis of the experts' evaluation. At the personal level (meliorative personal epistemology), by contrast, epistemic evaluation is internalist since each member of the community must evaluate the reliability of the (apparent) experts from the first-person perspective. I argue that evaluation at the social level should be the primary focus of meliorative epistemology since meliorative personal epistemology does not provide informative epistemic norms. It is then pointed out that epistemic evaluation at the social level can be considered internalist in the extended sense (social internalism) in that every component of the evaluation needs to be recognized by some members of the community at some points. As a result, some familiar problems of internalist epistemology, such as regress and circularity of epistemic support, carry over to meliorative social epistemology.
This paper proposes an analysis of surprise formulated in terms of proximity to the truth, to replace the probabilistic account of surprise. It is common to link surprise to the low probability of the outcome. The idea seems sensible because an outcome with a low probability is unexpected, and an unexpected outcome often surprises us. However, the link between surprise and low probability is known to break down in some cases. There have been some attempts to modify the probabilistic account to deal with these cases, but as we shall see, they are still faced with problems. The new analysis of surprise I propose turns to accuracy and identifies an unexpected degree of inaccuracy as reason for surprise. The shift from probability to proximity allows us to solve puzzles that strain the probabilistic account of surprise.
This paper addresses the issue of rule-following in the context of the problem of the criterion. It presents a line of reasoning which concludes we do not know what rule we follow, but which develops independently of the problem of extrapolation that plays a major role in many recent discussions of rule-following. The basis of the argument is the normativity of rules, but the problem is also distinct from the issue of the gap between facts and values in axiology. The paper further points out that the epistemic problem of not knowing what rule we follow leads to the outright denial of rule-following.
The proportional weight view in epistemology of disagreement generalizes the equal weight view and proposes that we assign to judgments of different people weights that are proportional to their epistemic qualifications. It is shown that if the resulting degrees of confidence are to constitute a probability function, they must be the weighted arithmetic means of individual degrees of confidence, while if the resulting degrees of confidence are to obey the Bayesian rule of conditionalization, they must be the weighted geometric means of individual degrees of confidence. The double bind entails that the proportional weight view (and its moderate adjustment in favor of one’s own judgment) is inconsistent with Bayesianism.
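The two horns of the double bind can be verified numerically. In the sketch below (all credences, weights, and likelihoods are invented for illustration), the normalized weighted geometric mean commutes with Bayesian conditionalization, while the weighted arithmetic mean, which always yields a probability function, does not:

```python
# Two agents' credences over hypotheses h1, h2, with weights meant to
# reflect their (hypothetical) epistemic qualifications.
p1, p2 = [0.6, 0.4], [0.2, 0.8]
w1, w2 = 0.7, 0.3
likelihood = [0.9, 0.3]          # P(evidence | h) for each hypothesis

def normalize(v):
    s = sum(v)
    return [x / s for x in v]

def linear_pool(a, b):
    return [w1 * x + w2 * y for x, y in zip(a, b)]

def geometric_pool(a, b):
    return normalize([x ** w1 * y ** w2 for x, y in zip(a, b)])

def update(p):                   # Bayesian conditionalization
    return normalize([x * l for x, l in zip(p, likelihood)])

# Geometric pooling: pool-then-update equals update-then-pool.
geo_a = update(geometric_pool(p1, p2))
geo_b = geometric_pool(update(p1), update(p2))
assert all(abs(a - b) < 1e-9 for a, b in zip(geo_a, geo_b))

# Linear pooling: the two orders of operation disagree.
lin_a = update(linear_pool(p1, p2))
lin_b = linear_pool(update(p1), update(p2))
assert any(abs(a - b) > 1e-3 for a, b in zip(lin_a, lin_b))
```

The geometric case commutes because updating multiplies every credence by the same likelihood, and that common factor passes through the weighted product when the weights sum to one; the arithmetic mean has no such factorization, which is the source of the bind.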
Erik Olsson’s Against Coherence: Truth, Probability, and Justification is an important contribution to the growing literature on Bayesian coherentism. The book applies the formal theory of probability to issues of coherence in two contexts. One is the philosophical debate over radical skepticism, and the other is common sense and scientific reasoning. As the title of the book suggests, Olsson’s view about coherence is negative on both counts. With regard to radical skepticism, Olsson states that “the connection between coherence and truth is […] too weak to allow coherence to play the role it is supposed to play in a convincing response to radical scepticism” (viii). Olsson also states, with regard to common sense and scientific reasoning, that “there is no way to specify an informative notion of coherence that would allow us to draw even the minimal conclusion that more coherence means a higher likelihood of truth other things being equal” (viii). I want to begin with the second point, which is the more surprising of the two.
One of the central issues in the recent discussion of rule-following has been the apparent gap between the finitude of any facts about the rule-follower and the infinitude of possible applications of rules. In this paper the author argues that the combination of the rule-follower's disposition and explicit directions can fill this gap with respect to the interpretation of individual words, but that the problem of finitude remains a serious threat to compositional semantics for natural language because there are no explicit directions we can rely on in learning its rules.
In this dissertation I examine the sceptical problem of rule following presented by Saul Kripke in his interpretation of Ludwig Wittgenstein's later works: Do any facts determine what rule we were following in our apparently rule-following activities such as the use of language? I distinguish three ways of understanding this question – modest scepticism, radical scepticism, and metascepticism – and address them in Parts 1, 2 and 3 of the dissertation, respectively. Part 1 discusses modest scepticism, which asserts that no finite facts about humans can determine rules which have infinitely many possible applications. I resolve this apparent conflict between the finite and the infinite by incorporating the recursiveness of rule applications into the theory of rules. Part 2 discusses radical scepticism, which asserts that no facts determine what normative rules we were following. I show first that this challenge has a wider scope than the Humean Thesis of the underivability of ought from is, and that as a result, the radical sceptic can refute attitudinal theories of norms, which have been considered a possible response to the Humean challenge. I then propose a solution to radical scepticism which explains our practice and beliefs about normative rules without assuming the existence of rule-determining facts. Unlike the solution Kripke ascribes to Wittgenstein, it offers no justification of our practice and beliefs about normative rules. Some methodological implications of the solution are also explored. Part 3 discusses metascepticism about rule following, whose challenge extends to the rules used for the very discussion of rule following. I examine first whether such a challenge is self-undermining in that it makes the sceptical argument itself impossible. I propose an extended form of reductio ad absurdum argument by which the metasceptic can challenge rules in general, including those used in the sceptical argument. Then I argue that the solution to radical scepticism given in Part 2 survives metascepticism as well.
This paper examines the epistemic status of the reflective belief about the content of one’s own conscious mental state, with emphasis on perceptual experience. I propose that the process that gives a special epistemic status to a reflective belief is not observation, inference, or conceptual articulation, but semantic ascent similar to the transition from a sentence in the object language to a sentence in the meta-language that affirms the truth of the original sentence. This account of the process of reflection explains why a reflective belief is (subject to some qualification) infallible.
We propose a coherence account of the conjunction fallacy applicable to both of its two paradigms. We compare our account with a recent proposal by Tentori et al. (2013) that attempts to generalize earlier confirmation accounts. Their model works better than its predecessors in some respects, but it exhibits only a shallow form of generality and is unsatisfactory in other ways as well: it is strained, complex, and untestable as it stands. Our coherence account inherits the strength of the confirmation account, but in addition to being applicable to both paradigms, it is natural, simple, and readily testable. It thus constitutes the next natural step for Bayesian theorizing about the conjunction fallacy.
Can there be a good argument for the total denial of rule following? The question concerns the "total" denial, where the targeted rules include those meta-rules presumably required for philosophical argumentation. In this paper the author contends that such a self-undermining argument can never be a good argument even in a "reductio ad absurdum" form, but that the defender of rule following cannot dismiss a challenge on this ground when the opponent adopts "the virus strategy".