Recent works in epistemology show that the claim that coherence is truth conducive – in the sense that, given suitable ceteris paribus conditions, more coherent sets of statements are always more probable – is dubious and possibly false. It does not follow from this that coherence is a useless notion in epistemology and philosophy of science. Dietrich and Moretti (Philosophy of Science 72(3): 403–424, 2005) have proposed a formal account of how coherence is confirmation conducive—that is, of how the coherence of a set of statements facilitates the confirmation of those statements. This account is grounded in two confirmation transmission properties that are satisfied by some of the measures of coherence recently proposed in the literature. These properties explicate everyday and scientific uses of coherence. In this paper, I review the main findings of Dietrich and Moretti (2005) and define two evidence-gathering properties that are satisfied by the same measures of coherence and constitute further ways in which coherence is confirmation conducive. At least one of these properties vindicates important applications of the notion of coherence in everyday life and in science.
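For illustration (this is not necessarily the measure Dietrich and Moretti work with), one probabilistic coherence measure of the kind at issue is Shogenji's ratio measure, on which a set of statements is coherent just in case the measure exceeds 1:

C_S(A_1, \ldots, A_n) = \frac{P(A_1 \wedge \cdots \wedge A_n)}{P(A_1) \cdots P(A_n)}

A confirmation transmission property then takes, roughly, the form: if evidence E confirms one member of a set whose coherence is sufficiently high by such a measure, E also confirms the other members.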
I review recent work on Phenomenal Conservatism, the position introduced by Michael Huemer according to which, if it seems to a subject S that P, then, in the absence of defeaters, S thereby has some degree of justification for believing P.
According to Jim Pryor’s dogmatism, when you have an experience with content p, you often have prima facie justification for believing p that doesn’t rest on your independent justification for believing any proposition. Although dogmatism has an intuitive appeal and seems to have an antisceptical bite, it has been targeted by various objections. This paper principally aims to answer Roger White’s objections, according to which dogmatism is inconsistent with the Bayesian account of how evidence affects our rational credences. If this were true, the rational acceptability of dogmatism would be seriously questionable. I respond that these objections don’t get off the ground because they assume that our experiences and our introspective beliefs that we have experiences have the same evidential force, whereas the dogmatist is uncommitted to this assumption. I also consider the question of whether dogmatism has an antisceptical bite. I suggest that the answer turns on whether or not the Bayesian can determine the priors of hypotheses and conjectures on the grounds of their extra-empirical virtues. If the Bayesian can do so, the thesis that dogmatism has an antisceptical bite is probably false.
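For reference, the Bayesian account of evidence at issue models learning by conditionalization: on gaining evidence E, a rational agent’s new credence in a hypothesis H is

P_{\mathrm{new}}(H) = P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)}

White’s objections, as summarized above, turn on what plays the role of E when one has an experience as if p: the experience itself or the introspective belief that one has it.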
In this paper we focus on transmission and failure of transmission of warrant. We identify three individually necessary and jointly sufficient conditions for transmission of warrant, and we show that their satisfaction grounds a number of interesting epistemic phenomena that have not been sufficiently appreciated in the literature. We then scrutinise Wright’s analysis of transmission failure and improve on extant readings of it. Nonetheless, we present a Bayesian counterexample showing that Wright’s analysis is partially incoherent with our analysis of warrant transmission and prima facie defective. We conclude by exploring three alternative lines of reply: developing a more satisfactory account of transmission failure, which we outline; dismissing the Bayesian counterexample by rejecting some of its assumptions; and reinterpreting Wright’s analysis to make it immune to the counterexample.
Transmission of justification across inference is a valuable and indeed ubiquitous epistemic phenomenon in everyday life and science. It is thanks to the phenomenon of epistemic transmission that inferential reasoning is a means for substantiating predictions of future events and, more generally, for expanding the sphere of our justified beliefs or reinforcing the justification of beliefs that we already entertain. However, transmission of justification is not without exceptions. As a few epistemologists have come to realise, more or less trivial forms of circularity can prevent justification from transmitting from p to q even if one has justification for p and one is aware of the inferential link from p to q. In interesting cases this happens because one can acquire justification for p only if one has independent justification for q. In such cases the justification for q cannot depend on the justification for p and the inferential link from p to q, as genuine transmission would require. The phenomenon of transmission failure seems to shed light on philosophical puzzles, such as Moore’s proof of a material world and McKinsey’s paradox, and it plays a central role in various philosophical debates. For this reason it has been receiving continued and increasing attention.
Crispin Wright has given an explanation of how a first-time warrant can fall short of transmitting across a known entailment. Formal epistemologists have struggled to turn Wright’s informal explanation into cogent Bayesian reasoning. In this paper, I analyse two Bayesian models of Wright’s account proposed by Samir Okasha and Jake Chandler respectively. I argue that both formalizations are unsatisfactory for different reasons, and I lay down a third Bayesian model that appears to me to capture the valid kernel of Wright’s explanation. After this, I consider a recent development in Wright’s account of transmission failure. Wright suggests that his condition sufficient for transmission failure of first-time warrant also suffices for transmission failure of supplementary warrant. I propose an interpretation of Wright’s suggestion that shields it from objections. I then lay down a fourth Bayesian framework that provides a simplified model of the unified explanation of transmission failure envisaged by Wright.
John Hardwig has championed the thesis (NE) that evidence that an expert EXP has evidence for a proposition P, constituted by EXP’s testimony that P, is not evidence for P itself, where evidence for P is generally characterized as anything that counts towards establishing the truth of P. In this paper, I first show that (NE) yields tensions within Hardwig’s overall view of epistemic reliance on experts and makes it imply unpalatable consequences. Then, I use the Shogenji-Roche theorem on the transitivity of incremental confirmation to show that (NE) is false if a natural Bayesian formalization of the above notion of evidence is implemented. I concede that Hardwig could resist my Bayesian objection if he re-interpreted (NE) as a more precise thesis that only applies to community-focused evidence. I argue, however, that this precisification, while diminishing the philosophical relevance of (NE), wouldn’t settle the tensions internal to Hardwig’s views.
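The Shogenji-Roche result appealed to can be put, roughly, as a screening-off condition for the transitivity of incremental confirmation (my paraphrase, not necessarily the paper’s exact formulation): if E confirms M, M confirms H, and M screens off E from H, then E confirms H. In symbols,

\text{if } P(M \mid E) > P(M), \quad P(H \mid M) > P(H), \quad P(H \mid M \wedge E) = P(H \mid M), \quad P(H \mid \neg M \wedge E) = P(H \mid \neg M), \quad \text{then } P(H \mid E) > P(H).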
Coherentism in epistemology has long suffered from a lack of formal and quantitative explication of the notion of coherence. One might hope that probabilistic accounts of coherence such as those proposed by Lewis, Shogenji, Olsson, Fitelson, and Bovens and Hartmann will finally help solve this problem. This paper shows, however, that those accounts have a serious common problem: the problem of belief individuation. The coherence degree that each of the accounts assigns to an information set (or the verdict it gives as to whether the set is coherent tout court) depends on how the beliefs (or propositions) that represent the set are individuated. Indeed, logically equivalent belief sets that represent the same information set can be given drastically different degrees of coherence. This feature clashes with our natural and reasonable expectation that the coherence degree of a belief set does not change unless the believer adds essentially new information to the set or drops old information from it; or, to put it simply, that the believer cannot raise or lower the degree of coherence by purely logical reasoning. None of the accounts in question can adequately deal with coherence once logical inferences get into the picture. Toward the end of the paper, another notion of coherence that takes into account not only the contents but also the origins (or sources) of the relevant beliefs is considered. It is argued that this notion of coherence is of dubious significance, and that it does not help solve the problem of belief individuation.
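A worked example of the phenomenon (my own, using Shogenji’s measure C_S(A, B) = P(A \wedge B)/(P(A)\,P(B))): let A and B be probabilistically independent with P(A) = P(B) = 1/2. The sets {A, B} and {A \wedge B, A \vee B} have the same logical closure, yet

C_S(A, B) = \frac{P(A \wedge B)}{P(A)\,P(B)} = \frac{1/4}{1/4} = 1 \quad \text{(neutral)}

C_S(A \wedge B,\ A \vee B) = \frac{P(A \wedge B)}{P(A \wedge B)\,P(A \vee B)} = \frac{1}{P(A \vee B)} = \frac{1}{3/4} = \frac{4}{3} \quad \text{(coherent)}

Purely logical repackaging thus shifts the verdict from neutrality to coherence.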
This paper considers two novel Bayesian responses to a well-known skeptical paradox. The paradox consists of three intuitions: first, given appropriate sense experience, we have justification for accepting the relevant proposition about the external world; second, we have justification for expanding the body of accepted propositions through known entailment; third, we do not have justification for accepting that we are not disembodied souls in an immaterial world deceived by an evil demon. The first response we consider rejects the third intuition and proposes an explanation of why we have a faulty intuition. The second response, which we favor, accommodates all three intuitions; it reconciles the first and the third intuition by the dual component model of justification, and defends the second intuition by distinguishing two principles of epistemic closure.
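The second intuition is, in effect, a closure principle for justification; a generic single-premise formulation (the paper’s two principles are refinements of something like this) is

\big( J_S(p) \wedge K_S(p \rightarrow q) \big) \rightarrow J_S(q)

read: if S has justification for accepting p and S knows that p entails q, then S has justification for accepting q.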
R. Feldman defends a general principle about evidence the slogan form of which says that ‘evidence of evidence is evidence’. B. Fitelson considers three renditions of this principle and contends they are all falsified by counterexamples. Against both Feldman and Fitelson, J. Comesaña and E. Tal show that the third rendition––the one actually endorsed by Feldman––isn’t affected by Fitelson’s counterexamples, but only because it is trivially true and thus uninteresting. Tal and Comesaña defend a fourth version of Feldman’s principle, which––they claim––has not yet been shown false. Against Tal and Comesaña, I show that this new version of Feldman’s principle is false.
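The formal crux (my illustration, not one of the counterexamples from this exchange) is that incremental confirmation, P(X \mid Y) > P(X), is not transitive. Draw a card from a standard deck and let E′ = the card is the ace of clubs, E = the card is a black ace, H = the card is the ace of spades. Then

P(E \mid E') = 1 > P(E) = \tfrac{2}{52}, \qquad P(H \mid E) = \tfrac{1}{2} > P(H) = \tfrac{1}{52}, \qquad P(H \mid E') = 0 < P(H)

so E′ is evidence for E and E is evidence for H, while E′ conclusively refutes H. Whether ‘evidence of evidence is evidence’ therefore depends on the further conditions built into each rendition of the principle.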
The logical hexagon (or hexagon of opposition) is a strange, yet beautiful, highly symmetrical mathematical figure, mysteriously intertwining fundamental logical and geometrical features. It was discovered more or less at the same time (i.e. around 1950), independently, by a few scholars. It is the successor of an equally strange (but mathematically less impressive) structure, the “logical square” (or “square of opposition”), of which it is a much more general and powerful “relative”. The discovery of the former raised little interest among logicians and philosophers of logic, whereas the latter played a very important theoretical role (both for logic and philosophy) for nearly two thousand years, before falling into disgrace in the first half of the twentieth century: it was, so to say, “sentenced to death” by the so-called analytical philosophers and logicians. By contrast, since 2004 a new, unexpectedly promising branch of mathematics (dealing with “oppositions”) has appeared, “oppositional geometry” (also called “n-opposition theory”, “NOT”), inside which the logical hexagon (as well as its predecessor, the logical square) is only one term of an infinite series of “logical bi-simplexes of dimension m”, itself just one term of the more general infinite series (of series) of the “logical poly-simplexes of dimension m”. In this paper we recall the main historical and theoretical elements of these neglected recent discoveries. After proposing some new results, among which the notion of “hybrid logical hexagon”, we show the strong reasons, inside oppositional geometry, for holding that the logical hexagon is in fact a very important and profound mathematical structure, destined for many fruitful future developments and probably the bearer of a major epistemological paradigm change.
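For orientation (a standard presentation, not this paper’s own diagrams): the Blanché-style logical hexagon adds to the square’s vertices A, E, I, O the two vertices U := A \vee E and Y := I \wedge O, with

\text{contradictories: } (A, O),\ (E, I),\ (U, Y); \qquad \text{contraries: } A, E, Y \text{ pairwise}; \qquad \text{subcontraries: } I, O, U \text{ pairwise}

and subalternations A \to I, E \to O, A \to U, E \to U, Y \to I, Y \to O.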
In this paper we argue that Michael Huemer’s phenomenal conservatism—the internalist view according to which our beliefs are prima facie justified if based on how things seem or appear to us to be—doesn’t fall afoul of Michael Bergmann’s dilemma for epistemological internalism. We start by showing that the thought experiment that Bergmann adduces to conclude that phenomenal conservatism is vulnerable to his dilemma misses its target. After that, we distinguish between two ways in which a mental state can contribute to the justification of a belief: the direct way and the indirect way. We identify a straightforward reason for claiming that the justification contributed indirectly is subject to Bergmann’s dilemma. Then we show that the same reason doesn’t extend to the claim that the justification contributed directly is subject to Bergmann’s dilemma. As phenomenal conservatism is the view that seemings or appearances contribute justification directly, we infer that Bergmann’s contention that his dilemma applies to it is unmotivated. In the final part, we suggest that our line of response to Bergmann can be used to shield other types of internalist justification from Bergmann’s objection. We also propose that seeming-grounded justification can be combined with justification of one of these types to form the basis of a promising version of internalist foundationalism.
According to Jim Pryor’s dogmatism, if you have an experience as if P, you acquire immediate prima facie justification for believing P. Pryor contends that dogmatism validates Moore’s infamous proof of a material world. Against Pryor, I argue that if dogmatism is true, Moore’s proof turns out to be non-transmissive of justification according to one of the senses of non-transmissivity defined by Crispin Wright. This type of non-transmissivity doesn’t deprive dogmatism of its apparent antisceptical bite.
Whereas geometrical oppositions (logical squares and hexagons) have been so far investigated in many fields of modal logic (both abstract and applied), the oppositional geometrical side of “deontic logic” (the logic of “obligatory”, “forbidden”, “permitted”, . . .) has rather been neglected. Besides the classical “deontic square” (the deontic counterpart of Aristotle’s “logical square”), some interesting attempts have nevertheless been made to deepen the geometrical investigation of the deontic oppositions: Kalinowski (La logique des normes, PUF, Paris, 1972) has proposed a “deontic hexagon” as being the geometrical representation of standard deontic logic, whereas Joerden (jointly with Hruschka, in Archiv für Rechts- und Sozialphilosophie 73:1, 1987), McNamara (Mind 105:419, 1996) and Wessels (Die gute Samariterin. Zur Struktur der Supererogation, Walter de Gruyter, Berlin, 2002) have proposed some new “deontic polygons” for dealing with conservative extensions of standard deontic logic internalising the concept of “supererogation”. Since 2004 a new formal science of the geometrical oppositions inside logic has appeared, that is, “n-opposition theory”, or “NOT”, which relies on the notion of “logical bi-simplex of dimension m” (m = n − 1). This theory received a complete mathematical foundation in 2008, and since then several extensions. In this paper, by using it, we show that in standard deontic logic there are in fact many more oppositional deontic figures than Kalinowski’s unique “hexagon of norms” (more of them, and geometrically more complex ones: “deontic squares”, “deontic hexagons”, “deontic cubes”, . . ., “deontic tetraicosahedra”, . . .): the real geometry of the oppositions between deontic modalities is composed of the aforementioned structures (squares, hexagons, cubes, . . ., tetraicosahedra and hyper-tetraicosahedra), whose complete mathematical closure happens in fact to be a “deontic 5-dimensional hyper-tetraicosahedron” (a very regular oppositional solid).
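To fix notation (a standard textbook presentation, not the paper’s own figures): writing Op for “p is obligatory”, Pp := ¬O¬p for “p is permitted” and Fp := O¬p for “p is forbidden”, the classical deontic square consists of

\text{contradictories: } (Op,\ P\neg p),\ (Fp,\ Pp); \qquad \text{contraries: } (Op,\ Fp); \qquad \text{subcontraries: } (Pp,\ P\neg p); \qquad \text{subalternations: } Op \to Pp,\ Fp \to P\neg p

A hexagon of the kind attributed to Kalinowski is then obtained, roughly, by adding U := Op \vee Fp (“deontically determined”) and Y := Pp \wedge P\neg p (“deontically indifferent”).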
I focus on a key argument for global external world scepticism resting on the underdetermination thesis: the argument according to which we cannot know any proposition about our physical environment because sense evidence for it equally justifies some sceptical alternative (e.g. the Cartesian demon conjecture). I contend that the underdetermination argument can go through only if the controversial thesis that conceivability is per se a source of evidence for metaphysical possibility is true. I also suggest a reason to doubt that conceivability is per se a source of evidence for metaphysical possibility, and thus to doubt the underdetermination argument.
Crispin Wright maintains that the architecture of perceptual justification is such that we can acquire justification for our perceptual beliefs only if we have antecedent justification for ruling out any sceptical alternative. Wright contends that this principle doesn’t elicit scepticism, for we are non-evidentially entitled to accept the negation of any sceptical alternative. Sebastiano Moruzzi has challenged Wright’s contention by arguing that since our non-evidential entitlements don’t remove the epistemic risk of our perceptual beliefs, they don’t actually enable us to acquire justification for these beliefs. In this paper I show that Wright’s responses to Moruzzi are ineffective and that Moruzzi’s argument is validated by probabilistic reasoning. I also suggest that Wright cannot answer Moruzzi’s challenge without weakening the support available for his conception of the architecture of perceptual justification.
This paper proves that it is possible to build a Lagrangian for quantum electrodynamics which makes it explicit that the photon mass is eventually set to zero in the physical part on observational grounds. Gauge independence is achieved upon considering the joint effect of the gauge-averaging term and ghost fields. It remains possible to obtain a counterterm Lagrangian where the only non-gauge-invariant term is proportional to the squared divergence of the potential, while the photon propagator in momentum space falls off like k^{-2} at large k, which indeed agrees with perturbative renormalizability. The resulting radiative corrections to the Coulomb potential in QED are also shown to be gauge-independent. The experience acquired with quantum electrodynamics is used to investigate properties and problems of the extension of such ideas to non-Abelian gauge theories.
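A schematic version of the kind of Lagrangian under discussion (my sketch, in a Lorenz-type gauge; the paper’s precise construction may differ) is

\mathcal{L} = -\tfrac{1}{4} F_{\mu\nu} F^{\mu\nu} + \tfrac{1}{2} m^2 A_\mu A^\mu - \tfrac{1}{2\alpha} (\partial_\mu A^\mu)^2 + \partial_\mu \bar{c}\, \partial^\mu c

where m is the photon mass eventually set to zero on observational grounds, \alpha is the gauge-averaging parameter, and c, \bar{c} are the ghost fields (which decouple in the Abelian theory). The k^{-2} fall-off of the resulting propagator at large k is what power counting requires for perturbative renormalizability.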
This paper criticizes phenomenal conservatism––the influential view according to which a subject S’s seeming that P provides S with defeasible justification for believing P. I argue that phenomenal conservatism, if true at all, has a significant limitation: seeming-based justification is elusive because S can easily lose it by just reflecting on her seemings and speculating about their causes––I call this the problem of reflective awareness. Because of this limitation, phenomenal conservatism doesn’t have all the epistemic merits attributed to it by its advocates. If true, phenomenal conservatism would constitute a unified theory of epistemic justification capable of giving everyday epistemic practices a rationale, but it wouldn’t afford us the means of an effective response to the sceptic. Furthermore, phenomenal conservatism couldn’t form the general basis for foundationalism.
Some Carrollian posthumous manuscripts reveal, in addition to his famous ‘logical diagrams’, two mysterious ‘logical charts’. The first chart, a strange network that arranges fourteen logical sentences into a large 2D ‘triangle’ containing three smaller ones, has been shown equivalent—modulo the rediscovery of a fourth smaller triangle implicit in Carroll's global picture—to a 3D tetrahedron, the four triangular faces of which are the 3+1 Carrollian complex triangles. As it happens, such an until now very mysterious 3D logical shape—slightly deformed—has been rediscovered, independently from Carroll and much later, by a logician, a mathematician and a linguist studying the geometry of the ‘opposition relations’, that is, the mathematical generalisations of the ‘logical square’. We show that inside what is called equivalently ‘n-opposition theory’, ‘oppositional geometry’ or ‘logical geometry’, Carroll's first chart corresponds exactly, duly reshaped…
Three confirmation principles discussed by Hempel are the Converse Consequence Condition, the Special Consequence Condition and the Entailment Condition. Le Morvan (1999) has argued that, when the choice among confirmation principles is restricted to these three, it is the Converse Consequence Condition that must be rejected. In this paper, I make this argument definitive by providing an indisputable proof that the simple conjunction of the Converse Consequence Condition and the Entailment Condition yields a disastrous consequence.
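The disastrous consequence can be derived along the following lines (a reconstruction; the paper’s proof may differ in detail). For arbitrary statements E and H:

E \models E \vee H, \text{ so by the Entailment Condition } E \text{ confirms } E \vee H

H \models E \vee H, \text{ so by the Converse Consequence Condition } E \text{ confirms } H

Since E and H are arbitrary, any statement confirms any statement, and the notion of confirmation is trivialized.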
Dummett has recently presented his most mature and sophisticated version of justificationism, i.e. the view that meaning and truth are to be analysed in terms of justifiability. In this paper, I argue that this conception does not resolve a difficulty that also affected Dummett’s earlier version of justificationism: the problem that large tracts of the past continuously vanish as their traces in the present dissipate. Since Dummett’s justificationism is essentially based on the assumption that the speaker has limited (i.e. non-idealized) cognitive powers, no further refinement of this position is likely to settle the problem of the vanishing past.
Brogaard and Salerno (2005, Nous 39: 123–139) have argued that antirealism resting on a counterfactual analysis of truth is flawed because it commits a conditional fallacy by entailing the absurdity that there is necessarily an epistemic agent. Brogaard and Salerno's argument relies on a formal proof built upon the criticism of two parallel proofs given by Plantinga (1982, Proceedings and Addresses of the American Philosophical Association 56: 47–70) and Rea (2000, Nous 34: 291–301). If this argument were conclusive, antirealism resting on a counterfactual analysis of truth should probably be abandoned. I argue, however, that the antirealist is not committed to the controversial reading of counterfactuals presupposed in Brogaard and Salerno's proof, and that the antirealist can in principle adopt an alternative reading that makes this proof invalid. My conclusion is that no reductio of antirealism resting on a counterfactual analysis of truth has yet been provided.
Hypothetico-deductivists have struggled to develop qualitative confirmation theories that do not raise the so-called tacking by disjunction paradox. In this paper, I analyze the difficulties yielded by the paradox and argue that the hypothetico-deductivist solutions given by Gemes (1998) and Kuipers (2000) are questionable because they do not fit this analysis. I then show that the paradox yields no difficulty for the Bayesian who appeals to the Total Evidence Condition. I finally argue that the same strategy is unavailable to the hypothetico-deductivist.
In this paper, I focus on the so-called "tacking by disjunction problem": the problem to the effect that, if a hypothesis H is confirmed by a statement E, H is also confirmed by the disjunction E ∨ F, for any statement F whatsoever. I show that the attempt to settle this difficulty made by Grimes (1990), in a paper apparently forgotten by today's methodologists, is irremediably faulty.
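On the hypothetico-deductive account the problem arises immediately, since entailment is monotone under weakening of the consequent:

H \models E \ \Rightarrow\ H \models E \vee F \ \Rightarrow\ E \vee F \text{ HD-confirms } H, \text{ for any } F

Bayesianly, if H \models E then P(H \mid E \vee F) = P(H)/P(E \vee F) > P(H) whenever P(E \vee F) < 1, so the disjunction genuinely (if weakly) confirms H; the Total Evidence Condition invoked in the preceding abstract then directs us to conditionalize on the logically strongest available evidence, E itself.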
[NOTE: I WROTE THIS PAPER BEFORE STARTING MY PhD. SO DON'T EXPECT TOO MUCH.] Laudan and Leplin have argued that empirically equivalent theories can elude underdetermination by resorting to indirect confirmation. Moreover, they have provided a qualitative account of indirect confirmation that Okasha has shown to be incoherent. In this paper, I develop Kukla's recent contention that indirect confirmation is grounded in the probability calculus. I provide a Bayesian rule to calculate the probability of a hypothesis given indirect evidence. I also suggest that the application of the rule presupposes the methodological relevance of non-empirical virtues of theories. If this is true, Laudan and Leplin's strategy will not work in many cases. Moreover, without an independent way of justifying the role of non-empirical virtues in methodology, the scientific realists cannot use indirect evidence to defeat underdetermination.
In this paper, we identify a new and mathematically well-defined sense in which the coherence of a set of hypotheses can be truth-conducive. Our focus is not, as usual, on the probability but on the confirmation of a coherent set and its members. We show that, if evidence confirms a hypothesis, confirmation is “transmitted” to any hypotheses that are sufficiently coherent with the former hypothesis, according to some appropriate probabilistic coherence measure such as Olsson’s or Fitelson’s measure. Our findings have implications for scientific methodology, as they provide a formal rationale for the method of indirect confirmation and the method of confirming theories by confirming their parts.
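Of the two measures named, Olsson’s is the simplest to state; for a pair of hypotheses it is

C_O(A, B) = \frac{P(A \wedge B)}{P(A \vee B)}

reaching its maximum value 1 when A and B are equivalent. The transmission claim then has, roughly, the form: if E confirms A and C_O(A, B) is sufficiently high, E confirms B as well.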
Minimal entities are, roughly, those that fall under notions defined by only deflationary principles. In this paper I provide an accurate characterization of two types of minimal entities: minimal properties and minimal facts. This characterization is inspired by both Schiffer's notion of a pleonastic entity and Horwich's notion of minimal truth. I argue that we are committed to the existence of minimal properties and minimal facts according to a deflationary notion of existence, and that the appeal to the inferential role reading of the quantifiers does not dismiss this commitment. I also argue that deflationary existence is language-dependent existence—this clarifies why minimalists about properties and facts are not realists about these entities though their language may appear indistinguishable from the language of realists.
In the consideration of numerous issues and across a wide variety of expositions, the use of expressions such as "philosophical" suggests that we must refer to special procedures, questions or demands. Rabossi proposes a way of characterizing the sense in which we use these expressions and, on that basis, concludes that philosophy as it has been practised for the last two hundred years claims to be a professional discipline but cannot be one, owing to the nature of the precepts that constitute it. In this article his arguments are examined, and it is maintained that, although they do not seem sufficient for the conclusion at which they aim, there are reasons to modify them in a way that leads to that very conclusion.
We focus on issues of learning assessment from the point of view of an investigation of philosophical elements in teaching. We contend that assessment of concept possession at school based on ordinary multiple-choice tests might be ineffective because it overlooks aspects of human rationality illuminated by Robert Brandom’s inferentialism––the view that conceptual content largely coincides with the inferential role of linguistic expressions used in public discourse. More particularly, we argue that multiple-choice tests at school might fail to accurately assess the possession of a concept or the lack of it, for they only check the written outputs of the pupils who take them, without detecting the inferences actually endorsed or used by them. We suggest that school tests would acquire reliability if they enabled pupils to make explicit the reasons for their answers or the inferences they use, so as to contribute to what Brandom calls the game of giving and asking for reasons. We explore the possibility of putting this suggestion into practice by deploying two-tier multiple-choice tests.
According to Wright's minimalism, a notion of truth neutral with respect to realism and antirealism can be built out of the notion of warranted assertibility and a set of a priori platitudes, among which the Equivalence Schema has a prominent role. Wright believes that the debate about realism and antirealism will be properly and fruitfully developed if both parties accept the conceptual framework of minimalism. In this paper, I show that this conceptual framework commits the minimalist to the realist thesis that there are mind-independent propositions, with the consequence that minimalism is not neutral between realism and antirealism. I suggest that Wright could avert this conclusion if he rejected the customary interpretation of the Equivalence Schema, according to which this Schema applies to propositions. This would however render minimalism unpalatable to philosophers who welcome the traditional reading of the Equivalence Schema and believe that propositions are bearers of truth.
Time as defined in the context of individual lives cannot be measured or compared; it therefore needs to be particularized through processes of synchronization and desynchronization. Subjectivity is a notion that supports temporal objectivity only if the mode of production is not based on a concept of exchange but on simple appropriation. Time as identified with the life of the individual remains incommensurable. But the history of growth in the spatial dimensions of trade and the reduction in the amount of time needed to effect commercial exchanges is integral to and consequent on the development of science as a method of forecasting and planning. As trade grows, so does the role of science, to the point where it can be seen as pivotal to a society in which the practice of trade is becoming both universal and frequent. The growth of trade was the cause and the effect of both a need to consolidate and develop an increasingly complex system of forecasting, and the requirement for a science with the capacity to make the future less unpredictable.