Whereas geometrical oppositions (logical squares and hexagons) have so far been investigated in many fields of modal logic (both abstract and applied), the oppositional geometrical side of “deontic logic” (the logic of “obligatory”, “forbidden”, “permitted”, . . .) has rather been neglected. Besides the classical “deontic square” (the deontic counterpart of Aristotle’s “logical square”), some interesting attempts have nevertheless been made to deepen the geometrical investigation of the deontic oppositions: Kalinowski (La logique des normes, PUF, Paris, 1972) has proposed a “deontic hexagon” as the geometrical representation of standard deontic logic, whereas Joerden (jointly with Hruschka, in Archiv für Rechts- und Sozialphilosophie 73:1, 1987), McNamara (Mind 105:419, 1996) and Wessels (Die gute Samariterin. Zur Struktur der Supererogation, Walter de Gruyter, Berlin, 2002) have proposed new “deontic polygons” for dealing with conservative extensions of standard deontic logic internalising the concept of “supererogation”. Since 2004 a new formal science of the geometrical oppositions inside logic has appeared, namely “n-opposition theory”, or “NOT”, which relies on the notion of “logical bi-simplex of dimension m” (m = n − 1). This theory received a complete mathematical foundation in 2008 and has since been extended in several directions. In this paper, by using it, we show that standard deontic logic in fact contains many more oppositional deontic figures than Kalinowski’s unique “hexagon of norms” (more of them, and geometrically more complex ones: “deontic squares”, “deontic hexagons”, “deontic cubes”, . . ., “deontic tetraicosahedra”, . . .): the real geometry of the oppositions between deontic modalities is composed of the aforementioned structures (squares, hexagons, cubes, . . ., tetraicosahedra and hyper-tetraicosahedra), whose complete mathematical closure happens in fact to be a “deontic 5-dimensional hyper-tetraicosahedron” (a very regular oppositional solid).
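The relations of the classical deontic square can be checked mechanically. Here is a minimal sketch (ours, not from the paper): a proposition p takes exactly one of three deontic statuses, and the square's four relations among O (obligatory), F (forbidden), P (permitted) and P¬ (not-p permitted) fall out by enumeration.

```python
# A proposition p has exactly one of three deontic statuses in
# standard deontic logic: obligatory, forbidden, or optional.
STATUSES = ["obligatory", "forbidden", "optional"]

def O(s):      # "p is obligatory"
    return s == "obligatory"

def F(s):      # "p is forbidden" (i.e. obligatory that not-p)
    return s == "forbidden"

def P(s):      # "p is permitted" (i.e. not forbidden)
    return not F(s)

def P_not(s):  # "not-p is permitted" (i.e. p is not obligatory)
    return not O(s)

# The four classical relations of the deontic square, by enumeration:
contraries      = all(not (O(s) and F(s)) for s in STATUSES)      # O, F never both true
subcontraries   = all(P(s) or P_not(s) for s in STATUSES)         # P, P_not never both false
contradictories = all(O(s) != P_not(s) and F(s) != P(s) for s in STATUSES)
subalternation  = all((not O(s) or P(s)) and (not F(s) or P_not(s)) for s in STATUSES)

print(contraries, subcontraries, contradictories, subalternation)  # True True True True
```

Kalinowski's hexagon adds two further vertices to this square (roughly, "normed", O∨F, and "optional", P∧P¬), which could be checked the same way.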
Recent works in epistemology show that the claim that coherence is truth conducive – in the sense that, given suitable ceteris paribus conditions, more coherent sets of statements are always more probable – is dubious and possibly false. From this, it does not follow that coherence is a useless notion in epistemology and philosophy of science. Dietrich and Moretti (Philosophy of Science 72(3): 403–424, 2005) have proposed a formal account of how coherence is confirmation conducive—that is, of how the coherence of a set of statements facilitates the confirmation of such statements. This account is grounded in two confirmation transmission properties that are satisfied by some of the measures of coherence recently proposed in the literature. These properties explicate everyday and scientific uses of coherence. In this paper, I review the main findings of Dietrich and Moretti (2005) and define two evidence-gathering properties that are satisfied by the same measures of coherence and constitute further ways in which coherence is confirmation conducive. At least one of these properties vindicates important applications of the notion of coherence in everyday life and in science.
According to Jim Pryor’s dogmatism, when you have an experience with content p, you have prima facie justification to believe p that does not rest on your independent justification or evidence to believe any proposition. Although dogmatism is intuitive and seems to have an antisceptical punch, it has been targeted by different objections. In this paper I aim to answer the objections by Roger White according to which dogmatism is incoherent with the Bayesian account of how evidence affects rational credences. If this were true, the rational acceptability of dogmatism would be seriously questionable. I respond that these objections don’t get off the ground because they assume that experiences and reports of experience have the same evidential force, whereas the dogmatist is uncommitted to this assumption. I also elucidate what gives dogmatism its antisceptical punch by drawing from recent papers by Brian Weatherson, Peter Kung and Pryor himself in which alternative responses to White’s challenge are delineated. I argue that my rejoinder is more complete and simpler than these responses, for the latter permit White’s objections to go through in many cases, whereas my response doesn’t. Furthermore, according to these responses, dogmatism is tenable only if Bayesianism is replaced with alternative formal frameworks, which is not a requirement of my rejoinder.
I focus on a key argument for global external world scepticism resting on the underdetermination thesis: the argument according to which we cannot know any proposition about our physical environment because sense evidence for it equally justifies some sceptical alternative (e.g. the Cartesian demon conjecture). I contend that the underdetermination argument can go through only if the controversial thesis that conceivability is per se a source of evidence for metaphysical possibility is true. I also suggest a reason to doubt that conceivability is per se a source of evidence for metaphysical possibility, and thus to doubt the underdetermination argument.
Beall and Restall (2000; 2001; 2006) advocate a comprehensive pluralist approach to logic, which they call Logical Pluralism, according to which there is not one true logic but many equally acceptable logical systems. They maintain that Logical Pluralism is compatible with monism about metaphysical modality, according to which there is just one correct logic of metaphysical modality. Wyatt (2004) contends that Logical Pluralism is incompatible with monism about metaphysical modality. We first suggest that if Wyatt were right, Logical Pluralism would be strongly implausible, because it would invert a dependence relation that holds between the metaphysics and the logic of modality. We then argue that Logical Pluralism is prima facie compatible with monism about metaphysical modality.
Crispin Wright has given an explanation of how a first-time warrant can fall short of transmitting across a known entailment. Formal epistemologists have struggled to turn Wright’s informal explanation into cogent Bayesian reasoning. In this paper, I analyse two Bayesian models of Wright’s account respectively proposed by Samir Okasha and Jake Chandler. I argue that both formalizations are unsatisfactory for different reasons, and I lay down a third Bayesian model that appears to me to capture the valid kernel of Wright’s explanation. After this, I consider a recent development in Wright’s account of transmission failure. Wright suggests that his condition sufficient for transmission failure of first-time warrant also suffices for transmission failure of supplementary warrant. I propose an interpretation of Wright’s suggestion that shields it from objections. I then lay down a fourth Bayesian framework that provides a simplified model of the unified explanation of transmission failure envisaged by Wright.
In this paper we focus on transmission and failure of transmission of warrant. We identify three individually necessary and jointly sufficient conditions for transmission of warrant, and we show that their satisfaction grounds a number of interesting epistemic phenomena that have not been sufficiently appreciated in the literature. We then scrutinise Wright’s analysis of transmission failure and improve on extant readings of it. Nonetheless, we present a Bayesian counterexample that shows that Wright’s analysis is partially incoherent with our analysis of warrant transmission and prima facie defective. We conclude by exploring three alternative lines of reply: developing a more satisfactory account of transmission failure, which we outline; dismissing the Bayesian counterexample by rejecting some of its assumptions; reinterpreting Wright’s analysis to make it immune to the counterexample.
According to Wright, Moore’s contentious “proof of the existence of a material world” is not cogent because no warrant can transmit from its premise to its conclusion. Since Bayesian confirmation theory probably affords the best account of inductive reasoning we have today, if Wright’s analysis of Moore’s “proof” could be translated into Bayesian language, it would probably be preferable to rival analyses that cannot be reformulated in the same way. Okasha has recently proposed a Bayesian model that apparently vindicates Wright’s analysis on the whole. In this paper I first argue that Okasha’s Bayesian vindication is in different respects flawed and thus unacceptable. I then propose a more suitable Bayesian framework, resting on the so-called Lockean Thesis, which does vindicate Wright’s analysis. My investigation sheds new light on the logical features proper to the warrant that Wright deems not to transmit across entailment, on the constituents of the logical “mechanism” that according to Wright engenders failure of transmission, and on the fine structure of the rational architecture of perceptual warrant outlined by Wright.
The expression conditional fallacy identifies a family of arguments deemed to entail odd and false consequences for notions defined in terms of counterfactuals. The antirealist notion of truth is typically defined in terms of what a rational enquirer or a community of rational enquirers would believe if they were suitably informed. This notion is deemed to entail, via the conditional fallacy, odd and false propositions, for example that the Peircean end of inquiry has been reached or that there is necessarily a rational enquirer. If these consequences followed from the antirealist notion of truth, alethic antirealism should probably be rejected. In this paper we analyse the conditional fallacy from a semantic (i.e. model-theoretic) point of view. This allows us to identify with precision the philosophical commitments that ground the validity of this type of argument. We show that the conditional fallacy arguments against alethic antirealism are valid only if controversial metaphysical assumptions are accepted. We suggest that the antirealist is not committed to the conditional fallacy because she is not committed to some of these assumptions.
Dummett has recently presented his most mature and sophisticated version of justificationism, i.e. the view that meaning and truth are to be analysed in terms of justifiability. In this paper, I argue that this conception does not resolve a difficulty that also affected Dummett’s earlier version of justificationism: the problem that large tracts of the past continuously vanish as their traces in the present dissipate. Since Dummett’s justificationism is essentially based on the assumption that the speaker has limited (i.e. non-idealized) cognitive powers, no further refinement of this position is likely to settle the problem of the vanishing past.
The general tendency or attitude that Dreier 2004 calls creeping minimalism is ramping up in contemporary analytic philosophy. Those who entertain this attitude will take for granted a framework of deflationary or minimal notions – principally semantical and ontological – by means of which to analyse problems in different philosophical fields – e.g. theory of truth, metaethics, philosophy of language, the debate on realism and antirealism, etc. Let us call sweeping minimalist the philosopher affected by creeping minimalism. The framework of minimal notions that the sweeping minimalist takes for granted encompasses, for instance, the concepts of truth, reference, proposition, fact, individual, and property. Minimal notions are characterized in terms of general platitudinous principles expressed by schemata like the following (cf.: 26): ‘S’ is true if and only if S; ‘S’ is true if and only if ‘S’ corresponds to the facts; a has the property of being P if and only if a is P; where ‘S’ and ‘a is P’ stand for sentences satisfying superficial constraints of truth-aptitude (i.e. sentences in declarative form subject to communally acknowledged standards of proper use), and…
Brogaard and Salerno (2005, Noûs, 39, 123–139) have argued that antirealism resting on a counterfactual analysis of truth is flawed because it commits a conditional fallacy by entailing the absurdity that there is necessarily an epistemic agent. Brogaard and Salerno’s argument relies on a formal proof built upon the criticism of two parallel proofs given by Plantinga (1982, Proceedings and Addresses of the American Philosophical Association, 56, 47–70) and Rea (2000, Noûs, 34, 291–301). If this argument were conclusive, antirealism resting on a counterfactual analysis of truth should probably be abandoned. I argue however that the antirealist is not committed to a controversial reading of counterfactuals presupposed in Brogaard and Salerno’s proof, and that the antirealist can in principle adopt an alternative reading that makes this proof invalid. My conclusion is that no reductio of antirealism resting on a counterfactual analysis of truth has yet been provided.
According to Wright’s minimalism, a notion of truth neutral with respect to realism and antirealism can be built out of the notion of warranted assertibility and a set of a priori platitudes among which the Equivalence Schema has a prominent role. Wright believes that the debate about realism and antirealism will be properly and fruitfully developed if both parties accept the conceptual framework of minimalism. In this paper, I show that this conceptual framework commits the minimalist to the realist thesis that there are mind-independent propositions; with the consequence that minimalism is not neutral between realism and antirealism. I suggest that Wright could avert this conclusion if he rejected the customary interpretation of the Equivalence Schema according to which this Schema applies to propositions. This would however render minimalism unpalatable to philosophers who welcome the traditional reading of the Equivalence Schema and believe that propositions are bearers of truth.
Minimal entities are, roughly, those that fall under notions defined by only deflationary principles. In this paper I provide an accurate characterization of two types of minimal entities: minimal properties and minimal facts. This characterization is inspired by both Schiffer's notion of a pleonastic entity and Horwich's notion of minimal truth. I argue that we are committed to the existence of minimal properties and minimal facts according to a deflationary notion of existence, and that the appeal to the inferential role reading of the quantifiers does not dismiss this commitment. I also argue that deflationary existence is language-dependent existence—this clarifies why minimalists about properties and facts are not realists about these entities though their language may appear indistinguishable from the language of realists.
Coherentism in epistemology has long suffered from lack of formal and quantitative explication of the notion of coherence. One might hope that probabilistic accounts of coherence such as those proposed by Lewis, Shogenji, Olsson, Fitelson, and Bovens and Hartmann will finally help solve this problem. This paper shows, however, that those accounts have a serious common problem: the problem of belief individuation. The coherence degree that each of the accounts assigns to an information set (or the verdict it gives as to whether the set is coherent tout court) depends on how beliefs (or propositions) that represent the set are individuated. Indeed, logically equivalent belief sets that represent the same information set can be given drastically different degrees of coherence. This feature clashes with our natural and reasonable expectation that the coherence degree of a belief set does not change unless the believer adds essentially new information to the set or drops old information from it; or, to put it simply, that the believer cannot raise or lower the degree of coherence by purely logical reasoning. None of the accounts in question can adequately deal with coherence once logical inferences get into the picture. Toward the end of the paper, another notion of coherence that takes into account not only the contents but also the origins (or sources) of the relevant beliefs is considered. It is argued that this notion of coherence is of dubious significance, and that it does not help solve the problem of belief individuation.
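The individuation problem is easy to exhibit with Shogenji's ratio measure on a toy probability space (the four-world model and the numbers below are ours, purely illustrative): the pair {A, B} and the logically equivalent pair {A∧B, A∨B} carry the same total information, since both conjoin to A∧B, yet receive different coherence degrees.

```python
from math import prod

# Four equiprobable worlds; propositions are sets of worlds.
worlds = {1, 2, 3, 4}
p = lambda event: len(event) / len(worlds)

A = {1, 2}
B = {2, 3}

def shogenji(*events):
    """Shogenji's coherence measure: P(joint) / product of marginals."""
    joint = set.intersection(*events)
    return p(joint) / prod(p(e) for e in events)

# Same total information (both sets conjoin to A-and-B), different degrees:
c1 = shogenji(A, B)          # 0.25 / (0.5  * 0.5)  = 1.0
c2 = shogenji(A & B, A | B)  # 0.25 / (0.25 * 0.75) ≈ 1.33

print(c1, c2)
```

So purely logical repackaging of the same information changes the measured coherence, which is the clash with expectation the abstract describes.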
Hypothetico-deductivists have struggled to develop qualitative confirmation theories that do not raise the so-called tacking-by-disjunction paradox. In this paper, I analyze the difficulties yielded by the paradox and I argue that the hypothetico-deductivist solutions given by Gemes (1998) and Kuipers (2000) are questionable because they do not fit this analysis. I then show that the paradox yields no difficulty for the Bayesian who appeals to the Total Evidence Condition. I finally argue that the same strategy is unavailable to the hypothetico-deductivist.
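A toy Bayesian model (the distribution below is ours, chosen only for illustration) shows why tacking by disjunction has no grip on probabilistic confirmation: E can raise the probability of H while the weakened statement E ∨ F lowers it.

```python
# Worlds are (H, E, F) truth-value triples with stipulated probabilities,
# chosen so that E and F never co-occur and F tells against H.
P = {
    (1, 1, 0): 0.40, (1, 0, 1): 0.00, (1, 0, 0): 0.10,
    (0, 1, 0): 0.10, (0, 0, 1): 0.35, (0, 0, 0): 0.05,
}

def prob(pred):
    return sum(q for w, q in P.items() if pred(w))

def cond(pred, given):
    return prob(lambda w: pred(w) and given(w)) / prob(given)

H = lambda w: w[0] == 1
E = lambda w: w[1] == 1
EorF = lambda w: w[1] == 1 or w[2] == 1

print(prob(H))        # 0.5   prior
print(cond(H, E))     # 0.8   E confirms H
print(cond(H, EorF))  # ~0.47 E v F disconfirms H
```

For the hypothetico-deductivist, by contrast, H ⊨ E immediately gives H ⊨ E ∨ F, so the weakened statement counts as confirming by definition.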
In this paper, I show that Lewis' definition of coherence and Fitelson's and Shogenji's measures of coherence are unacceptable because they entail the absurdity that any set of beliefs in general is coherent and not coherent at the same time. This devastating result is obtained if a simple and plausible principle of stability for coherence is accepted.
In this paper, we identify a new and mathematically well-defined sense in which the coherence of a set of hypotheses can be truth-conducive. Our focus is not, as is usual, on the probability but on the confirmation of a coherent set and its members. We show that, if evidence confirms a hypothesis, confirmation is "transmitted" to any hypotheses that are sufficiently coherent with the former hypothesis, according to some appropriate probabilistic coherence measure such as Olsson’s or Fitelson’s measure. Our findings have implications for scientific methodology, as they provide a formal rationale for the method of indirect confirmation and the method of confirming theories by confirming their parts.
Three confirmation principles discussed by Hempel are the Converse Consequence Condition, the Special Consequence Condition and the Entailment Condition. Le Morvan (1999) has argued that, when the choice among confirmation principles is restricted to these three, it is the Converse Consequence Condition that must be rejected. In this paper, I make this argument definitive. In doing so, I provide an indisputable proof that the simple conjunction of the Converse Consequence Condition and the Entailment Condition yields a disastrous consequence.
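The disastrous consequence is easy to reconstruct (this is a sketch in the spirit of the abstract, not necessarily the paper's own proof):

```latex
\begin{align*}
&\text{(EC)}\quad E \vDash H \;\Rightarrow\; E \text{ confirms } H\\
&\text{(CCC)}\quad E \text{ confirms } H \text{ and } H' \vDash H \;\Rightarrow\; E \text{ confirms } H'\\[4pt]
&\text{For arbitrary statements } E \text{ and } H:\\
&1.\;\; E \vDash E \lor H, \text{ so } E \text{ confirms } E \lor H &&\text{(EC)}\\
&2.\;\; H \vDash E \lor H, \text{ so } E \text{ confirms } H &&\text{(CCC, from 1)}
\end{align*}
```

Hence any statement confirms any statement, which trivialises the notion of confirmation.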
In this paper, I focus on the so-called "tacking by disjunction problem": the problem that, if a hypothesis H is confirmed by a statement E, then H is also confirmed by the disjunction E v F, for any statement F. I show that the attempt to settle this difficulty made by Grimes (1990), in a paper apparently forgotten by today's methodologists, is irremediably faulty.
Published in Darren Tofts, Annemarie Jonson, and Alessio Cavallaro (eds), _Prefiguring Cyberculture: an intellectual history_ (MIT Press and Power Publications, December 2002).
Pragmatist reinterpretations of both deliberative-communicative theory and legal positivism point out the mentalist fallacy entailed by these prevalent models. I argue that pragmatist approaches imply analogous erroneous beliefs, since they presuppose as given the shared perception of social contexts. They therefore take for granted the shared interpretation of social problems and the shared selection of common goals. Hence I argue for the necessity of inquiring into the conditions of possibility for a shared perception of social contexts. This would entail the organization of institutional incentives meant to extend the scope and inclusiveness of the immediate perception of social context expressed by different agents.
Note: This is not an ad hoc change at all. It’s simply the natural thing to say here – if one thinks of F as a generalization of classical logical entailment. The extra complexity I had in my original (incorrect) definition of F was there because I was foolishly trying to encode some non-classical, or “relevant”, logical structure in F. I now think this is a mistake, and that I should go with the above, classical account of F. Arguments about relevance logic need to be handled in a different way (and a different context!). And, besides, as Luca Moretti has shown (see below), the original definition of F cannot be the right basis for C! OK, now on to C.
This paper investigates the varieties of reductionism and realism about causal relations in macroeconometrics. There are two issues, which are kept distinct in the analysis but which are interrelated in the development of econometrics. The first is the question of the reducibility of causal relations to regularities, measured in statistics by correlations. The second is the question of the reducibility of causes among macroeconomic aggregates to microeconomic behaviour. It is argued that there is a continuum of possible positions between realism and reductionism for both questions, but, as far as the second question is concerned, the dominant position of mainstream macroeconometrics is strongly reductionist. The paper defends an integrative approach that emphasizes the gradual nature of many real-world cases.
We investigate an extension of the formalism of interpreted systems by Halpern and colleagues to model the correct behaviour of agents. The semantic model allows for the representation of, and reasoning about, states of correct and incorrect functioning behaviour of the agents, and of the system as a whole. We axiomatise this semantic class by mapping it into a suitable class of Kripke models. The resulting logic, KD45_n^{i-j}, is a stronger version of KD, the system often referred to as Standard Deontic Logic. We extend this formal framework to include the standard epistemic notions defined on interpreted systems, and introduce a new doubly-indexed operator representing the knowledge that an agent would have if it operated under the assumption that a group of agents is functioning correctly. We discuss these issues both theoretically and in terms of applications, and present further directions of work.
Two persons have been proposed as the author of the Summa Lamberti, a thirteenth-century treatise on logic. Franco Alessio takes him to be the Auxerre Dominican Lambert of Ligny-le-Châtel, and he bases his claim on Dominican sources from the fourteenth to the nineteenth centuries. Recently, Alain de Libera has presented a counter-proposal: the author was Lambert of Lagny, a secular cleric at the time of the composition, who afterwards became a Dominican. This claim is based on the acta of the counts of Champagne and a document of Pope Urban IV. I conclude that, given the present evidence, de Libera’s case rests on more historically sound data, but that to arrive at this conclusion one must impeach the Dominican sources (not done by de Libera) and take into consideration additional data from the research of Michèle Mulchahey on the introduction of logic into the Dominican curriculum.
Questions about truth and questions about reality are intimately connected. One can ask whether reality includes numbers by asking ‘Are there numbers?’ But one can also ask what (arguably) amounts to the very same question by asking ‘Is the sentence “There are numbers” true?’ Such ‘semantic ascent’ makes it seem that the nature of reality can be investigated by investigating our true sentences. This line of thought was very much taken for granted in twentieth century philosophy, but it is now beginning to be called into question. Just how much can we learn about the nature of reality by investigating our true sentences? Does, for example, the truth of ‘There is a prime number between ten and twenty’ mean that prime numbers exist? Does the truth of ‘Eating people is wrong’ mean that moral properties exist? Does the truth of ‘Spiders give me the creeps’ mean that the creeps exists? In From Truth to Reality, Heather Dyke brings together some of the foremost metaphysicians to examine approaches to truth, reality, and the connections between the two. This collection features new and previously unpublished material by JC Beall, Mark Colyvan, Michael Devitt, John Heil, Frank Jackson, Fred Kroon, D. H. Mellor, Luca Moretti, Alan Musgrave, Robert Nola, J. J. C. Smart, Paul Snowdon, and Daniel Stoljar.
La Paradoja de Orayen is two things in one. First, it is a homage to the Argentine philosopher Raúl Orayen (1942–2003). Few Hispano-American philosophers have enjoyed Orayen’s intellectual solidity and philosophical acuity, and few have been so beloved. It is, then, a well-deserved homage, for which those of us who had the good fortune to interact with Raúl and learn from him are deeply grateful. Second, the book is a contribution to Hispano-American philosophy. Alberto Moretti and Guillermo Hurtado had the good judgment to recognize the value of a project that Orayen’s premature death left unfinished, and to appreciate its potential for generating philosophical discussion of a high level. The result is a volume that will reward its readers’ attention and give Orayen’s work its just prominence in the Hispano-American world.