The principle of indifference (PI) states that, in the absence of any relevant evidence, a rational agent should distribute their credence equally among all the possible outcomes under consideration. Despite its intuitive plausibility, PI famously falls prey to paradox, and so is widely rejected as a principle of ideal rationality. In this article, I present a novel rehabilitation of PI in terms of the epistemology of comparative confidence judgments. In particular, I consider two natural comparative reformulations of PI and argue that while one of them prescribes the adoption of patently irrational epistemic states, the other provides a consistent formulation of PI that overcomes the most salient limitations of existing formulations.
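For reference, the quantitative content of PI, and the source of the paradoxes, can be stated in one line (a minimal sketch, assuming a finite partition of outcomes):

```latex
% PI for a finite partition: given n mutually exclusive, jointly
% exhaustive hypotheses H_1, ..., H_n and no relevant evidence,
\mathrm{Cr}(H_i) = \frac{1}{n} \qquad \text{for all } i \in \{1, \dots, n\}.
% The classic worry: repartitioning the same possibilities into
% m \neq n cells yields \mathrm{Cr} = 1/m, so the prescription
% varies with the choice of partition.
```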
According to the Bayesian paradigm in the psychology of reasoning, the norms by which everyday human cognition is best evaluated are probabilistic rather than logical in character. Recently, the Bayesian paradigm has been applied to the domain of argumentation, where the fundamental norms are traditionally assumed to be logical. Here, we present a major generalisation of extant Bayesian approaches to argumentation that (i) utilises a new class of Bayesian learning methods that are better suited to modelling dynamic and conditional inferences than standard Bayesian conditionalization, (ii) is able to characterise the special value of logically valid argument schemes in uncertain reasoning contexts, (iii) greatly extends the range of inferences and argumentative phenomena that can be adequately described in a Bayesian framework, and (iv) undermines some influential theoretical motivations for dual-function models of human cognition. We conclude that the probabilistic norms given by the Bayesian approach to rationality are not necessarily at odds with the norms given by classical logic. Rather, the Bayesian theory of argumentation can be seen as justifying and enriching the argumentative norms of classical logic.
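The contrast with standard conditionalization can be made concrete via the divergence-minimization picture of learning (given here as an illustrative gloss on the kind of learning method at issue, not the authors' exact model):

```latex
% Learning as divergence minimization: given a prior P and a learned
% constraint C (e.g. a conditional-probability constraint Q(B|A) = q),
% the posterior is the constraint-satisfying distribution closest to P:
P^{*} = \operatorname*{arg\,min}_{Q \,\models\, C} D_{\mathrm{KL}}(Q \,\|\, P),
\qquad
D_{\mathrm{KL}}(Q \,\|\, P) = \sum_{w} Q(w) \log \frac{Q(w)}{P(w)}.
% When C simply fixes the probability of an event E to 1, this
% recovers standard Bayesian conditionalization on E.
```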
Suppositions can be introduced in either the indicative or subjunctive mood. The introduction of either type of supposition initiates judgments that may be either qualitative (binary judgments about whether a given proposition is acceptable) or quantitative (numerical judgments about how acceptable it is). Accordingly, accounts of qualitative/quantitative judgment under indicative/subjunctive supposition have been developed in the literature. We explore these four different types of theories by systematically explicating the relationships between canonical representatives of each. Our representative qualitative accounts of indicative and subjunctive supposition are based on the belief change operations provided by AGM revision and KM update, respectively; our representative quantitative ones are offered by conditionalization and imaging. This choice is motivated by the familiar approach of understanding supposition as ‘provisional belief revision’, wherein one temporarily treats the supposition as true and forms judgments by making appropriate changes to one’s other opinions. To compare the numerical judgments recommended by the quantitative theories with the binary ones recommended by the qualitative accounts, we rely on a suitably adapted version of the Lockean thesis. Ultimately, we establish a number of new results that we interpret as vindicating the often-repeated claim that conditionalization is a probabilistic version of revision, while imaging is a probabilistic version of update.
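The bridge between the quantitative and qualitative accounts can be stated compactly (a sketch of the adapted Lockean thesis; the threshold t is a parameter, typically taken from (1/2, 1]):

```latex
% Lockean bridge for suppositional judgment:
\text{accept } \varphi \text{ under supposition } A
\iff P_{A}(\varphi) \ge t,
% where P_A is the suppositional distribution: P_A(.) = P(. | A)
% (conditionalization) in the indicative case, and the imaged
% distribution P^A in the subjunctive case.
```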
In this article, we address a major outstanding question of probabilistic Bayesian epistemology: how should a rational Bayesian agent update their beliefs upon learning an indicative conditional? A number of authors have recently contended that this question is fundamentally underdetermined by Bayesian norms, and hence that there is no single update procedure that rational agents are obliged to follow upon learning an indicative conditional. Here we resist this trend and argue that a core set of widely accepted Bayesian norms is sufficient to identify a normatively privileged updating procedure for this kind of learning. Along the way, we justify a privileged formalization of the notion of ‘epistemic conservativity’, offer a new analysis of the Judy Benjamin problem, and emphasize the distinction between interpreting the content of new evidence and updating one’s beliefs on the basis of that content.
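A toy version of the kind of updating at issue, for concreteness (a sketch under assumptions: a four-cell space for propositions A and C, the conditional read as the constraint Q(C|A) = 0.9, and KL-minimization as the conservativity measure; the flat prior and constraint value are hypothetical):

```python
import numpy as np
from scipy.optimize import minimize

# Cells of the space: (A&C, A&~C, ~A&C, ~A&~C), with a flat prior.
prior = np.array([0.25, 0.25, 0.25, 0.25])

def kl(q, p):
    """Kullback-Leibler divergence D(q || p)."""
    q = np.clip(q, 1e-12, 1.0)
    return float(np.sum(q * np.log(q / p)))

# The learned conditional 'if A then C', read as the constraint
# Q(C | A) = 0.9, i.e. q[0] = 0.9 * (q[0] + q[1]).
constraints = [
    {"type": "eq", "fun": lambda q: q.sum() - 1.0},              # normalization
    {"type": "eq", "fun": lambda q: q[0] - 0.9 * (q[0] + q[1])}, # Q(C|A) = 0.9
]
bounds = [(0.0, 1.0)] * 4

result = minimize(kl, x0=prior, args=(prior,),
                  bounds=bounds, constraints=constraints)
posterior = result.x
print("posterior:", posterior.round(4))
print("Q(C|A) =", round(posterior[0] / (posterior[0] + posterior[1]), 4))
```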
Kim’s causal exclusion argument purports to demonstrate that the non-reductive physicalist must treat mental properties (and macro-level properties in general) as causally inert. A number of authors have attempted to resist Kim’s conclusion by utilizing the conceptual resources of Woodward’s (2005) interventionist conception of causation. The viability of these responses has been challenged by Gebharter (2017a), who argues that the causal exclusion argument is vindicated by the theory of causal Bayesian networks (CBNs). Since the interventionist conception of causation relies crucially on CBNs for its foundations, Gebharter’s argument appears to cast significant doubt on interventionism’s antireductionist credentials. In the present article, we both (1) demonstrate that Gebharter’s CBN-theoretic formulation of the exclusion argument relies on some unmotivated and philosophically significant assumptions (especially regarding the relationship between CBNs and the metaphysics of causal relevance), and (2) use Bayesian networks to develop a general theory of causal inference for multi-level systems that can serve as the foundation for an antireductionist interventionist account of causation.
Schupbach and Sprenger introduce a novel probabilistic approach to measuring the explanatory power that a given explanans exerts over a corresponding explanandum. Though we are sympathetic to their general approach, we argue that it does not adequately capture the way in which the causal explanatory power that c exerts on e varies with background knowledge. We then amend their approach so that it does capture this variance. Though our account of explanatory power is less ambitious than Schupbach and Sprenger’s in the sense that it is limited to causal explanatory power, it is also more ambitious because we do not limit its domain to cases where c genuinely explains e. Instead, we claim that c causally explains e if and only if our account says that c explains e with some positive amount of causal explanatory power. Contents: 1 Introduction; 2 The Logic of Explanatory Power; 3 Subjective and Nomic Distributions; 3.1 Actual degrees of belief; 3.2 The causal distribution; 4 Background Knowledge; 4.1 Conditionalization and colliders; 4.2 A helpful intervention; 5 Causal Explanatory Power; 5.1 The applicability of explanatory power; 5.2 Statistical relevance ≠ causal explanatory power; 5.3 Interventionist explanatory power; 5.4 E illustrated; 6 Conclusion.
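For reference, the Schupbach–Sprenger measure under discussion:

```latex
% Schupbach and Sprenger's measure of the explanatory power of
% hypothesis h over explanandum e:
\mathcal{E}(e, h) = \frac{P(h \mid e) - P(h \mid \lnot e)}{P(h \mid e) + P(h \mid \lnot e)} ,
% which ranges over [-1, 1]: maximal when h entails e, minimal
% when h entails the negation of e.
```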
Topos quantum theory (TQT) is standardly portrayed as a kind of ‘neo-realist’ reformulation of quantum mechanics. In this article, I study the extent to which TQT can really be characterized as a realist formulation of the theory, and examine the question of whether the kind of realism that is provided by TQT satisfies the philosophical motivations that are usually associated with the search for a realist reformulation of quantum theory. Specifically, I show that the notion of the quantum state is problematic for those who view TQT as a realist reformulation of quantum theory. Contents: 1 Introduction; 2 Topos Quantum Theory; 2.1 Phase space; 2.2 Hilbert space; 2.3 Beyond Hilbert space; 2.4 Defining realism; 2.5 The spectral presheaf; 2.6 The logic of topos quantum theory; 3 Interpreting States in Topos Quantum Theory; 4 Interpreting Truth Values and Clopen Subobjects in Topos Quantum Theory; 4.1 Interpreting the truth values; 4.2 Interpreting Subcl; 5 Neo-realism; 5.1 The covariant approach; 6 Conclusion.
We provide a Bayesian justification of the idea that, under certain conditions, the absence of an argument in favour of the truth of a hypothesis H constitutes a good argument against the truth of H.
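The underlying Bayesian point can be stated in one line (an illustrative gloss on the kind of justification at issue, not the paper's specific conditions):

```latex
% If the argument (or evidence) A is more expected when H is true
% than when it is false, i.e.
P(A \mid H) > P(A \mid \lnot H),
% then, provided 0 < P(H) < 1 and 0 < P(A) < 1, its absence
% lowers the probability of H:
P(H \mid \lnot A) < P(H) < P(H \mid A).
```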
The logic of indicative conditionals remains the topic of deep and intractable philosophical disagreement. I show that two influential epistemic norms—the Lockean theory of belief and the Ramsey test for conditional belief—are jointly sufficient to ground a powerful new argument for a particular conception of the logic of indicative conditionals. Specifically, the argument demonstrates, contrary to the received historical narrative, that there is a real sense in which Stalnaker’s semantics for the indicative did succeed in capturing the logic of the Ramseyan indicative conditional.
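One standard way of formalizing the two norms named above (with threshold t a parameter of the Lockean theory):

```latex
% Ramsey test: credence in an indicative conditional goes by
% conditional probability.
\mathrm{Cr}(A \rightarrow C) = \mathrm{Cr}(C \mid A)
% Lockean theory: belief is credence at or above a threshold t.
\mathrm{Bel}(\varphi) \iff \mathrm{Cr}(\varphi) \ge t
% Jointly: believe 'if A then C' iff Cr(C | A) >= t.
```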
In recent years, a number of authors have defended the coherence and philosophical utility of the notion of metaphysical indeterminacy. Concurrently, the idea that reality can be stratified into more or less fundamental ‘levels’ has gained significant traction in the literature. Here, I examine the relationship between these two notions. Specifically, I consider the question of what metaphysical determinacy at one level of reality tells us about the possibility of metaphysical determinacy at other more or less fundamental levels. Towards this end, I propose a novel conception of the way in which fundamental states of affairs determine derivative states of affairs in the presence of indeterminacy and construct a corresponding formal model of multilevel systems that demonstrates the compatibility of determinacy at the fundamental level with indeterminacy at higher levels, thereby rebutting Barnes' suggestion that indeterminacy at any level of reality implies indeterminacy at the fundamental level.
We provide a novel Bayesian justification of inference to the best explanation (IBE). More specifically, we present conditions under which explanatory considerations can provide a significant confirmatory boost for hypotheses that provide the best explanation of the relevant evidence. Furthermore, we show that the proposed Bayesian model of IBE is able to deal naturally with the best-known criticisms of IBE, such as van Fraassen's ‘bad lot’ argument.
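One simple way in which explanatory goodness can translate into a confirmatory boost within Bayes' theorem (an illustrative gloss, not the specific conditions of the proposed model):

```latex
% If H_1 explains E better than H_2 in a sense reflected in the
% likelihoods, the better explanation receives the larger relative
% boost from learning E:
P(E \mid H_1) > P(E \mid H_2)
\;\Longrightarrow\;
\frac{P(H_1 \mid E)}{P(H_1)} > \frac{P(H_2 \mid E)}{P(H_2)},
% since P(H | E)/P(H) = P(E | H)/P(E) for each hypothesis.
```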
Quantum set theory (QST) and topos quantum theory (TQT) are two long-running projects in the mathematical foundations of quantum mechanics that share a great deal of conceptual and technical affinity. Most pertinently, both approaches attempt to resolve some of the conceptual difficulties surrounding quantum mechanics by reformulating parts of the theory inside of non-classical mathematical universes, albeit with very different internal logics. We call such mathematical universes, together with those mathematical and logical structures within them that are pertinent to the physical interpretation, ‘Q-worlds’. Here, we provide a unifying framework that allows us to better understand the relationship between different Q-worlds, and define a general method for transferring concepts and results between TQT and QST, thereby significantly increasing the expressive power of both approaches. Along the way, we develop a novel connection to paraconsistent logic and introduce a new class of structures that have significant implications for recent work on paraconsistent set theory.
Most agree that mental properties depend in some way on physical properties. While physicalists describe this dependence in terms of deterministic synchronic relations like identity or supervenience, some dualists prefer to think of it in terms of indeterministic dynamic relations, like causation. I’m going to develop a third conception of the dependence of the mental on the physical that falls somewhere between the deterministic synchronic dependence relations of the physicalist and the indeterministic diachronic dependence relations advocated by some dualists. I’ll then use this new conception of metaphysical dependence to formulate a novel approach to the mind-body problem that (i) posits a necessary, metaphysically robust synchronic dependence of the mental on the physical, (ii) satisfies several of the key motivations of both non-reductive physicalism and naturalistic dualism, (iii) is consistent with both the causal efficacy of the mental and the causal closure of the physical, and (iv) is capable of reconciling determinism about the physical world with indeterminism about the mental world.
As a metaphysical theory, radical ontic structural realism (ROSR) is characterised mainly in terms of the ontological primacy it places on relations and structures, as opposed to the individual relata and objects that inhabit these relations/structures. The most popular criticism of ROSR is that its central thesis is incoherent. Bain attempts to address this criticism by arguing that the mathematical language of category theory allows for a coherent articulation of ROSR’s key thesis. Subsequently, Wüthrich and Lam, and Lal and Teh, have criticised Bain’s arguments and claimed that category theory fares no better than set theory in coherently articulating the main ideas of ROSR. In this paper, we defend Bain’s main arguments against these critiques, and attempt to elaborate on the sense in which category theory can be seen as providing a coherent articulation of ROSR. We also consider the relationship between ROSR and Categorical Quantum Mechanics.
Topos quantum theory (TQT) represents a whole new approach to the formalization of non-relativistic quantum theory. It is well known that TQT replaces the orthomodular quantum logic of the traditional Hilbert space formalism with a new intuitionistic logic that arises naturally from the topos-theoretic structure of the theory. However, it is less well known that TQT also has a dual logical structure that is paraconsistent. In this paper, we investigate the relationship between these two logical structures and study the implications of this relationship for the definition of modal operators in TQT.
According to orthodoxy, there are two basic moods of supposition: indicative and subjunctive. The most popular formalizations of the corresponding norms of suppositional judgement are given by Bayesian conditionalization and Lewisian imaging, respectively. It is well known that Bayesian conditionalization can be generalized to provide a model for the norms of partial indicative supposition. This raises the question of whether imaging can likewise be generalized to model the norms of ‘partial subjunctive supposition’. The present article casts doubt on whether the most natural generalizations of imaging are able to provide a plausible account of the norms of partial subjunctive supposition.
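The indicative half of the picture, for reference: the generalization of conditionalization to partial supposition is Jeffrey conditionalization, and the question raised above is whether imaging admits an analogue:

```latex
% Jeffrey conditionalization: a partial supposition shifts the
% probabilities of a partition {E_1, ..., E_n} to new weights
% q_1, ..., q_n, and the posterior is
P^{*}(\cdot) = \sum_{i=1}^{n} q_i \, P(\cdot \mid E_i).
% Standard conditionalization is the special case where some q_i = 1.
```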
A consistent finding in research on conditional reasoning is that individuals are more likely to endorse the valid modus ponens (MP) inference than the equally valid modus tollens (MT) inference. This pattern holds for both abstract and probabilistic tasks. The existing explanation for this phenomenon within a Bayesian framework (e.g., Oaksford & Chater, 2008) accounts for this asymmetry by assuming separate probability distributions for MP and MT. We propose a novel explanation within a computational-level Bayesian account of reasoning according to which “argumentation is learning”. We show that the asymmetry must appear for certain prior probability distributions, under the assumption that the conditional inference provides the agent with new information that is integrated into the existing knowledge by minimizing the Kullback-Leibler divergence between the posterior and prior probability distribution. We also show under which conditions we would expect the opposite pattern, an MT-MP asymmetry.
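A concrete instance of the asymmetry under the probabilistic reading of the two inferences (the prior below is a hypothetical toy choice; the paper derives the pattern from KL-minimizing integration of the conditional, not from this direct computation):

```python
# Prior over the four cells generated by propositions A and C.
# Endorsement of MP tracks P(C | A); endorsement of MT tracks P(~A | ~C).
prior = {
    ("A", "C"): 0.40,    # P(A & C)
    ("A", "~C"): 0.10,   # P(A & ~C)
    ("~A", "C"): 0.30,   # P(~A & C)
    ("~A", "~C"): 0.20,  # P(~A & ~C)
}

p_A = prior[("A", "C")] + prior[("A", "~C")]        # P(A)  = 0.5
p_not_C = prior[("A", "~C")] + prior[("~A", "~C")]  # P(~C) = 0.3

mp = prior[("A", "C")] / p_A        # P(C | A)   = 0.800
mt = prior[("~A", "~C")] / p_not_C  # P(~A | ~C) ~ 0.667

print(f"MP endorsement P(C|A)   = {mp:.3f}")
print(f"MT endorsement P(~A|~C) = {mt:.3f}")  # MP > MT for this prior
```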
The problem of old evidence, first described by Glymour [1980], is still widely regarded as one of the most pressing foundational challenges to the Bayesian account of scientific reasoning. Many so...
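The standard one-line statement of the problem, for reference:

```latex
% If evidence E is already known, so that P(E) = 1 (and hence
% P(E | H) = 1), Bayes' theorem gives
P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)} = P(H),
% so old evidence can never confirm any hypothesis -- contrary to
% scientific practice (Glymour's example: the known perihelion
% advance of Mercury confirming general relativity).
```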
The way in which philosophers have thought about the scientific method and the nature of good scientific reasoning over the last few centuries has been consistently and heavily influenced by the examples set by physics. The astounding achievements of 19th and 20th century physics demonstrated that physicists had successfully identified methodologies and reasoning patterns that were uniquely well suited to discovering fundamental truths about the natural world. Inspired by this success, generations of philosophers set themselves the goal of taxonomising, codifying, formalising and evaluating these reasoning patterns. Many will concede that this has been a tremendously fruitful exercise that has served both to illuminate characteristic methodological and epistemological features of the physical sciences, and to inform the way that philosophers think about the epistemic ideals served by science more generally. However, as has been widely noted, the great challenges confronted by contemporary physics have led to a number of fundamental shifts in the way that physicists formulate, assess and apply their theories of the physical world. The most prominent examples of this trend occur in the realm of theoretical high energy physics, where many of the most influential theories advocated by physicists lie beyond the reach of extant experimental methods, and are therefore extremely difficult to test empirically. The fact that whole communities of physicists have devoted so much time and effort to evaluating theories that are largely disconnected from experiments and empirical testing suggests that existing philosophical accounts of the epistemology of physics, based as they are on a broadly empiricist conception of physics, are no longer completely apt, or are at least somewhat out of date. This in turn suggests that it is time for philosophers to redirect their attention towards the characteristic reasoning strategies at play in contemporary physics. As well as clarifying the epistemic structure of 21st century physics and providing new stimulation for general debates surrounding the epistemology of science, this will also allow more philosophers to confront and engage with the existential methodological debates currently raging within physics. There is currently intense and widespread disagreement surrounding what kinds of reasoning strategies can legitimately be employed in the assessment of competing physical theories and research programs. This special issue aims to facilitate the further engagement of philosophers in this crucial debate by collecting eight contributions that both highlight the deep philosophical issues at stake in debates surrounding the proper methodology for theory assessment in physics, and present novel philosophical perspectives on specific reasoning strategies in the physical sciences.
Does y obtain under the counterfactual supposition that x? The answer to this question is famously thought to depend on whether y obtains in the most similar world in which x obtains. What this notion of ‘similarity’ consists in is controversial, but in recent years, graphical causal models have proved incredibly useful in getting a handle on considerations of similarity between worlds. One limitation of the resulting conception of similarity is that it says nothing about what would obtain were the causal structure to be different from what it actually is, or from what we believe it to be. In this paper, we explore the possibility of using graphical causal models to resolve counterfactual queries about causal structure by introducing a notion of similarity between causal graphs. Since there are multiple principled senses in which a graph G* can be more similar to a graph G than a graph G**, we introduce multiple similarity metrics, as well as multiple ways to prioritize the various metrics when settling counterfactual queries about causal structure.
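One natural metric of the kind at issue, sketched for concreteness (the symmetric difference of edge sets; this particular metric is an illustrative assumption, not necessarily among those the paper proposes):

```python
# Causal graphs represented as sets of directed edges (parent, child).
def edge_distance(g1: set, g2: set) -> int:
    """Size of the symmetric difference of two edge sets: the number of
    single-edge additions/removals needed to turn g1 into g2."""
    return len(g1 ^ g2)

# Toy example: three candidate structures over variables X, Y, Z.
G       = {("X", "Y"), ("Y", "Z")}              # chain X -> Y -> Z
G_star  = {("X", "Y"), ("Y", "Z"), ("X", "Z")}  # adds one direct edge
G_star2 = {("Z", "Y"), ("Y", "X")}              # fully reversed chain

print(edge_distance(G, G_star))   # 1: G* is closer to G ...
print(edge_distance(G, G_star2))  # 4: ... than G** is
```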
The technique of imaging was first introduced by Lewis in order to provide a novel account of the probability of conditional propositions. In the intervening years, imaging has been the object of significant interest in both AI and philosophy, and has come to be seen as a philosophically important approach to probabilistic updating and belief revision. In this paper, we consider the possibility of generalising imaging to deal with uncertain evidence and partial belief revision. In particular, we introduce a new logical criterion that any update rule should satisfy, and use it to evaluate a range of different approaches to generalising imaging to situations involving uncertain evidence. We show that none of the currently prevalent approaches to imaging allow for such a generalisation, although a lesser-known version of imaging, introduced by Joyce, can be generalised in a way that mitigates these problems.
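A minimal sketch of Lewis-style imaging on a finite set of worlds (the world space and selection function below are hypothetical choices for illustration):

```python
# Worlds are labelled 0..3; proposition A holds at the worlds in
# A_worlds. Imaging on A moves each world's probability mass to its
# closest A-world, as given by the selection function sigma.
P = {0: 0.4, 1: 0.3, 2: 0.2, 3: 0.1}
A_worlds = {0, 2}
sigma = {0: 0, 1: 0, 2: 2, 3: 2}  # closest A-world to each world

def image(P, sigma):
    """Lewis imaging: P^A(w) = sum of P(w') over w' with sigma(w') = w."""
    imaged = {w: 0.0 for w in P}
    for w_prime, mass in P.items():
        imaged[sigma[w_prime]] += mass
    return imaged

print(image(P, sigma))  # {0: 0.7, 1: 0.0, 2: 0.3, 3: 0.0}
```

Note that all of the mass lands on A-worlds, as required; extending this operation to uncertain evidence is the generalisation at issue in the paper.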
In recent years, anthropic reasoning has been used to justify a number of controversial skeptical hypotheses. In this paper, we consider two prominent examples, viz. Bostrom’s ‘Simulation Argument’ and the problem of ‘Boltzmann Brains’ in big bang cosmology. We argue that these cases call into question the assumption, central to Bayesian confirmation theory, that the relation of evidential confirmation is universally symmetric. We go on to argue that the fact that these arguments appear to contradict this fundamental assumption should not be taken as an immediate refutation, but should rather be seen as indicative of the peculiar role that the relevant hypotheses play in their respective epistemic frameworks.
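The symmetry assumption in question, stated formally (on its standard reading; the paper may target a subtler formulation):

```latex
% Confirmation is mutual: assuming P(H), P(E) > 0,
P(H \mid E) > P(H)
\iff P(H \wedge E) > P(H)\,P(E)
\iff P(E \mid H) > P(E),
% i.e. E confirms H if and only if H confirms E.
```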