Sider (Four-dimensionalism 2001; Philos Stud 114:135–146, 2003; Nous 43:557–567, 2009) has developed an influential argument against indeterminacy in existence. In what follows, I argue that the defender of metaphysical forms of indeterminate existence has a unique way of responding to Sider’s argument. The response I’ll offer is interesting not only for its applicability to Sider’s argument, but also for its broader implications; responding to Sider helps to show both how we should think about precisification in the context of metaphysical indeterminacy and how we should understand commitment to metaphysically indeterminate existence. And if I’m right that metaphysical indeterminacy can allow for indeterminate existence in a way that semantic indeterminacy can’t, indeterminate existence might actually give us a reason to accept metaphysical indeterminacy (rather than a reason to reject it, as is commonly assumed).
In this paper, I argue for a new way of characterizing ontological emergence. I appeal to recent discussions in meta-ontology regarding fundamentality and dependence, and show how emergence can be simply and straightforwardly characterized using these notions. I then argue that many of the standard problems for emergence do not apply to this account: given a clearly specified meta-ontological background, emergence becomes much easier to explicate. If my arguments are successful, they show both a helpful way of thinking about emergence and the potential utility of discussions in meta-ontology when applied to first-order metaphysics.
Many of us are tempted by the thought that the future is open, whereas the past is not. The future might unfold one way, or it might unfold another; but the past, having occurred, is now settled. In previous work we presented an account of what openness consists in: roughly, that the openness of the future is a matter of it being metaphysically indeterminate how things will turn out to be. We were previously concerned merely with presenting the view and exploring its consequences; we did not attempt to argue for it over rival accounts. That is what we will aim to do in this paper.
In this paper I develop a framework for understanding ontic vagueness. The project of the paper is two-fold. I first outline a definitional account of ontic vagueness – one that I think is an improvement on previous attempts because it remains neutral on other, independent metaphysical issues. I then develop one potential manifestation of that basic definitional structure. This is a more robust (and much less neutral) account which gives a fully classical explication of ontic vagueness via modal concepts. The overarching aim is to systematically investigate the puzzling question of what exactly it could be for the world itself to be vague.
In this paper I develop a characterization of disability according to which disability is in no way a sub-optimal feature. I argue, however, that this conception of disability is compatible with the idea that having a disability is, at least in a restricted sense, a harm. I then go on to argue that construing disability in this way avoids many of the common objections levelled at accounts which claim that disability is not a negative feature.
In this paper I argue that Gareth Evans’ famous proof of the impossibility of de re indeterminate identity fails on a counterpart-theoretic interpretation of the determinacy operators. I attempt to motivate a counterpart-theoretic reading of the determinacy operators and then show that, understood counterpart-theoretically, Evans’ argument is straightforwardly invalid.
In this paper we aim to disentangle the thesis that the future is open from theses that often get associated or even conflated with it. In particular, we argue that the open future thesis is compatible with both the unrestricted principle of bivalence and determinism with respect to the laws of nature. We also argue that whether or not the future (and indeed the past) is open has no consequences as to the existence of (past and) future ontology.
We discuss arguments against the thesis that the world itself can be vague. The first section of the paper distinguishes dialectically effective from ineffective arguments against metaphysical vagueness. The second section constructs an argument against metaphysical vagueness that promises to be of the dialectically effective sort: an argument against objects with vague parts. First, cases of vague parthood commit one to cases of vague identity. But we argue that Evans’ famous argument against vague identity will not on its own enable one to complete the reductio in the present context. We provide a metaphysical premise that would complete the reductio, but note that it seems deniable. We conclude by drawing general morals from our case study.
Sherrilyn Roush's Tracking Truth provides a sustained and ambitious development of the basic idea that knowledge is true belief that tracks the truth. In this essay, I provide a quick synopsis of Roush's book and offer a substantive discussion of her analysis of scientific evidence. Roush argues that, for e to serve as evidence for h, it should be easier to determine the truth value of e than it is to determine the truth value of h, an ideal she refers to as 'leverage'. She defends a detailed method by which the value of p(h/e) is computed without 'direct' information about p(h) but only using evidence about the value of p(e), from which the value of p(h) is derived. She presents an example of how to use her leverage method, which I argue involves a certain critical mistake. I show how the leveraging method can be used in a way that is sound. I conclude with a few remarks about the importance of distinguishing clearly between prior and posterior probabilities.
This thesis is a systematic investigation of whether there might be conceptual room for the idea that the world itself might be vague, independently of how we describe it. This idea – the existence of so-called ontic vagueness – has generally been extremely unpopular in the literature; my thesis thus seeks to evaluate whether this ‘negative press’ is justified. I start by giving a working definition and semantics for ontic vagueness, and then attempt to show that there are no conclusive arguments that rule out vagueness of this kind. I subsequently establish what type of arguments I think would be most effective in establishing ontic vagueness and provide some arguments of this form. I then highlight a potential worry for this type of argument, but argue that it can be circumvented. Finally, I consider the main ways that the opponent of ontic vagueness would be likely to resist the arguments I have offered, and argue that these strategies of response are methodologically problematic. I conclude by claiming that ontic vagueness is a perfectly plausible ontological commitment.
In this paper I respond to Trenton Merricks's (2005) paper ‘Composition and Vagueness’. I argue that Merricks's paper faces the following difficulty: he claims to provide independent motivation for denying one of the premisses of the Lewis-Sider vagueness argument for unrestricted composition, but the alleged motivation he provides begs the question.
Predictivism asserts that novel confirmations carry special probative weight. Epistemic pluralism asserts that the judgments of agents (about, e.g., the probabilities of theories) carry epistemic import. In this paper, I propose a new theory of predictivism that is tailored to pluralistic evaluators of theories. I replace the orthodox notion of use-novelty with a notion of endorsement-novelty, and argue that the intuition that predictivism is true has two roots. I provide a detailed Bayesian rendering of this theory and argue that pluralistic theory evaluation pervades scientific practice. I compare my account of predictivism with those of Maher and Worrall.
In recent literature on vagueness, writers have noted that more ‘plentiful’ theories of properties – those that postulate genuine properties corresponding to the classically vague predicates like ‘bald’ and ‘heap’ – appear straightforwardly committed to ontic vagueness. In this paper, however, I will argue that worries of ontic vagueness are not specific to ‘plentiful’ accounts of properties. The classically ‘sparse’ theories of properties – Universals and tropes – will, I contend, be subject to similar difficulties.
The miracle argument for scientific realism can be cast in two forms: according to the miraculous theory argument, realism is the only position which does not make the empirical successes of particular theories miraculous. According to the miraculous choice argument, realism is the only position which does not render the fact that empirically successful theories have been chosen a miracle. A vast literature discusses the miraculous theory argument, but the miraculous choice argument has been unjustifiably neglected. I raise two objections to Richard Boyd's defense of the latter: (1) we have no miracle-free account of the emergence of take-off theories and (2) the anti-realist can account for the non-miraculous choice of empirically successful theories by attributing mere empirical adequacy to background theory. I argue that the availability of extra-empirical criteria that are arguably truth-conducive but not theory-laden suffices to answer (1), and the unavailability of extra-empirical criteria that are conducive to empirical adequacy but not necessarily to truth (and are also not theory-laden) constitutes a reply to (2). The prospects for a realist victory are at least somewhat promising, on a controversial assumption about the rate at which empirically successful theories emerge.
The quantitative problem of old evidence is the problem of how to measure the degree to which e confirms h for agent A at time t when A regards e as justified at t. Existing attempts to solve this problem have applied the e-difference approach, which compares A's probability for h at t with what probability A would assign h if A did not regard e as justified at t. The quantitative problem has been widely regarded as unsolvable primarily on the grounds that the e-difference approach suffers from intractable problems. Various philosophers have proposed that 'Bayesianism' should be rejected as a research strategy in confirmation theory in part because of the unsolvability of this problem. I develop a version of the e-difference approach which overcomes these problems and possesses various advantages (but also certain limitations). I develop an alternative 'theistic' approach which handles many cases that my development of the e-difference approach does not handle. I conclude with an assessment of the significance of the quantitative problem for Bayesianism and argue that this problem is misunderstood in so far as it is regarded as unsolvable, and in so far as it is regarded as a problem only for Bayesians.
A pluralistic scientific method is one that incorporates a variety of points of view in scientific inquiry. This paper investigates one example of pluralistic method: the use of weighted averaging in probability estimation. I consider two methods of weight determination, one based on disjoint evidence possession and the other on track record. I argue that weighted averaging provides a rational procedure for probability estimation under certain conditions. I consider a strategy for calculating ‘mixed weights’ which incorporate mixed information about agent credibility. I address various objections to the weighted averaging technique and conclude that the technique is a promising one in various respects.
Predictivism holds that, where evidence E confirms theory T, E confirms T more strongly when E is predicted on the basis of T and subsequently confirmed than when E is known in advance of T's formulation and used, in some sense, in the formulation of T. Predictivism has lately enjoyed some strong supporting arguments from Maher (1988, 1990, 1993) and Kahn, Landsberg, and Stockman (1992). Despite the many virtues of the analyses these authors provide it is my view that they (along with all other authors on this subject) have failed to understand a fundamental truth about predictivism: the existence of a scientist who predicted T prior to the establishment that E is true has epistemic import for T (once E is established) only in connection with information regarding the social milieu in which the T-predictor is located and information regarding how the T-predictor was located. The aim of this paper is to show that predictivism is ultimately a social phenomenon that requires a social level of analysis, a thesis I deem social predictivism.
Predictivism asserts that where evidence E confirms theory T, E provides stronger support for T when E is predicted on the basis of T and then confirmed than when E is known before T's construction and 'used', in some sense, in the construction of T. Among the most interesting attempts to argue that predictivism is a true thesis (under certain conditions) is that of Patrick Maher (1988, 1990, 1993). The purpose of this paper is to investigate the nature of predictivism using Maher's analysis as a starting point. I briefly summarize Maher's primary argument and expand upon it; I explore related issues pertaining to the causal structure of empirical domains and the logic of discovery.
D. Miller's demonstrations of the language dependence of truthlikeness raise a profound problem for the claim that scientific progress is objective. In two recent papers (Barnes 1990, 1991) I argue that the objectivity of progress may be grounded on the claim that the aim of science is not merely truth but knowledge; progress thus construed is objective in an epistemic sense. In this paper I construct a new solution to Miller's problem grounded on the notion of "approximate causal explanation" which allows for linguistically invariant progress outside an epistemic context. I suggest that the notion of "approximate causal explanation" provides the resources for a more robust theory of progress than that provided by the notion of "approximate truth".
I aim to show that one way of testing the mettle of a theory of scientific explanation is to inquire what that theory entails about the status of brute facts. Here I consider the nature of brute facts, and survey several contemporary accounts of explanation vis-à-vis this subject (the Friedman-Kitcher theory of explanatory unification, Humphreys' causal theory of explanation, and Lipton's notion of 'explanatory loveliness'). One problem with these accounts is that they seem to entail that brute facts represent a gap in scientific understanding. I argue that brute facts are non-mysterious and indeed are even explainable by the lights of Salmon's ontic conception of explanation (which I endorse here). The plausibility of various models of explanation, I suggest, depends to some extent on the tendency of their proponents to focus on certain examples of explananda; I ponder brute facts qua explananda here as a way of helping us to recognize this dependency.
Philip Kitcher has proposed a theory of explanation based on the notion of unification. Despite the genuine interest and power of the theory, I argue here that the theory suffers from a fatal deficiency: It is intrinsically unable to account for the asymmetric structure of explanation, and thus ultimately falls prey to a problem similar to the one which beset Hempel's D-N model. I conclude that Kitcher is wrong to claim that one can settle the issue of an argument's explanatory force merely on the basis of considerations about the unifying power of the argument pattern the argument instantiates.
The theory of explanatory unification was first proposed by Friedman (1974) and developed by Kitcher (1981, 1989). The primary motivation for this theory, it seems to me, is the argument that this account of explanation is the only account that correctly describes the genesis of scientific understanding. Despite the apparent plausibility of Friedman's argument to this effect, however, I argue here that the unificationist thesis of understanding is false. The theory of explanatory unification as articulated by Friedman and Kitcher thus emerges as fundamentally misconceived.
This paper proposes a solution to David Miller's Minnesotan-Arizonan demonstration of the language dependence of truthlikeness (Miller 1974), along with Miller's first-order demonstration of the same (Miller 1978). It is assumed, with Peter Urbach, that the implication of these demonstrations is that the very notion of truthlikeness is intrinsically language dependent and thus non-objective. As such, truthlikeness cannot supply a basis for an objective account of scientific progress. I argue that, while Miller is correct in arguing that the number of true atomic sentences of a false theory is language dependent, the number of known sentences (under certain straightforward assumptions) is conserved by translation; degree of knowledge, unlike truthlikeness, is thus a linguistically invariant notion. It is concluded that the objectivity of scientific progress must be grounded on the fact (noted in Cohen 1980) that knowledge, not mere truth, is the aim of science.
David Miller has demonstrated to the satisfaction of a variety of philosophers that the accuracy of false quantitative theories is language dependent (cf. Miller 1975). This demonstration renders the accuracy-based mode of comparison for such theories obsolete. The purpose of this essay is to supply an alternate basis for theory comparison which in this paper is deemed the knowledge-based mode of quantitative theory comparison. It is argued that the status of a quantitative theory as knowledge depends primarily on the soundness of the measurement procedure which produced the theory; such soundness is invariant, on my view, under Milleresque translations. This point is the basis for the linguistic invariance of knowledgelikeness. When the aim of science is not construed simply in terms of the truthlikeness or accuracy of theories, but in terms of the knowledge such theories embody, Miller's language dependence problem is overcome. One result of this analysis is that the possibility of objective scientific progress is restored, a possibility that Miller's analysis has prima facie defeated.