The goal of this paper is to sketch and defend a new interpretation or 'theory' of objective chance, one that lets us be sure such chances exist and shows how they can play the roles we traditionally grant them. The account is 'Humean' in claiming that objective chances supervene on the totality of actual events, but does not imply or presuppose a Humean approach to other metaphysical issues such as laws or causation. Like Lewis (1994), I take the Principal Principle (PP) to be the key to understanding objective chance. After describing the main features of Humean objective chance (HOC), I deduce the validity of PP for Humean chances, and end by exploring the limitations of Humean chance.
GRW theory postulates a stochastic mechanism assuring that every so often the wave function of a quantum system is 'hit', which leaves it in a localised state. How are we to interpret the probabilities built into this mechanism? GRW theory is a firmly realist proposal, and it is therefore clear that these probabilities are objective probabilities (i.e. chances). A discussion of the major theories of chance leads us to the conclusion that GRW probabilities can be understood only as either single-case propensities or Humean objective chances. Although single-case propensities have some intuitive appeal in the context of GRW theory, on balance it seems that Humean objective chances are preferable on conceptual grounds, because single-case propensities suffer from various well-known problems such as unlimited frequency tolerance and the lack of a rationalisation of the Principal Principle.
Classical statistical mechanics posits probabilities for various events to occur, and these probabilities seem to be objective chances. This does not seem to sit well with the fact that the theory’s time evolution is deterministic. We argue that the tension between the two is only apparent. We present a theory of Humean objective chance and show that chances thus understood are compatible with underlying determinism and provide an interpretation of the probabilities we find in Boltzmannian statistical mechanics.
The underdetermination of theory by evidence must be distinguished from holism. The latter is a doctrine about the testing of scientific hypotheses; the former is a thesis about empirically adequate, logically incompatible global theories or "systems of the world". The distinction is crucial for an adequate assessment of the underdetermination thesis. The paper shows how some treatments of underdetermination are vitiated by failure to observe this distinction, and identifies some necessary conditions for the existence of multiple empirically equivalent global theories. We consider how empiricists should respond to the possibility of such systems of the world.
The topics of gravitational field energy and energy-momentum conservation in General Relativity theory have been unjustly neglected by philosophers. If the gravitational field in space free of ordinary matter, as represented by the metric g_ab itself, can be said to carry genuine energy and momentum, this is a powerful argument for adopting the substantivalist view of spacetime. This paper explores the standard textbook account of gravitational field energy and argues that (a) the so-called stress-energy of the gravitational field is well-defined neither locally nor globally; and (b) there is no general principle of energy-momentum conservation to be found in General Relativity. I discuss the nature and justification of the zero-divergence law for ordinary stress-energy, and its possible connection with the failure of General Relativity to realise Mach's principle.
On the face of it, 'deterministic chance' is an oxymoron: an event is either chancy or deterministic, but not both. Nevertheless, the world is rife with events that seem to be exactly that: chancy and deterministic at once. Simple gambling devices like coins and dice are cases in point. On the one hand, they are governed by deterministic laws – the laws of classical mechanics – and hence, given the initial condition of, say, a coin toss, it is determined whether it will land heads or tails. On the other hand, we commonly assign probabilities to the different outcomes of a coin toss, and doing so has proven successful in guiding our actions. The same dilemma also emerges in less mundane contexts. Classical statistical mechanics (which is still an important part of modern physics) assigns probabilities to the occurrence of certain events – for instance, to the spreading of a gas that is originally confined to the left half of a container – but at the same time assumes that the relevant systems are deterministic. How can this apparent conflict be resolved?
In this paper I defend fundamental physical laws from the arguments mounted by Nancy Cartwright in her (1999) book The Dappled World (and other publications). I argue, positively, that we have a good deal of evidence for mathematical laws—not just causal capacities—underlying many natural phenomena. I also argue, negatively, that Cartwright's main arguments unfairly demand that a fundamentalist be a strong reductionist.
In recent years, attacks on the Kripke-Putnam approach to natural kinds and natural kind terms have proliferated. In a recent paper, Häggqvist and Wikforss attack the once-dominant essentialist account of natural kinds. They also suggest that it is time to return to some sort of cluster-based descriptivist semantics for natural kind terms, thus targeting both the metaphysical and semantic tenets that underpin the Kripke-Putnam approach. In our paper we want to challenge both parts of Häggqvist and Wikforss' project. We will argue that the anti-essentialist considerations and arguments they raise against the Kripke-Putnam view are far from compelling in some cases, and certainly not decisive against a reasonable form of the view. On the semantic side, although Häggqvist and Wikforss give few details about what a viable cluster-based descriptivist theory should look like, we will argue that the approach can already be seen to be a non-starter. Ignorance and error arguments of the kinds provided by Kripke and Putnam continue to be decisive objections. The only way we can see to save the cluster descriptivist approach is to make the essential properties postulated by Kripke and Putnam essential features of the descriptive cluster. But this makes the success of the approach parasitic on the correctness of the Kripke-Putnam view.
This paper outlines a new interpretation of an argument of Kant's for the existence of absolute space. Kant's argument, found in a 1768 essay on topology, argues for the existence of Newtonian-Euclidean absolute space on the basis of the existence of incongruous counterparts (such as a left and a right hand, or any asymmetrical object and its mirror-image). The clear, intrinsic difference between a left hand and a right hand, Kant claimed, cannot be understood on a relational view of space - for in terms of the spatial relations of their parts, there is no difference to be found. Kant's argument has been interpreted by, among others, Graham Nerlich ('Hands, Knees and Absolute Space', The Journal of Philosophy, 1973). I briefly discuss Nerlich, and then offer a different reconstruction of the argument, one that appears to be closer to Kant's text. The reconstruction, however, essentially involves ascription of primitive identity to parts of space. Comparing the Kantian absolutist account of incongruous counterparts using primitive identity to the correct relationist account, I conclude that the absolutist account pays a heavy metaphysical price, without buying any genuine explanatory advantage over the relationist. I go on to examine recent suggestions that parity-non-conservation phenomena in quantum physics allow a stronger version of Kant's challenge to relationism. On closer examination, it turns out that here too the absolutist or substantivalist must be appealing to space parts with primitive identity in order to claim an advantage over relationists; and here too, I argue, the substantivalist story really has no advantage over the correct relationist account.
The traditional absolutist-relationist debate is still clearly formulable in the context of General Relativity Theory (GTR), despite the important differences between Einstein's theory and the earlier context of Newtonian physics. This paper answers recent arguments by Robert Rynasiewicz against the significance of the debate in the GTR context. In his (1996) ('Absolute vs. Relational Spacetime: An Outmoded Debate?'), Rynasiewicz argues that already in the late nineteenth century, and even more so in the context of General Relativity theory, the terms of the original Descartes–Newton–Leibniz dispute about space are not to be found. Nineteenth-century ether theories of electromagnetism, and the metric field of GTR, he claims, do not lend themselves to being interpreted clearly as either absolute space à la Newton, or relational structures à la either Descartes or Leibniz. I argue that, while Rynasiewicz's claim that the classical debate dissolves would be correct for some imaginable theories, in fact it fails for the most important historical theories he discusses. In particular, I argue that in both Lorentz's ether theory and General Relativity theory there is a clear and compelling way to establish connections to the classical absolutist-relationist disputes, and that in both these theories it is the absolutist position that is prima facie victorious. To support my arguments and give a clear overview of the whole debate, I end by offering definitional sketches of relationism and absolutism (substantivalism) about spacetime in the context of contemporary physics. The sketches show the clear connections between these views today and their ancestors in Newton and Leibniz. But at the same time, they indicate how both views are not just claims about existing physical theories, but rather also bets about how future physics will clarify the ontological picture.
The story of Einstein's struggle to create a general theory of relativity, and his early discontentment with the final form of the theory, is well known in broad outline. Thanks to the work of John Norton and others, much of the fine detail of the story is also now known. One aspect of Einstein's work in this period has, however, been relatively neglected: Einstein's commitment to Mach's ideas on inertia, and the influence this commitment had on Einstein's work on general relativity from 1907 to 1918. In this paper, published writings and archival material are examined to try to reconstruct the details of Einstein's thinking about inertia and gravitation, and the role that Mach's ideas played in Einstein's crucial work on the general theory. By the end, a clear picture of Einstein's conceptions of Mach's ideas on inertia, and their philosophical motivations, will emerge. Several surprising conclusions also emerge: Einstein's desire for a Machian gravitation theory was the central force driving his work from 1912 to 1915, keeping him going despite numerous frustrating setbacks; Einstein's continued commitment to Mach's ideas in 1916–1917 kept him at work trying various strategies of modification of the field equations, in order to exclude anti-Machian solutions; and as late as early 1918, Einstein was ready to call the whole General Theory a failure if no way of squaring it with Mach's ideas on inertia could be found. But by 1920 Einstein advocated a view that granted spacetime independent existence with physical qualities of its own, a complete break with his earlier Machian views.
Nancy Cartwright is one of the most distinguished and influential contemporary philosophers of science. Despite the profound impact of her work, there is neither a systematic exposition of Cartwright’s philosophy of science nor a collection of articles that contains in-depth discussions of the major themes of her philosophy. This book is devoted to a critical assessment of Cartwright’s philosophy of science and contains contributions from Cartwright's champions and critics. Broken into three parts, the book begins by addressing Cartwright's views on the practice of model building in science and the question of how models represent the world before moving on to a detailed discussion of methodologically and metaphysically challenging problems. Finally, the book addresses Cartwright's original attempts to clarify profound questions concerning the metaphysics of science. With contributions from leading scholars, such as Ronald N. Giere and Paul Teller, this unique volume will be extremely useful to philosophers of science the world over.
Much is asked of the concept of chance. It has been thought to play various roles, some in tension with or even incompatible with others. Chance has been characterized negatively, as the absence of causation; yet also positively—the ancient Greek τύχη reifies it—as a cause of events that are not governed by laws of nature, or as a feature of the laws themselves. Chance events have been understood epistemically as those whose causes are unknown; yet also objectively as a distinct ontological kind, sometimes called ‘pure’ chance events. Chance gives rise to individual unpredictability and disorder; yet it yields collective predictability and order—stable long-run statistics, and in the limit, aggregate behavior susceptible to precise mathematical theorems. Some authors believe that to posit chances is to abjure explanation; yet others think that chances are themselves explanatory. During the Enlightenment, talk of ‘chance’ was regarded as unscientific, unphilosophical, the stuff of superstition or ignorance; yet today it is often taken to be a fundamental notion of our most successful scientific theory, quantum mechanics, and a central concept of contemporary metaphysics. Chance has both negative and positive associations in daily life. The old word in English for it, hazard, which derives from French and originally from Arabic, still has unwelcome connotations of risk; ‘chance’ evokes uncertainty, uncontrollability, and chaos. Yet chance is also allied with luck, fortune, freedom from constraint, and diversity. And it apparently has various practical uses and benefits. It forms the basis of randomized trials in statistics, and of mixed strategies in decision theory and game theory; it is appealed to in order to resolve problems of fair division and other ethical…
In a recent article, Gordon Belot uses the so-called undermining phenomenon to try to raise a new difficulty for reductive accounts of objective probability, such as Humean Best System accounts. In this paper I will give a critical discussion of Belot’s paper and argue that, in fact, there is no new difficulty here for chance reductionists to address.
One currently popular view about the nature of objective probabilities, or objective chances, is that they – or some of them, at least – are primitive features of the physical world, not reducible to anything else nor explicable in terms of frequencies, degrees of belief, or anything else. In this paper I explore the question of what the semantic content of primitive chance claims could be. Every attempt I look at to supply such content either comes up empty-handed, or begs important questions against the skeptic who doubts the meaningfulness of primitive chance claims. In the second half of the paper I show that, by contrast, there are clear, and clearly contentful, ways to understand objective chance claims if we ground them on deterministic physical underpinnings.
In the philosophical tradition, the notions of determinism and causality are strongly linked: it is assumed that in a world of deterministic laws, causality may be said to reign supreme; and that in any world where causality is strong enough, determinism must hold. I will show that these alleged linkages are based on mistakes, and in fact get things almost completely wrong. In a deterministic world that is anything like ours, there is no room for genuine causation. Though there may be stable enough macro-level regularities to serve the purposes of human agents, the sense of “causality” that can be maintained is one that will at best satisfy Humeans and pragmatists, not causal fundamentalists.
Our beliefs can have, or fail to have, a significant epistemic virtue: they can be true. What about our partial beliefs – that is, credences or subjective probabilities? Is there an epistemic virtue that credences can have or fail to have, whose nature or role with respect to credences is analogous to the role that truth has with respect to full beliefs? Van Fraassen argued in the 1980s that there is indeed such an analog virtue, and he claimed that it is calibration: our credences should match, or be in tune with, some appropriately chosen frequencies of events. For example, if your credence in the proposition ‘It will rain today’ is 0.3 for any day that starts out like today did, and in fact it does rain on approximately 30% of such days, then your credence level is well-calibrated, and is vindicated by the actual frequency of rain. In an unpublished paper, Alan Hájek rejects this calibrationist answer, and proposes instead that having credences in agreement with the relevant objective chances is what it is to have vindicated credences, in the relevant sense analogous to truth for full beliefs. Although there are problems that afflict both van Fraassen's and Hájek's proposals, I will argue that Hájek and the calibrationist are both largely right – although the calibrationist answer is at bottom more right. In section I will propose an account of being in tune with the world that divides types of propositions into a few different classes; for one of those classes, Hájek's answer is exactly right, while for the others the calibrationist account gives the right answer.
In a now-classic paper, Nancy Cartwright argued that the Humean conception of causation as mere regular co-occurrence is too weak to make sense of our everyday and scientific practices. Specifically she claimed that in order to understand our reasoning about, and uses of, effective strategies, we need a metaphysically stronger notion of causation and causal laws than Humeanism allows. Cartwright’s arguments were formulated in the framework of probabilistic causation, and it is precisely in the domain of (objective) probabilities that I am interested in defending a form of Humeanism. In this paper I will unpack some examples of effective strategies and discuss how well they fit the framework of causal laws and criteria such as CC from Cartwright’s and others’ works on probabilistic causality. As part of this discussion, I will also consider the concept or concepts of objective probability presupposed in these works. I will argue that Cartwright’s notion of a nomological machine, or a mechanism as defined by Stuart Glennan, is better suited for making sense of effective strategies, and therefore that a metaphysically primitive notion of causal law (or singular causation, or capacity, as Cartwright argues in (1989)) is not – here, at least – needed. These conclusions, as well as the concept of objective probabilities I defend, are largely in harmony with claims Cartwright defends in The Dappled World. My discussion aims, thus, to bring out into the open how far Cartwright’s current views are from a radically anti-Humean, causal-fundamentalist picture.
This dissertation takes up the project of showing that, in the context of the general theory of relativity, spacetime relationism is not a refuted or hopeless view, as many in the recent literature have maintained. Most of the challenges to the relationist view in General Relativity can be satisfactorily answered; in addition, the opposing absolutist and substantivalist views of spacetime can be shown to be problematic. The crucial burden for relationists concerned with GTR is to show that the realistic cosmological models, i.e. those that may be roughly accurate representations of our universe, satisfy Mach's ideas about the origin of inertia. This dissertation clears the way for and begins such a demonstration. After a brief discussion of the problem of the nature of spacetime and its history in the Introduction, chapters 2 and 3 provide conceptual analysis and criticism of contemporary philosophical arguments about relationism, absolutism, and particularly substantivalism. The current best arguments in favor of substantivalism are shown to be flawed, with the exception of the argument from inertial and metrical structure; and on this issue, it is shown that both relationism and substantivalism need to argue for modifications of GTR in order to have a non-trivial explanation of inertial and metrical structure. For relationists, a Machian account of the origin of inertia in some models of GTR is required. Chapter 4 demonstrates that such a Machian account is equivalent to the demand for a truly general relativity of motion. Chapter 5 explores the history of Einstein's commitment to Mach's ideas in his work on GTR. Through an examination of the history of Einstein's attempts to impose Machian constraints on the models of General Relativity, further insight into the nature of this problem is obtained, as are reasons to believe that the project is by no means hopeless.