A number of people have recently argued for a structural approach to accounting for the applications of mathematics. Such an approach has been called "the mapping account". According to this view, the applicability of mathematics is fully accounted for by appreciating the relevant structural similarities between the empirical system under study and the mathematics used in the investigation of that system. This account of applications requires the truth of applied mathematical assertions, but it does not require the existence of mathematical objects. In this paper, we discuss the shortcomings of this account, and show how these shortcomings can be overcome by a broader view of the application of mathematics: the inferential conception.
Hartry Field has shown us a way to be nominalists: we must purge our scientific theories of quantification over abstracta and we must prove the appropriate conservativeness results. This is not a path for the faint-hearted. Indeed, the substantial technical difficulties facing Field's project have led some to explore other, easier options. Recently, Jody Azzouni, Joseph Melia, and Stephen Yablo have argued that it is a mistake to read our ontological commitments simply from what the quantifiers of our best scientific theories range over. In this paper, I argue that all three arguments fail, and fail for much the same reason; would-be nominalists are thus left facing Field's hard road.
Standard approaches to counterfactuals in the philosophy of explanation are geared toward causal explanation. We show how to extend the counterfactual theory of explanation to non-causal cases, involving extra-mathematical explanation: the explanation of physical facts by mathematical facts. Using a structural equation framework, we model impossible perturbations to mathematics and the resulting differences made to physical explananda in two important cases of extra-mathematical explanation. We address some objections to our approach.
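The structural-equation framework described above can be given a toy rendering in code. The following is a minimal sketch (the strawberry example, the function name, and all variables are illustrative assumptions, not drawn from the paper): treat a mathematical fact as an upstream variable, model an impossible perturbation to it as an intervention, and read off the difference made to the physical explanandum.

```python
# A toy rendering (all names and the strawberry example are illustrative
# assumptions) of the structural-equation idea: a mathematical fact is an
# upstream variable, and an "impossible perturbation" is an intervention.

def model(three_divides_twentythree):
    # Physical setup: 23 strawberries to be shared among 3 children.
    # Structural equation: an even division exists iff 3 divides 23.
    even_division_possible = three_divides_twentythree
    return even_division_possible

# Actual world: 3 does not divide 23, so no even division is possible.
assert model(23 % 3 == 0) is False

# Counterpossible intervention: had 3 divided 23, an even division would
# have been possible -- the mathematical fact makes a physical difference.
assert model(True) is True
```

The point of the sketch is only structural: the counterfactual difference made by the mathematical fact is computed by intervening on the upstream variable, exactly as one intervenes on a cause in a causal model.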
David Malament argued that Hartry Field's nominalisation program is unlikely to be able to deal with non-space-time theories such as phase-space theories. We give a specific example of such a phase-space theory and argue that this presentation of the theory delivers explanations that are not available in the classical presentation of the theory. This suggests that even if phase-space theories can be nominalised, the resulting theory will not have the explanatory power of the original. Phase-space theories thus raise problems for nominalists that go beyond Malament's initial concerns. Thanks to Mark Steiner, Jens Christian Bjerring, Ben Fraser, John Mathewson, and two anonymous referees for helpful comments on an earlier draft of this paper.
Machine generated contents note: 1. Mathematics and its philosophy; 2. The limits of mathematics; 3. Plato's heaven; 4. Fiction, metaphor, and partial truths; 5. Mathematical explanation; 6. The applicability of mathematics; 7. Who's afraid of inconsistent mathematics?; 8. A rose by any other name; 9. Epilogue: desert island theorems.
We discuss a recent attempt by Chris Daly and Simon Langford to do away with mathematical explanations of physical phenomena. Daly and Langford suggest that mathematics merely indexes parts of the physical world, and on this understanding of the role of mathematics in science, there is no need to countenance mathematical explanation of physical facts. We argue that their strategy is at best a sketch and only looks plausible in simple cases. We also draw attention to how frequently Daly and Langford find themselves in conflict with mathematical and scientific practice.
Our goal in this paper is to extend counterfactual accounts of scientific explanation to mathematics. Our focus, in particular, is on intra-mathematical explanations: explanations of one mathematical fact in terms of another. We offer a basic counterfactual theory of intra-mathematical explanations, before modelling the explanatory structure of a test case using counterfactual machinery. We finish by considering the application of counterpossibles to mathematical explanation, and explore a second test case along these lines.
In this paper I reply to Jody Azzouni, Otávio Bueno, Mary Leng, David Liggins, and Stephen Yablo, who offer defences of so-called 'easy road' nominalist strategies in the philosophy of mathematics.
One of the most intriguing features of mathematics is its applicability to empirical science. Every branch of science draws upon large and often diverse portions of mathematics, from the use of Hilbert spaces in quantum mechanics to the use of differential geometry in general relativity. It's not just the physical sciences that avail themselves of the services of mathematics either. Biology, for instance, makes extensive use of difference equations and statistics. The roles mathematics plays in these theories are also varied. Not only does mathematics help with empirical predictions, it allows elegant and economical statement of many theories. Indeed, so important is the language of mathematics to science, that it is hard to imagine how theories such as quantum mechanics and general relativity could even be stated without employing a substantial amount of mathematics.
The Eleatic Principle or causal criterion is a causal test that entities must pass in order to gain admission to some philosophers' ontology. This principle justifies belief in only those entities to which causal power can be attributed, that is, to those entities which can bring about changes in the world. The idea of such a test is rather important in modern ontology, since it is neither without intuitive appeal nor without influential supporters. Its supporters have included David Armstrong (1978, Vol. 2, 5), Brian Ellis (1990, 22) and Hartry Field (1989, 68) to name but a few.
In this paper we argue that there is a kind of moral disagreement that survives the Rawlsian veil of ignorance. While a veil of ignorance eliminates sources of disagreement stemming from self-interest, it does not do anything to eliminate deeper sources of disagreement. These disagreements not only persist, but transform their structure once behind the veil of ignorance. We consider formal frameworks for exploring these differences in structure between interested and disinterested disagreement, and argue that consensus models offer us a solution concept for disagreements behind the veil of ignorance.
We explore the prospects of a monist account of explanation for both non-causal explanations in science and pure mathematics. Our starting point is the counterfactual theory of explanation (CTE) for explanations in science, as advocated in the recent literature on explanation. We argue that, despite the obvious differences between mathematical and scientific explanation, the CTE can be extended to cover both non-causal explanations in science and mathematical explanations. In particular, a successful application of the CTE to mathematical explanations requires us to rely on counterpossibles. We conclude that the CTE is a promising candidate for a monist account of explanation in both science and mathematics.
The argument from fine tuning is supposed to establish the existence of God from the fact that the evolution of carbon-based life requires the laws of physics and the boundary conditions of the universe to be more or less as they are. We demonstrate that this argument fails. In particular, we focus on problems associated with the role probabilities play in the argument. We show that, even granting the fine tuning of the universe, it does not follow that the universe is improbable, and thus no explanation of the fine tuning, theistic or otherwise, is required.
The Quine-Putnam Indispensability argument is the argument for treating mathematical entities on a par with other theoretical entities of our best scientific theories. This argument is usually taken to be an argument for mathematical realism. In this chapter I will argue that the proper way to understand this argument is as putting pressure on the viability of the marriage of scientific realism and mathematical nominalism. Although such a marriage is a popular option amongst philosophers of science and mathematics, in light of the indispensability argument, the marriage is seen to be very unstable. Unless one is careful about how the Quine-Putnam argument is disarmed, one can be forced to either mathematical realism or, alternatively, scientific instrumentalism. I will explore the various options: (i) finding a way to reconcile the two partners in the marriage by disarming the indispensability argument (Jody Azzouni [2], Hartry Field [13, 14], Alan Musgrave [18, 19], David Papineau [21]); (ii) embracing mathematical realism (W.V.O. Quine [23], Michael Resnik [25], J.J.C. Smart [27]); and (iii) embracing some form of scientific instrumentalism (Otávio Bueno [7, 8], Bas van Fraassen [30]). Elsewhere [11], I have argued for option (ii) and I won't repeat those arguments here. Instead, I will consider the difficulties for each of the three options just mentioned, with special attention to option (i). In relation to the latter, I will discuss an argument due to Alan Musgrave [19] for why option (i) is a plausible and promising approach. From the discussion of Musgrave's argument, it will emerge that the issue of holist versus separatist theories of confirmation plays a curious role in the realism–antirealism debate in the philosophy of mathematics.
I will argue that if you take confirmation to be a holistic matter—it's whole theories (or significant parts thereof) that are confirmed in any experiment—then there's an inclination to opt for (ii) in order to resolve the marital tension outlined above.
The present paper advances an analogy between cases of extra-mathematical explanation and cases of what might be termed 'extra-logical explanation': the explanation of a physical fact by a logical fact. A particular case of extra-logical explanation is identified that arises in the philosophical literature on time travel. This instance of extra-logical explanation is subsequently shown to be of a piece with cases of extra-mathematical explanation. Using this analogy, we argue that extra-mathematical explanation is part of a broader class of non-causal explanation. This has important implications for extra-mathematical explanation, for time travel and for theories of explanation more generally.
In this paper we explore the connections between ethics and decision theory. In particular, we consider the question of whether decision theory carries with it a bias towards consequentialist ethical theories. We argue that there are plausible versions of the other ethical theories that can be accommodated by "standard" decision theory, but there are also variations of these ethical theories that are less easily accommodated. So while "standard" decision theory is not exclusively consequentialist, it is not necessarily ethically neutral. Moreover, even if our decision-theoretic models get the right answers vis-à-vis morally correct action, the question remains as to whether the motivation for the non-consequentialist theories and the psychological processes of the agents who subscribe to those ethical theories are lost or poorly represented in the resulting models.
Games such as the St. Petersburg game present serious problems for decision theory. The St. Petersburg game invokes an unbounded utility function to produce an infinite expectation for playing the game. The problem is usually presented as a clash between decision theory and intuition: most people are not prepared to pay a large finite sum to buy into this game, yet this is precisely what decision theory suggests we ought to do. But there is another problem associated with the St. Petersburg game. The problem is that standard decision theory counsels us to be indifferent between any two actions that have infinite expected utility. So, for example, consider the decision problem of whether to play the St. Petersburg game or a game where every payoff is $1 higher. Let's call this second game the Petrograd game (it's the same as St. Petersburg but with a bit of twentieth-century inflation). Standard decision theory is indifferent between these two options. Indeed, it might be argued that any intuition that the Petrograd game is better than the St. Petersburg game is a result of misguided and naïve intuitions about infinity. But this argument against the intuition in question is misguided. The Petrograd game is clearly better than the St. Petersburg game. And what is more, there is no confusion about infinity involved in thinking this. When the series of coin tosses comes to an end (and it comes to an end with probability 1), no matter how many tails precede the first head, the payoff for the Petrograd game is one dollar higher than the St. Petersburg game. Whatever the outcome, you are better off playing the Petrograd game. Infinity has nothing to do with it. Indeed, a straightforward application of dominance reasoning backs up this line of reasoning.
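The contrast between expectation and dominance described above is easy to make concrete. A minimal sketch (payoff convention assumed: the St. Petersburg game pays $2^n if the first head lands on toss n, an event with probability 2^-n; the Petrograd game pays one dollar more on every outcome):

```python
# Sketch of the two games; the 2**n payoff convention is assumed.

def st_petersburg(n):
    return 2 ** n

def petrograd(n):
    return st_petersburg(n) + 1

def partial_expectation(payoff, terms):
    # Sum of probability * payoff over the first `terms` outcomes.
    return sum((0.5 ** n) * payoff(n) for n in range(1, terms + 1))

# Both partial expectations grow without bound, so expected utility is
# silent on which game to prefer.
for terms in (10, 20, 40):
    print(terms, partial_expectation(st_petersburg, terms),
          partial_expectation(petrograd, terms))

# Dominance reasoning settles it: on every possible outcome the
# Petrograd payoff strictly exceeds the St. Petersburg payoff.
assert all(petrograd(n) > st_petersburg(n) for n in range(1, 1000))
```

The final assertion is the outcome-by-outcome comparison the abstract appeals to: no claim about infinite totals is involved, only a pointwise comparison of payoffs.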
In this paper I discuss the kinds of idealisations invoked in normative theories—logic, epistemology, and decision theory. I argue that very often the so-called norms of rationality are in fact mere idealisations invoked to make life easier. As such, these idealisations are not too different from various idealisations employed in scientific modelling. Examples of the latter include: fluids are incompressible (in fluid mechanics), growth rates are constant (in population ecology), and the gravitational influence of distant bodies can be ignored (in celestial mechanics). Thinking of logic, epistemology, and decision theory as normative models employing various idealisations of these kinds changes the way we approach the justification of the models in question.
On December 10, 1991, Charles Shonubi, a Nigerian citizen but a resident of the USA, was arrested at John F. Kennedy International Airport for the importation of heroin into the United States. Shonubi's modus operandi was "balloon swallowing". That is, heroin was mixed with another substance to form a paste and this paste was sealed in balloons which were then swallowed. The idea was that once the illegal substance was safely inside the USA, the smuggler would pass the balloons and recover the heroin. On the date of his arrest, Shonubi was found to have swallowed 103 balloons containing a total of 427.4 grams of heroin. There was little doubt about Shonubi's guilt. In fact, there was considerable evidence that he had made at least seven prior heroin-smuggling trips to the USA (although he was not tried for these). In October 1992 Shonubi was convicted in a United States District Court for possessing and importing heroin. Although the conviction was only for crimes associated with Shonubi's arrest date of December 10, 1991, the sentencing judge, Jack B. Weinstein, also made a finding that Shonubi had indeed made seven prior drug-smuggling trips to the USA. The interesting part of this case was in the sentencing. According to the federal sentencing guidelines, the sentence in cases such as this should depend on the total quantity of heroin involved. This instruction was interpreted rather broadly.
This paper considers a generalisation of the sorites paradox, in which only topological notions are employed. We argue that by increasing the level of abstraction in this way, we see the sorites paradox in a new, more revealing light—a light that forces attention on cut-off points of vague predicates. The generalised sorites paradox presented here also gives rise to a new, more tractable definition of vagueness.
In philosophy of logic and elsewhere, it is generally thought that similar problems should be solved by similar means. This advice is sometimes elevated to the status of a principle: the principle of uniform solution. In this paper I will explore the question of what counts as a similar problem and consider reasons for subscribing to the principle of uniform solution.
This paper is a response to Paul Bartha's 'Making Do Without Expectations'. We provide an assessment of the strengths and limitations of two notable extensions of standard decision theory: relative expectation theory (RET) and Paul Bartha's relative utility theory (RUT). These extensions are designed to provide intuitive answers to some well-known problems in decision theory involving gaps in expectations. We argue that both RET and RUT go some way towards providing solutions to the problems in question, but neither extension solves all the relevant problems.
We argue that standard definitions of 'vagueness' prejudice the question of how best to deal with the phenomenon of vagueness. In particular, the usual understanding of 'vagueness' in terms of borderline cases, where the latter are thought of as truth-value gaps, begs the question against the subvaluational approach. According to this latter approach, borderline cases are inconsistent (i.e., glutty not gappy). We suggest that a definition of 'vagueness' should be general enough to accommodate any genuine contender in the debate over how to best deal with the sorites paradox. Moreover, a definition of 'vagueness' must be able to accommodate the variety of forms sorites arguments can take. These include numerical, total-ordered sorites arguments, discrete versions, continuous versions, as well as others without any obvious metric structure at all. After considering the shortcomings of various definitions of 'vagueness', we propose a very general non-question-begging definition.
This paper explores the scope and limits of rational consensus through mutual respect, with the primary focus on the best known formal model of consensus: the Lehrer–Wagner model. We consider various arguments against the rationality of the Lehrer–Wagner model as a model of consensus about factual matters. We conclude that models such as this face problems in achieving rational consensus on disagreements about unknown factual matters, but that they hold considerable promise as models of how to rationally resolve non-factual disagreements.
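The Lehrer–Wagner model mentioned above works by iterated weighted averaging: each agent assigns respect-weights to every agent, and the group's opinions are repeatedly averaged through the resulting weight matrix. A minimal sketch follows; the weights and initial opinions are invented for illustration.

```python
# Iterated weighted averaging in the Lehrer-Wagner style; all numbers
# below are invented for illustration. Each row gives one agent's
# respect-weights for all agents (rows sum to 1).

def step(weights, opinions):
    # One round: each agent adopts the weighted average of all opinions.
    return [sum(w * p for w, p in zip(row, opinions)) for row in weights]

weights = [
    [0.6, 0.3, 0.1],  # agent 1's weights for agents 1, 2, 3
    [0.2, 0.5, 0.3],
    [0.3, 0.1, 0.6],
]
opinions = [0.9, 0.4, 0.1]  # initial degrees of belief

for _ in range(100):
    opinions = step(weights, opinions)

# With positive, row-stochastic weights the process converges to a
# single shared value: consensus.
assert max(opinions) - min(opinions) < 1e-9
print(opinions)
```

Because every weight is positive, the iteration is guaranteed to converge to a common value; the debate the abstract addresses is whether that mathematically guaranteed consensus is also a *rational* one.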
Mathematics has a great variety of applications in the physical sciences. This simple, undeniable fact, however, gives rise to an interesting philosophical problem: why should physical scientists find that they are unable to even state their theories without the resources of abstract mathematical theories? Moreover, the formulation of physical theories in the language of mathematics often leads to new physical predictions which were quite unexpected on purely physical grounds. It is thought by some that the puzzles the applications of mathematics present are artefacts of out-dated philosophical theories about the nature of mathematics. In this paper I argue that this is not so. I outline two contemporary philosophical accounts of mathematics that pay a great deal of attention to the applicability of mathematics and show that even these leave a large part of the puzzles in question unexplained.
In this paper I present an argument for belief in inconsistent objects. The argument relies on a particular, plausible version of scientific realism, and the fact that often our best scientific theories are inconsistent. It is not clear what to make of this argument. Is it a reductio of the version of scientific realism under consideration? If it is, what are the alternatives? Should we just accept the conclusion? I will argue (rather tentatively and suitably qualified) for a positive answer to the last question: there are times when it is legitimate to believe in inconsistent objects.
Philosophy of ecology has been slow to become established as an area of philosophical interest, but it is now receiving considerable attention. This area holds great promise for the advancement of both ecology and the philosophy of science. Insights from the philosophy of science can advance ecology in a number of ways. For example, philosophy can assist with the development of improved models of ecological hypothesis testing and theory choice. Philosophy can also help ecologists understand the role and limitations of mathematical models in ecology. On the other side, philosophy of science will be advanced by having ecological case studies as part of the stock of examples. Ecological case studies can shed light on old philosophical topics as well as raise novel issues for the philosophy of science. For example, understanding theoretical terms such as "biodiversity" is important for scientific reasons, but such terms also carry political importance. Formulating appropriate definitions for such terms is thus not a purely scientific matter, and this may prompt a reevaluation of philosophical accounts of defining theoretical terms. We consider some of the topics currently receiving attention in the philosophy of ecology and other topics in need of attention. Our aim is to prompt further exchange between ecology and philosophy of science and to help set the agenda for future work in the philosophy of ecology. The topics covered include: the role of mathematical models, environmental problem formulation, biodiversity, and environmental ethics.
Indispensability arguments for realism about mathematical entities have come under serious attack in recent years. To my mind the most profound attack has come from Penelope Maddy, who argues that scientific/mathematical practice doesn't support the key premise of the indispensability argument, that is, that we ought to have ontological commitment to those entities that are indispensable to our best scientific theories. In this paper I defend the Quine/Putnam indispensability argument against Maddy's objections.
In this paper I examine Quine's indispensability argument, with particular emphasis on what is meant by 'indispensable'. I show that confirmation theory plays a crucial role in answering this question and that once indispensability is understood in this light, Quine's argument is seen to be a serious stumbling block for any scientific realist wishing to maintain an anti-realist position with regard to mathematical entities.
We argue that explanations appealing to logical impossibilities are genuine explanations. Our defense is based on a certain picture of impossibility: namely, that there are impossibilities and that the impossibilities have structure. Assuming this broad picture of impossibility, we defend the genuineness of explanations that appeal to logical impossibilities against three objections: first, that such explanations are at odds with the perceived conceptual connection between explanation and counterfactual dependence; second, that there are no genuinely contrastive why-questions that involve logical impossibilities; and, third, that explanations appealing to logical impossibilities rule nothing out.
The Pasadena paradox presents a serious challenge for decision theory. The paradox arises from a game that has well-defined probabilities and utilities for each outcome, yet, apparently, does not have a well-defined expectation. In this paper, I argue that this paradox highlights a limitation of standard decision theory. This limitation can be (largely) overcome by embracing dominance reasoning and, in particular, by recognising that dominance reasoning can deliver the correct results in situations where standard decision theory fails. This, in turn, pushes us towards pluralism about decision rules.
At various times, mathematicians have been forced to work with inconsistent mathematical theories. Sometimes the inconsistency of the theory in question was apparent (e.g. the early calculus), while at other times it was not (e.g. pre-paradox naïve set theory). The way mathematicians confronted such difficulties is the subject of a great deal of interesting work in the history of mathematics but, apart from the crisis in set theory, there has been very little philosophical work on the topic of inconsistent mathematics. In this paper I will address a couple of philosophical issues arising from the applications of inconsistent mathematics. The first is the issue of whether finding applications for inconsistent mathematics commits us to the existence of inconsistent objects. I then consider what we can learn about a general philosophical account of the applicability of mathematics from successful applications of inconsistent mathematics.
The idea that the phenomenon of vagueness might be modelled by a paraconsistent logic has been little discussed in contemporary work on vagueness, just as the idea that paraconsistent logics might be fruitfully applied to the phenomenon of vagueness has been little discussed in contemporary work on paraconsistency. This is prima facie surprising, given that the earliest formalisations of paraconsistent logics, presented in Jáskowski and Halldén, were put forward as logics of vagueness. One possible explanation for this is that, despite initial advocacy by pioneers of paraconsistency, the prospects for a paraconsistent account of vagueness are so poor as to warrant little further consideration. In this paper we look at the reasons that might be offered in defence of this negative claim. As we shall show, they are far from compelling. Paraconsistent accounts of vagueness deserve further attention.
The main focus of the book is the presentation of the 'inertial' view of population growth. This view provides a rather simple model for complex population dynamics, and is achieved at the level of the single species without invoking species interactions. An important part of this account is the maternal effect. Investment of mothers in the quality of their daughters makes the rate of reproduction of the current generation depend not only on the current environment, but also on the environment experienced by the previous generation.
In this article, I discuss an argument that purports to prove that probability theory is the only sensible means of dealing with uncertainty. I show that this argument can succeed only if some rather controversial assumptions about the nature of uncertainty are accepted. I discuss these assumptions and provide reasons for rejecting them. I also present examples of what I take to…
Consider the following denumerably infinite sequence of sentences:

(s_1) For all k > 1, s_k is not true.
(s_2) For all k > 2, s_k is not true.
(s_3) For all k > 3, s_k is not true.
…
Mark Balaguer's project in this book is extremely ambitious; he sets out to defend both platonism and fictionalism about mathematical entities. Moreover, Balaguer argues that at the end of the day, platonism and fictionalism are on an equal footing. Not content to leave the matter there, however, he advances the anti-metaphysical conclusion that there is no fact of the matter about the existence of mathematical objects. Despite the ambitious nature of this project, for the most part Balaguer does not shortchange the reader on rigor; all the main theses advanced are argued for at length and with remarkable clarity and cogency. There are, of course, gaps in the account but these should not be allowed to overshadow the sig…
A common response to those who question the Law of Non-Contradiction (LNC) is that it is impossible to debate such a fundamental law of logic. The reasons for this response vary, but what seems to underlie them is the thought that there is a minimal set of logical resources without which rational debate is impossible. This chapter argues that this response is misguided. First, it defends non-apriorism in logic: the view that logic is in the same epistemic boat as other scientific theories. It then offers an account of logical theory change in terms of this epistemology. The LNC is discussed in terms of this account of logical theory change, and it is shown that rational debate over this law can, and does, proceed. Finally, arguments for and against the LNC are discussed, and it is shown how and where non-a priori considerations arise in these arguments.
It has been argued in the conservation literature that giving conservation absolute priority over competing interests would best protect the environment. Attributing infinite value to the environment or claiming it is 'priceless' are two ways of ensuring this priority (e.g. Hargrove 1989; Bulte and van Kooten 2000; Ackerman and Heinzerling 2002; McCauley 2006; Halsing and Moore 2008). But such proposals would paralyse conservation efforts. We describe the serious problems with these proposals and what they mean for practical applications, and we diagnose and resolve some conceptual confusions permeating the literature on this topic.
This paper examines the paradox of revisability. This paradox was proposed by Jerrold Katz as a problem for Quinean naturalised epistemology. Katz employs diagonalisation to demonstrate what he takes to be an inconsistency in the constitutive principles of Quine's epistemology. Specifically, the problem seems to rest with the principle of universal revisability which states that no statement is immune to revision. In this paper it is argued that although there is something odd about employing universal revisability to revise itself, there is nothing paradoxical about this. At least, there is no paradox along the lines suggested by Katz.