In recent years, the value-freeness of science has come under extensive critique. Early objectors to the notion of value-free science can be found in Rudner and Churchman, later objections occur in Leach and Gaa, and more recent critics are Kitcher, Douglas, and Elliott. The goal of this paper is to examine and critique two arguments opposed to the notion of a value-free science. The first argument, the uncertainty argument, cites the endemic uncertainty of science and concludes that values are needed to give direction to scientific investigation. The second, or moral argument, cites the fact that scientists have moral obligations just like everyone else, and…
Jean Perrin’s proof in the early 20th century of the reality of atoms and molecules is often taken as an exemplary form of robustness reasoning, where an empirical result receives validation if it is generated using multiple experimental approaches. In this paper, I describe in detail Perrin’s style of reasoning, and locate both qualitative and quantitative forms of argumentation. In particular, I argue that his quantitative style of reasoning has mistakenly been viewed as a form of robustness reasoning, whereas I believe it is something different, what I call ‘calibration’. From this perspective, I re-evaluate recent interpretations of Perrin provided by Stathis Psillos, Peter Achinstein, Alan Chalmers, and Bas van Fraassen, all of whom read Perrin as a robustness reasoner, though not necessarily in the same sort of way. I then argue that by viewing Perrin as a ‘calibration’ reasoner we gain a better understanding of why he believes himself to have established the reality of atoms and molecules. To conclude, I provide an alternative and more productive understanding of the basis of the dispute between realists and anti-realists.
My task in this paper is to defend the legitimacy of historicist philosophy of science, defined as the philosophic study of science that takes seriously case studies drawn from the practice of science. Historicist philosophy of science suffers from what I call the ‘evidence problem’. The worry is that case studies cannot qualify as rigorous evidence for the adjudication of philosophic theories. I explore the reasons why one might deny to historical cases a probative value, then reply to these reasons on behalf of historicism. The main proponents of the view I am criticizing are Pitt (2001) and Rasmussen (2001).
In Seeing Things, Robert Hudson argues that robustness reasoning lacks the special value it is often claimed to have. Robustness reasoning holds that an observation report is more likely to be true if the report is produced by multiple, independent sources.
Culp (1994) provides a defense for a form of experimental reasoning called 'robustness'. Her strategy is to examine a recent episode in experimental microbiology--the case of the mistaken discovery of a bacterial organelle called a 'mesosome'--with an eye to showing how experimenters effectively used robust experimental reasoning (or could have used robust reasoning) to refute the existence of the mesosome. My plan is to criticize Culp's assessment of the mesosome episode and to cast doubt on the epistemic significance of robustness. In turn, I present a different account of the experimental reasoning microbiologists used in arriving at the conclusion that mesosomes are artifacts. I call this form of reasoning 'reliable process reasoning', and close the paper with a brief discussion of how experimental microbiologists justify the claim that an experimental process is reliable.
According to the methodological principle called ‘robustness’, empirical evidence is more reliable when it is generated using multiple, independent (experimental) routes that converge on the same result. As it happens, robustness as a methodological strategy is quite popular amongst philosophers. However, despite its popularity, my goal here is to criticize the value of this principle on historical grounds. My historical reasons take into consideration some recent history of astroparticle physics concerning the search for WIMPs (weakly interacting massive particles), one of the main candidates for cosmic dark matter. On the basis of these reasons, I assert that robustness, at least in the historical case under consideration, has less value than usually assumed by philosophers.
In this paper I distinguish two kinds of predictivism, ‘timeless’ and ‘historicized’. The former is the conventional understanding of predictivism. However, I argue that its defense in the works of John Worrall (Scerri and Worrall 2001, Studies in History and Philosophy of Science 32, 407–452; Worrall 2002, In the Scope of Logic, Methodology and Philosophy of Science, 1, 191–209) and Patrick Maher (Maher 1988, PSA 1988, 1, p. 273) is wanting. Alternatively, I promote an historicized predictivism, and briefly defend such a predictivism at the end of the paper.
The goal of this paper is to defend the claim that there is such a thing as direct perception, where by ‘direct perception’ I mean perception unmediated by theorizing or concepts. The basis for my defense is a general philosophic perspective which I call ‘empiricist philosophy’. In brief, empiricist philosophy (as I have defined it) is untenable without the occurrence of direct perception. It is untenable without direct perception because, otherwise, one can't escape the hermeneutic circle, as this phrase is used in van Fraassen (1980). The bulk of the paper is devoted to defending my belief in direct perception against various objections that can be posed against it. I discuss various anticipations of my view found in the literature, eventually focusing on Ian Hacking's related conception of ‘entity realism’ (Hacking 1983). Hacking has been criticized by a number of philosophers and my plan is to respond to these criticisms on behalf of entity realism (or more precisely on behalf of the claim that direct perception is a reality) and to then respond to other possible criticisms that can be launched against direct perception.
It has become more common recently for epistemologists to advocate the pragmatic encroachment on knowledge, the claim that the appropriateness of knowledge ascriptions is dependent on the relevant practical circumstances. Advocacy of practicalism in epistemology has come at the expense of contextualism, the view that knowledge ascriptions are independent of pragmatic factors and depend alternatively on distinctively epistemological, semantic factors, with the result that knowledge ascriptions express different knowledge properties on different occasions of use. Overall, my goal here is to defend a particular version of contextualism drawn from work by Peter Ludlow, called ‘standards contextualism’. My strategy will be to elaborate on this form of contextualism by defending it from various objections raised by the practicalists Jason Stanley, Jeremy Fantl and Matthew McGrath. In showing how standards contextualism can effectively repel these criticisms I hope to establish that standards contextualism is a viable alternative to practicalism.
Virtue epistemology is faced with the challenge of establishing the degree to which a knower’s cognitive success is attributable to her cognitive ability. As Duncan Pritchard notes, in some cases one is inclined to a strong version of virtue epistemology, one that requires cognitive success to be because of the exercise of the relevant cognitive abilities. In other cases, a weak version of virtue epistemology seems preferable, where cognitive success need only be the product of cognitive ability. Pritchard’s preference, with his anti-luck virtue epistemology, is for the latter. But as Christoph Kelp has recently argued, this preference is not without controversy. Notably, Kelp argues that Pritchard on the basis of his anti-luck virtue epistemology is impelled to cast the wrong judgment in a case that Pritchard himself discusses many times in his writings, the so-called ‘Temp case’. Though Pritchard argues that Temp lacks knowledge because his cognitive success is not a result of his cognitive ability, I concur with Kelp that Pritchard’s epistemology should in fact attribute knowledge to Temp, and show this by locating weaknesses in three distinct arguments Pritchard uses to show that Temp lacks knowledge. I subsequently argue that if Pritchard wishes to persist in denying knowledge to Temp, he should endorse what I call the ‘true description’ requirement. I close the paper by providing an argument for this requirement, controversial though it is.
In 1912, Henri Poincaré published an argument which apparently shows that the hypothesis of quanta is both necessary and sufficient for the truth of Planck's experimentally corroborated law describing the spectral distribution of radiant energy in a black body. In a recent paper, John Norton has reaffirmed the authority of Poincaré's argument, setting it up as a paradigm case in which empirical data can be used to definitively rule out theoretical competitors to a given theoretical hypothesis. My goal is to dispute Norton's claim that there is no theoretical underdetermination problem arising between classical physics and early quantum theory. The strategy I use in defending my view is to adopt a suggestion made by Jarrett Leplin and Larry Laudan on how to assess the relative merits of competing theoretical alternatives, where each alternative has an equal capacity to save the phenomena. In the course of the paper, I distinguish between two branches of classical physics: classical mechanics and classical electromagnetism. The former is claimed by Norton and Poincaré to be determinately ruled out by the black body evidence; and it is the former that I argue is compatible with this evidence.
The WIMP (weakly interacting massive particle) is currently the leading candidate for what is thought to be dark matter, the cosmological material claimed to make up almost 99% of the matter of the universe and which is indiscernible by means of electromagnetic radiation. There are many research groups dedicated to experimentally isolating WIMPs, and in this paper we describe the work of three of these groups, the Saclay group, DAMA and UKDM. This exploration into the recent history of astroparticle physics serves to illuminate two philosophical issues. First, is confirmatory evidence more compelling if it coordinates results gleaned from independent experimental investigations? Second, in justifying experimental conclusions, how strong must this justification be? Are the high standards set by philosophers, in the spirit of Descartes, relevant to experimental research?
Kurt Gödel criticizes Rudolf Carnap's conventionalism on the grounds that it relies on an empiricist admissibility condition, which, if applied, runs afoul of his second incompleteness theorem. Thomas Ricketts and Michael Friedman respond to Gödel's critique by denying that Carnap is committed to Gödel's admissibility criterion; in effect, they are denying that Carnap is committed to any empirical constraint in the application of his principle of tolerance. I argue in response that Carnap is indeed committed to an empirical requirement vis‐à‐vis tolerance, a fact that becomes clear upon closer scrutiny of Carnap's relevant writings.
Thomas Kuhn (in The Structure of Scientific Revolutions) and Alan Musgrave argue that it is impossible to precisely date discovery events and precisely identify discoverers. They defend this claim mainly on the grounds that so-called discoverers have in many cases misconceived the objects of discovery. In this paper, I argue that Kuhn and Musgrave arrive at their view because they lack a substantive account of how well discoverers must be able to conceptualize discovered objects. I remedy this deficiency by providing just such an account, and with this account I delineate how one can secure precision regarding the identity of discoverers and the times of discoveries. Near the end of my paper I bring my target of criticism up-to-date; it turns out that Steve Woolgar adopts an approach to discovery kindred to those of Kuhn and Musgrave, and I close the paper by discussing what is at stake in rebutting him.
It is often claimed that anti-realists are compelled to reject the inference of the knowability paradox, that there are no unknown truths. I call those anti-realists who feel so compelled ‘faint-hearted’, and argue in turn that anti-realists should affirm this inference, if anti-realism is to be consistent. A major part of my strategy in defending anti-realism is to formulate an anti-realist definition of truth according to which a statement is true only if it is verified by someone, at some time. I also liberalize what is meant by a verification to allow for indirect forms of verification. From this vantage point, I examine a key objection to anti-realism, that it is committed to the necessary existence of minds, and reject a response to this problem set forth by Michael Hand. In turn I provide a more successful anti-realist response to the necessary minds problem that incorporates what I call an ‘agential’ view of verification. I conclude by considering what intellectual cost there is to being an anti-realist in the sense I am advocating.
In this paper, I raise some questions about Pritchard’s internalist argument for scepticism. I argue that his internalism begs the question in support of scepticism. Correlatively, I advance what I take to be a better internalist argument for scepticism, one that leaves open the possibility of empirically adjudicating sceptical hypotheses. I close by discussing what it means to be an internalist.
Recent scholarship (mainly by Michael Friedman, but also by Thomas Uebel) on the philosophy of Rudolf Carnap covering the period from the publication of Carnap’s 1928 book Der Logische Aufbau der Welt through to the mid-to-late 1930s has tended to view Carnap as espousing a form of conventionalism (epitomized by his adoption of the principle of tolerance) and not a form of empirical foundationalism. On this view, it follows that Carnap’s 1934 The Logical Syntax of Language is the pinnacle of his work during this era, this book having developed in its most complete form the conventionalist approach to dissolving the pseudoproblems that often attend philosophical investigation. My task in this paper, in opposition to this trend, is to resuscitate the empiricist interpretation of Carnap’s work during this time period. The crux of my argument is that Carnap’s 1934 book, by eschewing for the most part the empiricism he espouses in the Aufbau and in his 1932 The Unity of Science, is led to a form of conventionalism that faces the serious hazard of collapsing into epistemological relativism. My speculation is that Carnap came to recognize this deficiency in his 1934 book, and in subsequent work (“Testability and Meaning”, published in 1936/37) felt the need to re-instate his empiricist agenda. This subsequent work provides a much improved empiricist epistemology from Carnap’s previous efforts and, as history informs us, sets the standard for future research in the theory of confirmation.
My goal in this paper is to consider two separate but connected topics, one historical, the other philosophical. The first topic concerns the forms of reasoning contemporary experimental astrophysicists use to investigate the existence of WIMPs (weakly interacting massive particles). This reasoning takes two forms, one model-dependent and the other model-independent, and we examine the arguments one WIMP research group (DAMA) uses to support the latter. The second topic concerns recent support Kent Staley has offered for a form of scientific reasoning called ‘robustness’, and I argue that the model-independent strategy propounded by DAMA improves on robustness.
In his recent article, Nicolas Rasmussen (2001) is harshly critical of what he terms 'empirical philosophy of science', a philosophy that takes seriously the history of science in advancing philosophical pronouncements about science. He motivates his criticism by reflecting on recent history in microbiology involving the 'discovery' of a new bacterial organelle, the mesosome, during the 1950s and 1960s, and the subsequent retraction of this discovery by experimental microbiologists during the late 1970s and early 1980s. In particular, he argues that there was a lack of constancy in the methods microbiologists used in approaching the issue of the existence of mesosomes, and that in fact a similar sort of 'methodological flux' pervades all experimental work. My goal here is to refute Rasmussen's doctrine of flux, and in turn to re-establish order in our understanding of the methods and strategies of experimenters. My strategy in achieving this goal is to re-visit the same crucial research articles in the history of the mesosome episode that Rasmussen (2001) visits; and what I find upon returning to this literature is not flux, as Rasmussen seems to find, but a constancy of method in experimental reasoning, a constancy codified by what I call 'reliable process reasoning'.
Experimental data are often acclaimed on the grounds that they can be consistently generated. They are, it is said, reproducible. In this paper I describe how this feature of experimental data (their pragmatic reliability) leads to their epistemic worth (their epistemic reliability). An important part of my description is the supposition that experimental procedures are to a certain extent fixed and stable. Various illustrations from the actual practice of science are introduced, the most important coming at the end of the paper with a discussion of Ray Davis' 1967 solar-neutrino detection experiment (as it is portrayed in Pinch, 1980).
This collection of essays aims to investigate the complex issues surrounding contemporary cultural discourses on land and identity – their production, construction, and reconstruction across a range of different texts and materials. The chapters offer disciplinary and trans-disciplinary approaches opening up discussion and new routes for research in a number of interrelated areas such as Countryside vs. City, Diaspora, Landscapes of Memory and Trauma, Migrational Spaces, and Ecology. They represent a number of innovative contemporary responses to how concepts of land intersect and dialogue with notions of identity across and between regions, nations, races, and cultures. Through employing interdisciplinary methods and theories drawn from diverse sources, such as cultural studies, spatial theory, philosophy and literary theory, the chapters chart varied and complex themes of identity formation in relation to spatiality.