Scientific realists claim we can justifiably believe that science is getting at the truth. But they have long faced historical challenges: various episodes across history appear to demonstrate that even strongly supported scientific theories can be overturned and left behind. In response, realists have developed new positions and arguments. As a result of specific challenges from the history of science, and realist responses, we find ourselves with an ever-increasing data set bearing on the (possible) relationship between science and truth. The present volume introduces new historical cases impacting the debate, and advances the discussion of cases that have only very recently been introduced. At the same time, shifts in philosophical positions affect the very kind of case study that is relevant. Thus the historical work must proceed hand in hand with philosophical analysis of the different positions and arguments in play. It is with this in mind that the volume is divided into two sections, entitled “Historical cases for the debate” and “Contemporary scientific realism.” All sides agree that historical cases are informative with regard to how, or whether, science connects with truth. Defying proclamations as early as the 1980s announcing the death knell of the scientific realism debate, here is that rare thing: a philosophical debate making steady and definite progress. Moreover, the progress it is making concerns one of humanity’s most profound and important questions: the relationship between science and truth, or, put more boldly, the epistemic relation between humankind and the reality in which we find ourselves.
Although scientific realism is the default position in the life sciences, philosophical accounts of realism are geared towards physics and run into trouble when applied to fields such as biology or neuroscience. In this paper, I formulate a new robustness-based version of entity realism, and show that it provides a plausible account of realism for the life sciences that is also continuous with scientific practice. It is based on the idea that if there are several independent ways of measuring, detecting or deriving something, then we are justified in believing that it is real. I also consider several possible objections to robustness-based entity realism, discuss its relationship to ontic structural realism, and show how it has the potential to provide a novel response to the pessimistic induction argument.
If scientists embrace scientific realism, they can use a scientific theory to explain and predict both observables and unobservables. If, however, they embrace scientific antirealism, they cannot use a scientific theory to explain observables and unobservables, nor to predict unobservables. Given that explanation and prediction are means of making scientific progress, scientists can make more scientific progress if they embrace scientific realism than if they embrace scientific antirealism.
First, I identify a methodological thesis associated with scientific realism. This has different variants, but each concerns the reliability of scientific methods in connection with acquiring, or approaching, truth or approximate truth. Second, I show how this thesis bears on what scientists should do when considering new theories that significantly contradict older theories. Third, I explore how vulnerable scientific realism is to a reductio ad absurdum as a result. Finally, I consider which variants of the methodological thesis are the most defensible in light of the earlier findings.
Does science move toward truths? Are present scientific theories (approximately) true? Should we invoke truths to explain the success of science? Do our cognitive faculties track truths? Some philosophers say yes, while others say no, to these questions. Interestingly, both groups use the same scientific theory, viz., evolutionary theory, to defend their positions. I argue that it begs the question for the former group to do so because their positive answers imply that evolutionary theory is warranted, whereas it is self-defeating for the latter group to do so because their negative answers imply that evolutionary theory is unwarranted.
The pessimistic induction is built upon the uniformity principle that the future resembles the past. In daily scientific activities, however, scientists sometimes rely on what I call the disuniformity principle: that the future differs from the past. They do not give up their research projects despite repeated failures. They believe that they will succeed although they have failed repeatedly, and as a result they achieve what they intended to achieve. Given that the disuniformity principle is useful in certain cases in science, we might reasonably use it to infer that present theories are true, unlike past theories. Hence, pessimists bear the burden of showing that our prediction about the fate of present theories is more likely to be true if we use the uniformity principle than if we use the disuniformity principle.
It has recently been suggested that realist responses to historical cases featured in pessimistic meta-inductions are not as successful as previously thought. In response, selective realists have updated the basic divide et impera strategy specifically to take such cases into account and to argue that more modern realist accounts are immune to the historical challenge. Using a case study—that of the nineteenth-century zymotic theory of disease—I argue that these updated proposals fail and that even the most sophisticated recent realist accounts remain vulnerable to the challenge from history.
In this paper, I examine the transition from zymotic views of disease to germ views in Britain in the mid-1800s. I argue that neither realist nor anti-realist accounts of theory-change can account for this case, because both rely on a well-defined notion of theory, which, as the paper will show, is inapplicable in this instance. After outlining the zymotic theory of disease, I show that, even though it hardly had anything in common with the germ theory, it was highly successful. However, despite this success, it is not possible to identify stable elements that were carried over to the germ theory; thus, realists cannot account for the shift from one to the other. Anti-realists, however, don’t do much better: their focus tends to be on (radical) discontinuities across theories, yet the zymotic case does not exemplify this, either. Instead, there is a slow and complex evolution from zymotic to germ views, during which various zymotic elements are assimilated into the germ theory, until, eventually, none of the zymotic theory’s original elements are left.
Robustness is often presented as a guideline for distinguishing the true or real from mere appearances or artifacts. Most recent discussions of robustness have focused on the kind of derivational robustness analysis introduced by Levins, while the related but distinct idea of robustness as multiple accessibility, defended by Wimsatt, has received less attention. In this paper, I argue that the latter kind of robustness, when properly understood, can provide justification for ontological commitments. The idea is that we are justified in believing that things studied by science are real insofar as we have robust evidence for them. I develop and analyze this idea in detail, and based on concrete examples show that it plays an important role in science. Finally, I demonstrate how robustness can be used to clarify the debate on scientific realism and to formulate new arguments.
I review prominent historical arguments against scientific realism to indicate how they display a systematic overshooting in the conclusions drawn from the historical evidence. The root of the overshooting can be located in some critical, undue presuppositions regarding realism. I will highlight these presuppositions in connection with both Laudan’s ‘Old induction’ and Stanford’s New induction, and then delineate a minimal realist view that does without the problematic presuppositions.
A scientific community can be modeled as a collection of epistemic agents attempting to answer questions, in part by communicating about their hypotheses and results. We can treat the pathways of scientific communication as a network. When we do, it becomes clear that the interaction between the structure of the network and the nature of the question under investigation affects epistemic desiderata, including accuracy and speed to community consensus. Here we build on previous work, both our own and others’, in order to get a firmer grasp on precisely which features of scientific communities interact with which features of scientific questions in order to influence epistemic outcomes.
The most influential arguments for scientific realism remain centrally concerned with an inference from scientific success to the approximate truth of successful theories. Recently, however, and in response to antirealists' objections from radical discontinuity within the history of science, the arguments have been refined. Rather than target entire theories, realists narrow their commitments to only certain parts of theories. Despite an initial plausibility, the selective realist strategy faces significant challenges. In this article, I outline four prerequisites for a successful selective realist defence and argue that adopting a comparative sense of success both satisfies those requirements and, partly in consequence, provides a more compelling, albeit more modest, realist thesis.
Scientific realism is the position that success of a scientific theory licenses an inference to its approximate truth. The argument from pessimistic meta-induction maintains that this inference is undermined due to the existence of theories from the history of science that were successful, but false. I aim to counter pessimistic meta-induction and defend scientific realism. To do this, I adopt a notion of success that admits of degrees, and show that our current best theories enjoy far higher degrees of success than any of the successful, but refuted theories of the past.
This paper utilizes a logical correspondence theorem (which has been proved elsewhere) for the justification of weak conceptions of scientific realism and convergence to truth which do not presuppose Putnam's no-miracles argument (NMA). After presenting arguments against the reliability of the unrestricted NMA in Sect. 1, the correspondence theorem is explained in Sect. 2. In Sect. 3, historical illustrations of the correspondence theorem are given, and its ontological consequences are worked out. Based on the transitivity of the concept of correspondence, a correspondence-based notion of convergence to truth is developed in Sect. 4. In the final Sect. 5 it is argued that the correspondence theorem together with the assumption of 'minimal realism' yields a justification of a weak version of scientific realism, which is then compared to metaphysical realism and to instrumentalism.
Inferences from scientific success to the approximate truth of successful theories remain central to the most influential arguments for scientific realism. Challenges to such inferences, however, based on radical discontinuities within the history of science, have motivated a distinctive style of revision to the original argument. Conceding the historical claim, selective realists argue that accompanying even the most revolutionary change is the retention of significant parts of replaced theories, and that a realist attitude towards the systematically retained constituents of our scientific theories can still be defended. Selective realists thereby hope to secure the argument from success against apparent historical counterexamples. Independently of that objective, historical considerations have inspired a further argument for selective realism, where evidence for the retention of parts of theories is itself offered as justification for adopting a realist attitude towards them. Given the nature of these arguments from success and from retention, a reasonable expectation is that they would complement and reinforce one another, but although several theses purport to provide such a synthesis the results are often unconvincing. In this paper I reconsider the realist’s favoured type of scientific success, novel success, offer a revised interpretation of the concept, and argue that a significant consequence of reconfiguring the realist’s argument from success accordingly is a greater potential for its unification with the argument from retention.
In this response, doubts are expressed relating to the treatment by Hoyningen-Huene and Oberheim of the relation between incommensurability and content comparison. A realist response is presented to their treatment of ontological replacement. Further questions are raised about the coherence of the neo-Kantian idea of the world-in-itself as well as the phenomenal worlds hypothesis. The notion of common sense is clarified. Meta-incommensurability is dismissed as a rhetorical device which obstructs productive discussion.
In recent years, two challenges stand out against scientific realism: the argument from the underdetermination of theories by evidence (UTE) and the pessimistic induction argument (PI). In his book, Kyle Stanford accepts the gravity of these challenges, but argues that the most serious and powerful challenge to scientific realism has been neglected. The problem of unconceived alternatives (PUA), as he calls it, is introduced in chapter one and refined in chapter two. In short, PUA holds that throughout history scientists have failed to conceive alternative theories roughly as well confirmed by the available evidence as the theories of the day and, crucially, that such alternatives eventually were conceived and adopted by some section of the scientific community. PUA is a version of UTE, but, unlike its kin, enjoys substantial historical support. It leads to a sort of pessimistic induction that Stanford brands ‘the new induction’ (NI), according to which we should be doubtful about the truth claims of current theories since the historical record suggests that unconceived alternatives are typically lurking in the shadows. His proposal contains two important shifts of focus: First, there is a shift from artificially produced rival theories, of the kind typically talked about in the underdetermination debate, to actual rivals. Second, instead of focusing on empirically equivalent rivals, he urges a shift to rivals that are more or less as well confirmed as existing theories by the available evidence at a given point in time. Prima facie, PUA sounds like a welcome addition to the anti-realist arsenal, drawing on historical evidence to support the induction that current theories probably face genuine alternatives waiting to be conceived.
The paper examines the differences between Kuhn's account, in The Structure of Scientific Revolutions, of the sciences as necessarily communal activities with internally set standards of procedure and achievement, and that view of the sciences which calls itself ‘Scientific Realism’ and regards them as striving toward, and perhaps asymptotically approaching, some external and objective reality that bestows truth or falsity on scientific theories. The main argument turns on Poincaré's demonstration that Newton's Second Law (f = ma) is not a testable, provable proposition with a truth value, but something that is simply adopted. It is adopted in the light of experience, certainly, but there is no logical necessity in the adoption. My suggestion is that it is a ‘way of looking’ and ‘a method of analysis’ and that the necessity of its adoption by any individual lies in its being a necessary condition of entry into the scientific community. That community itself adopts ways of looking or methods of analysis for their fruitfulness in dealing with old problems and defining new ones. Incoherences in the ‘approach’ account of scientific progress are looked at, and the individualistic assumptions that motivate it. These require the sciences to be presented as the source and basis of agreement and community amongst separated individuals. This picture and its requirement inverts reality as well as Kuhn's account, which makes community and agreement the starting point. The notion of reality as a transcendental convergence point becomes redundant. The old problem of the incommensurability of paradigms is discussed by relating them to the notions of ways of looking and methods of analysis. These may be incompatible in that one cannot look at things in two different ways at once, but at the same time they cannot be measured on any common scale.
We presuppose a position of scientific realism to the effect (i) that the world exists and (ii) that through the working out of ever more sophisticated theories our scientific picture of reality will approximate ever more closely to the world as it really is. Against this background consider, now, the following question: 1. Do the empirical theories with the help of which we seek to approximate a good or true picture of reality rest on any non-empirical presuppositions? One can answer this question with either a 'yes' or a 'no'. 'No' is the preferred answer of most contemporary methodologists -- Murray Rothbard is one distinguished counterexample to this trend -- who maintain that empirical theories are completely free of non-empirical ('a priori') admixtures and who see science as a matter of the gathering of pure 'data' obtained through simple observation. From such data scientific propositions are then supposed to be somehow capable of being established.
Scientific Realists argue that it would be a miracle if scientific theories were getting more predictive without getting closer to the truth; so they must be getting closer to the truth. Van Fraassen, Laudan et al. argue that owing to the underdetermination of theory by data (UDT), for all we know, it is a miracle, a fluke. So we should not believe in even the approximate truth of theories. I argue that there is a test for who is right: suppose we are at the limit of inquiry. Suppose that we then have all the logically possible theories that are adequate to all the actual data. If they all co-resembled in their theoretical claims, since one of them must be true, all of them would then resemble it, whichever it is. We would thus be justified in saying they all approximated the truth in the degree to which they co-resembled. If they don't all co-resemble, the scientific realists are wrong; more predictive theories are not necessarily closer to the theoretical truth. Prior to the limit, if, in spite of our best efforts to the contrary, all the theories we can make adequate to current data tend to co-resemble, we have inductive warrant for thinking more predictive theories are closer to the truth. If they don't co-resemble, we have inductive warrant for thinking that more predictive theories are not necessarily closer to the truth.
Pure causal theories of reference cannot account for cases of theoretical term reference failure and do not capture the scientific point of introducing new theoretical terminology. In order to account for paradigm cases of reference failure and the point of new theoretical terminology, a descriptive element must play a role in fixing the reference of theoretical terms. Richard Boyd's concept of theory-constitutive metaphors provides the necessary descriptive element in reference fixing. In addition to providing a plausible account of reference failure and success, a metaphor approach to reference fixing provides the basis for a plausible realist account of the progress of science. Indeed, the metaphor approach undermines the sceptical force of the meta-induction and Laudan's objections to scientific realism.
Although our theories are not precisely true, scientific realists contend that we should admit their objects into our ontology. One justification, offered by Sellars and Putnam, is that current theories belong to series that converge to ideally adequate theories. I consider the way the commitment to convergence reflects on the interpretation of lawlike claims. I argue that the distinction between lawlike and accidental generalizations depends on our cognitive interests and reflects our commitment to the direction of scientific progress. If the sciences disagree about the lawlikeness of some generalization(s), as an argument of Davidson's suggests, it follows from the interest-relativity of lawlikeness that the laws of a science do not determine the essences of their objects. I conclude that this form of scientific realism provides no metaphysical support for essentialism.