What are the features of a good scientific theory? Samuel Schindler's book revisits this classical question in the philosophy of science and develops new answers to it. Theoretical virtues matter not only for choosing theories 'to work with', but also for what we are justified in believing: only if the theories we possess are good ones can we be confident that our theories' claims about nature are actually correct. Recent debates have focussed rather narrowly on a theory's capacity to predict new phenomena successfully, but Schindler argues that the justification for this focus is thin. He discusses several other theory properties such as testability, accuracy, and consistency, and highlights the importance of simplicity and coherence. Using detailed historical case studies and careful philosophical analysis, Schindler challenges the received view of theoretical virtues and advances arguments for the view that science uncovers reality through theory.
Predictivism is the view that successful predictions of “novel” evidence carry more confirmational weight than accommodations of already known evidence. Novelty, in this context, has traditionally been conceived of as temporal novelty. However, temporal predictivism has been criticized for lacking a rationale: why should the time order of theory and evidence matter? Instead, it has been proposed, novelty should be construed in terms of use-novelty, according to which evidence is novel if it was not used in the construction of a theory. Only if evidence is use-novel can it fully support the theory entailing it. As I point out in this paper, the writings of the most influential proponent of use-novelty contain a weaker and a stronger version of use-novelty. However, both versions, I argue, are problematic. With regard to the appraisal of Mendeleev’s periodic table, the most contentious historical case in the predictivism debate, I argue that temporal predictivism is indeed supported, although in ways not previously appreciated. On the basis of this case, I argue for a form of so-called symptomatic predictivism according to which temporally novel predictions carry more confirmational weight only insofar as they reveal the theory’s presumed coherence of facts as real.
The thesis of theory-ladenness of observations, in its various guises, is widely considered as either ill-conceived or harmless to the rationality of science. The latter view rests partly on the work of the proponents of New Experimentalism, who have argued, among other things, that experimental practices are efficient in guarding against any epistemological threat posed by theory-ladenness. In this paper I show that one can generate a thesis of theory-ladenness for experimental practices from an influential New Experimentalist account. The notion I introduce for this purpose is the concept of ‘theory-driven data reliability judgments’ (TDRs), according to which theories that are sought to be tested with a particular set of data guide reliability judgments about those very same data. I provide various prominent historical examples to show that TDRs are used by scientists to resolve data conflicts. I argue that the rationality of the practices which employ TDRs can be saved if the independent support of the theories driving TDRs is construed in a particular way.
In this article I assess Alisa Bokulich’s idea that explanatory model fictions can be genuinely explanatory. I draw attention to a tension in her account between the claim that model fictions are explanatorily autonomous, and the demand that model fictions be justified in order for them to be genuinely explanatory. I also explore the consequences that arise from Bokulich’s use of Woodward’s account of counterfactual explanation and her abandonment of Woodward’s notion of an intervention. As it stands, Bokulich’s account must be deemed unworkable.
In a recent book and an article, Carl Craver construes the relations between different levels of a mechanism, which he also refers to as constitutive relations, in terms of mutual manipulability (MM). Interpreted metaphysically, MM implies that inter-level relations are symmetrical. MM thus violates one of the main desiderata of scientific explanation, namely explanatory asymmetry. Parts of Craver’s writings suggest a metaphysical interpretation of MM, and Craver explicitly commits to constitutive relationships being symmetrical. The paper furthermore explores the option of interpreting MM epistemologically, as a means for individuating mechanisms. It is argued that MM is then redundant. MM is therefore better abandoned.
A theory’s fertility is one of the standard theoretical virtues. But how is it to be construed? In current philosophical discourse, particularly in the realism debate, theoretical fertility is usually understood in terms of novel success: a theory is fertile if it manages to make successful novel predictions. Another, more permissive, notion of fertility can be found in the work of Ernan McMullin. This kind of fertility, McMullin claims, gives us just as strong grounds for realism. My paper critically assesses McMullin’s notion of fertility and its realist rationale. It concludes that McMullin’s preferred example, namely the fertile development of the Bohr-Sommerfeld model of the atom, does not support McMullin’s argument for realism. Although the implications for the realism debate are as yet unclear, the case study offers some important methodological lessons.
Some twenty years ago, Bogen and Woodward challenged one of the fundamental assumptions of the received view, namely the theory-observation dichotomy, and argued for the introduction of the further category of scientific phenomena. The latter, Bogen and Woodward stressed, are usually unobservable and inferred from what is indeed observable, namely scientific data. Crucially, Bogen and Woodward claimed that theories predict and explain phenomena, but not data. But then, of course, the thesis of theory-ladenness, which has it that our observations are influenced by the theories we hold, cannot apply. On the basis of two case studies, I want to show that this consequence of Bogen and Woodward’s account is rather unrealistic. More importantly, I also object to Bogen and Woodward’s view that the reliability of data, which constitutes the precondition for data-to-phenomena inferences, can be secured without the theory one seeks to test. The case studies I revisit have figured heavily in the publications of Bogen and Woodward and others: the discovery of weak neutral currents and the discovery of the zebra pattern of magnetic anomalies. I show that, in the latter case, data can be ignored if they appear to be irrelevant from a particular theoretical perspective (TLI) and that, in the former case, the tested theory can be critical for the assessment of the reliability of the data (TLA). I argue that both TLI and TLA are much stronger senses of theory-ladenness than the classical thesis and that neither TLI nor TLA can be accommodated within Bogen and Woodward’s account.
In this article I argue that a methodological challenge to an integrated history and philosophy of science approach put forth by Ronald Giere almost forty years ago can be met by what I call the Kuhnian mode of History and Philosophy of Science (HPS). Although in the Kuhnian mode of HPS norms about science are motivated by historical facts about scientific practice, the justifiers of the constructed norms are not historical facts. The Kuhnian mode of HPS therefore evades the naturalistic fallacy, of which Giere’s challenge is a version. Against the backdrop of a discussion of Laudan’s normative naturalism I argue that the Kuhnian mode of HPS is a superior form of naturalism: it establishes contact with the practice of science without making itself dependent on its contingencies.
In this paper I comment on a recent paper by Scerri and Worrall (Scerri, E., & Worrall, J. [2001]. Prediction and the periodic table. Studies in History and Philosophy of Science, 32, 407–452) about the role temporally novel and use-novel predictions played in the acceptance of Mendeleev’s periodic table after its proposal in 1869. Scerri and Worrall allege that whereas temporally novel predictions—despite Brush’s earlier claim to the contrary—did not carry any special epistemic weight, use-novel predictions did indeed contribute to the acceptance of the table. Although I agree with their first claim, I disagree with their second. In order to spell out my disagreement, I not only revisit Scerri and Worrall’s interpretation of crucial historical evidence they have cited in support of the ‘heuristic account’ of use-novel predictions, but I also criticise the latter on general grounds.
Keywords: Periodic table; Dmitri Mendeleev; Noble gases; Use-novel predictions; Heuristic account; Ad hoc hypotheses.
In this paper, I discuss the discovery of the DNA structure by Francis Crick and James Watson, which has provoked a large historical literature but has not yet found its way into philosophical debates. I want to redress this imbalance. In contrast to the available historical literature, a strong emphasis will be placed upon analysing the roles played by theory, model, and evidence and the relationship between them. In particular, I am going to discuss not only Crick and Watson's well-known model and Franklin's x-ray diffraction pictures (the evidence) but also the less well known theory of helical diffraction, which was absolutely crucial to Crick and Watson's discovery. The insights into this groundbreaking historical episode will have consequences for the new received view of scientific models and their function and relationship to theory and world. The received view, dominated by works by Cartwright and Morgan and Morrison ([1999]), rather than trying to put forth a theory of models, is interested in questions to do with (i) the function of models in scientific practice and (ii) the construction of models. In regard to (i), the received view locates the model (as an idealized, simplified version of the real system under investigation) between theory and the world and sees the model as allowing the application of the former to the latter. As to (ii), Cartwright has argued for a phenomenologically driven view and Morgan and Morrison ([1999]) for the autonomy of models in the construction process: models are determined neither by theory nor by the world. The present case study of the discovery of the DNA structure strongly challenges both (i) and (ii). In contrast to claim (i) of the received view, it was not Crick and Watson's model but rather the helical diffraction theory which served a mediating purpose between the model and the x-ray diffraction pictures.
In particular, Cartwright's take on (ii) is refuted by a comparison of Franklin's bottom-up approach with Crick and Watson's top-down approach in constructing the model. The former led to difficulties, which only a strong confidence in the structure incorporated in the model could circumvent. Outline: 1 How to Get to the Structure; 1.1 X-ray diffraction and its synthesis; 1.2 Model building and Pauling's panache; 1.3 The structure of proteins; 1.3.1 A failed inference to the best explanation; 1.3.2 The misleading 5.1 Å spot in proteins and how to get rid of it; 1.3.3 Derived predictions from Pauling's alpha-helix of protein molecules; 2 The CCV Theory of Helical X-Ray Diffraction; 2.1 The role of the CCV theory in the discovery of the DNA structure; 3 Killing the Helix; 3.1 Appreciating all evidence—in vain; Conclusion; Epilogue: Chargaff's Ratios.
In this paper I inquire into Bogen and Woodward’s data/phenomena distinction, which in a similar way to Cartwright’s construal of the model of superconductivity—although in a different domain—argues for a ‘bottom-up’ construction of phenomena from data without the involvement of theory. I criticise Bogen and Woodward’s account by analysing in depth their melting point of lead example, which is usually cited in the literature to illustrate the data/phenomenon distinction. Yet the main focus of this paper lies on Matthias Kaiser’s case study of the plate tectonic revolution, the most extensive case study that has been put forth to support the bottom-up construction of phenomena. On the basis of new historical evidence, which has been overlooked not only by Kaiser but also by the entire historical literature on the plate tectonic revolution, I demonstrate that phenomena are not constructed from the bottom up but rather, admittedly counter-intuitively, from the top down.
Keywords: Data; Phenomena; Bottom-up; Theory-ladenness; Plate tectonic revolution.
In this paper I defend Kuhn’s view of scientific discovery, which involves two central tenets, namely that a scientific discovery always requires a discovery-that and a discovery-what, and that there are two kinds of scientific discovery, resulting from the temporal order of the discovery-that and the discovery-what. I identify two problems with Kuhn’s account and offer solutions to them from a realist stance. Alternatives to Kuhn’s account are also discussed.
In a widely received paper on theory choice, Kuhn made three central claims. First, as a matter of empirical fact, different theories tend to score differently with regard to what Kuhn considered to be the standard set of theoretical virtues, i.e., empirical accuracy, internal and external consistency, scope, simplicity, and fertility. Whereas some theories will, for instance, be more empirically accurate than others, other theories will have greater external coherence with our background theories. Second, hardly ever does a theory’s being virtuous in one particular respect legitimize a choice for that theory—not even when that virtue is empirical accuracy. Third, following from the first two ...
What does it mean for a hypothesis to be ad hoc? One prominent account has it that ad hoc hypotheses have no independent empirical support. Others have viewed ad hoc judgements as subjective. Here I critically review both of these views and defend my own Coherentist Conception of Ad hocness by working out its conceptual and descriptive attractions.
Proponents of the “negative program” in experimental philosophy have argued that judgements in philosophical cases, also known as case judgements, are unreliable and that the method of cases should be either strongly constrained or even abandoned. Here we put to the test one main proponent’s account of why philosophical cases may cause the unreliability of case judgements. We conducted our test with thought experiments from physics, which exhibit the very same supposedly “disturbing characteristics” as philosophical cases.
Philosophers use historical case studies to support wide-ranging claims about science. This practice is often criticized as problematic. In this paper we suggest that the function of case studies can be understood and justified by analogy to a well-established practice in biology: the investigation of model organisms. We argue that inferences based on case studies are no more problematic than inferences from model organisms to larger classes of organisms in biology. We demonstrate our view in detail by reference to a case study with a long history: Semmelweis’s discovery of the cause of childbed fever.
Perhaps the strongest argument for scientific realism, the no-miracles argument, has been said to commit the so-called base rate fallacy. The apparent elusiveness of the base rate of true theories has even been said to undermine the rationality of the entire realism debate. In this paper, I confront this challenge by arguing, on the basis of the Kuhnian picture of theory choice, that a theory is likely to be true if it possesses multiple theoretical virtues and is embraced by numerous scientists, even when the base rate converges to zero.
This book examines the evidential status and use of linguistic intuitions, a topic that has seen increased interest in recent years. Linguists use native speakers' intuitions - such as whether or not an utterance sounds acceptable - as evidence for theories about language, but this approach is not uncontroversial. The two parts of this volume draw on the most recent work in both philosophy and linguistics to explore the two major issues at the heart of the debate. Chapters in the first part address the 'justification question', critically analysing and evaluating the theoretical rationale for the evidential use of linguistic intuitions. The second part discusses recent developments in the domain of experimental syntax, focusing on the question of whether formal and systematic models of gathering intuitions are epistemically and methodologically superior to the informal methods that have traditionally been used.
The volume provides valuable insights into whether and how linguistic intuitions can be used in theorizing about language, and will be of interest to graduate students and researchers in linguistics, philosophy, and cognitive science.