Proponents of the value-ladenness of science rely primarily on arguments from underdetermination or inductive risk, which share the premise that we should only consider values where the evidence runs out or leaves uncertainty; they adopt a criterion of lexical priority of evidence over values. The motivation behind lexical priority is to avoid reaching conclusions on the basis of wishful thinking rather than good evidence. This is a real concern; however, I argue that giving lexical priority to evidential considerations over values is a mistake and unnecessary for avoiding wishful thinking. Values have a deeper role to play in science.
String theory promises to provide us with a working theory of quantum gravity and a unified description of all fundamental forces. In string theory there are so-called 'dualities': different theoretical formulations that are physically equivalent. In this article these dualities are investigated from a philosophical point of view. Semantic and epistemic questions relating to the problem of underdetermination of theories by data and the debate on realism concerning scientific theories are discussed. Depending on one's views on semantic issues and realism, different interpretations of the dualities are possible.
Various forms of underdetermination that might threaten the realist stance are examined. That which holds between different 'formulations' of a theory (such as the Hamiltonian and Lagrangian formulations of classical mechanics) is considered in some detail, as is the 'metaphysical' underdetermination invoked to support 'ontic structural realism'. The problematic roles of heuristic fruitfulness and surplus structure in attempts to break these forms of underdetermination are discussed and an approach emphasizing the relevant structural commonalities is defended.
I focus on a key argument for global external world scepticism resting on the underdetermination thesis: the argument according to which we cannot know any proposition about our physical environment because sense evidence for it equally justifies some sceptical alternative (e.g. the Cartesian demon conjecture). I contend that the underdetermination argument can go through only if the controversial thesis that conceivability is per se a source of evidence for metaphysical possibility is true. I also suggest a reason to doubt that conceivability is per se a source of evidence for metaphysical possibility, and thus to doubt the underdetermination argument.
Structural realism is sometimes said to undermine the theory underdetermination (TUD) argument against realism, since, in usual TUD scenarios, the supposed underdetermination concerns the object-like theoretical content but not the structural content. The paper explores the possibility of structural TUD by considering some special cases from modern physics, but also questions the validity of the TUD argument itself. The upshot is that cases of structural TUD cannot be excluded, but that TUD is perhaps not such a terribly serious anti-realist argument.
If cosmology is to obtain knowledge about the whole universe, it faces an underdetermination problem: Alternative space-time models are compatible with our evidence. The problem can be avoided though, if there are good reasons to adopt the Cosmological Principle (CP), because, assuming the principle, one can confine oneself to the small class of homogeneous and isotropic space-time models. The aim of this paper is to ask whether there are good reasons to adopt the Cosmological Principle in order to avoid underdetermination in cosmology. Various strategies to justify the CP are examined. For instance, arguments to the effect that the truth of the CP follows generically from a large set of initial conditions; an inference to the best explanation; and an inductive strategy are assessed. I conclude that a convincing justification of the CP has not yet been established, but this claim is contingent on a number of results that may have to be revised in the future.
Kyle Stanford’s arguments against scientific realism are assessed, with a focus on the underdetermination of theory by evidence. I argue that discussions of underdetermination have neglected a possible symmetry which may ameliorate the situation.
In this paper, I argue (i) that there are certain methodological practices that are epistemically significant, and (ii) that we can test for the success of these practices empirically by examining case studies in the history of science. Analysing a particular episode from the history of medicine, I explain how this can help us resolve specific cases of underdetermination. I conclude that, while the anti-realist is (more or less legitimately) able to construct underdetermination scenarios on a case-by-case basis, he will have to abandon the strategy of using algorithms to do so, thus losing the much needed guarantee that there will always be rival cases of the required kind.
In 1991 Larry Laudan and Jarrett Leplin proposed a solution for the problem of empirical equivalence and the empirical underdetermination that is often thought to result from it. In this paper we argue that, even though Laudan and Leplin’s reasoning is essentially correct, their solution should be accurately assessed in order to appreciate its nature and scope. Indeed, Laudan and Leplin’s analysis does not succeed in completely removing the problem or, as they put it, in refuting the thesis of underdetermination as a consequence of empirical equivalence. Instead, what they show is merely that science possesses tools that may eventually lead out of an underdetermination impasse. We apply their argument to a real case of two empirically equivalent theories: Lorentz’s ether theory and Einstein’s special relativity. This example illustrates the validity of Laudan and Leplin’s reasoning, but also shows the importance of the reassessment we argue for.
It is argued that, contrary to prevailing opinion, Bas van Fraassen nowhere uses the argument from underdetermination in his argument for constructive empiricism. It is explained that van Fraassen’s use of the notion of empirical equivalence in The Scientific Image has been widely misunderstood. A reconstruction of the main arguments for constructive empiricism is offered, showing how the passages that have been taken to be part of an appeal to the argument from underdetermination should actually be interpreted.
What attitude should we take toward a scientific theory when it competes with other scientific theories? This question elicited different answers from instrumentalists, logical positivists, constructive empiricists, scientific realists, holists, theory-ladenists, antidivisionists, falsificationists, and anarchists in the philosophy of science literature. I will summarize the diverse philosophical responses to the problem of underdetermination, and argue that there are different kinds of underdetermination, and that they should be kept apart from each other because they call for different responses.
There are many parts of science in which a certain sort of underdetermination of theory by evidence is known to be common. It is argued that reflection on this fact should serve to shift the burden of proof from scientific anti-realists to scientific realists at a crucial point in the debate between them.
Scott Soames argues that interpreted in the light of Quine's holistic verificationism, Quine's thesis of underdetermination leads to a contradiction. It is contended here that if we pay proper attention to the evolution of Quine's thinking on the subject, particularly his criterion of theory individuation, Quine's thesis of underdetermination escapes Soames' charge of paradoxicality.
One of the objections against the thesis of underdetermination of theories by observations is that it is unintelligible. Any two empirically equivalent theories, so the argument goes, are in principle intertranslatable, hence cannot count as rivals in any non-trivial sense. Against that objection, this paper shows that empirically equivalent theories may contain theoretical sentences that are not intertranslatable. Examples are drawn from a related discussion about incommensurability that shows that theoretical non-intertranslatability is possible.
There are results which show that measure-theoretic deterministic models and stochastic models are observationally equivalent. Thus there is a choice between a deterministic and an indeterministic model and the question arises: Which model is preferable relative to evidence? If the evidence equally supports both models, there is underdetermination. This paper first distinguishes between different kinds of choice and clarifies the possible resulting types of underdetermination. Then a new answer is presented: the focus is on the choice between a Newtonian deterministic model supported by indirect evidence from other Newtonian models which invoke similar additional assumptions about the physical systems and a stochastic model that is not supported by indirect evidence. It is argued that the deterministic model is preferable. The argument against underdetermination is then generalised to a broader class of cases. Finally, the paper criticises the extant philosophical answers in relation to the preferable model. Winnie’s (1998) argument for the deterministic model is shown to deliver the correct conclusion relative to observations which are possible in principle and where there are no limits, in principle, on observational accuracy (the type of choice Winnie was concerned with). However, in practice the argument fails. A further point made is that Hoefer’s (2008) argument for the deterministic model is untenable.
Duhem–Quine underdetermination plays a constructive role in epistemology by pinpointing the impact of non-empirical virtues or cognitive values on theory choice. Underdetermination thus contributes to illuminating the nature of scientific rationality. Scientists prefer and accept one account among empirically equivalent alternatives. The non-empirical virtues operating in science are laid open in such theory choice decisions. The latter act as an epistemological test tube in making explicit commitments to what scientific knowledge should be like.
As climate policy decisions are decisions under uncertainty, being based on a range of future climate change scenarios, it becomes a crucial question how to set up this scenario range. Failing to comply with the precautionary principle, the scenario methodology widely used in the Third Assessment Report of the Intergovernmental Panel on Climate Change (IPCC) seems to violate international environmental law, in particular a provision of the United Nations Framework Convention on Climate Change. To place climate policy advice on a sound methodological basis would imply that climate simulations which are based on complex climate models had, in stark contrast to their current hegemony, hardly an epistemic role to play in climate scenario analysis at all. Their main function might actually consist in ‘foreseeing future ozone-holes’. In order to argue for these theses, I explain first of all the plurality of climate models used in climate science by the failure to avoid the problem of underdetermination. As a consequence, climate simulation results have to be interpreted as modal sentences, stating what is possibly true of our climate system. This indicates that climate policy decisions are decisions under uncertainty. Two general methodological principles which may guide the construction of the scenario range are formulated and contrasted with each other: modal inductivism and modal falsificationism. I argue that modal inductivism, being the methodology implicitly underlying the third IPCC report, is severely flawed. Modal falsificationism, representing the sound alternative, would in turn require an overhaul of the IPCC practice.
In contemporary epistemology, sceptical arguments are motivated either by the closure principle or the underdetermination principle. Therefore, it is very important to figure out the structure of the sceptical argument before coming up with an anti-sceptic strategy. With a review of the debate on the relationship between the two principles from Anthony Brueckner to Kevin McCain, it is argued that while maintaining weak closed justification (WCJ*), closure and underdetermination are not logically equivalent. As a result, two independent responses are needed to answer the sceptical problem satisfactorily. Also, in order to secure a philosophically significant notion of justification and evidence, reasons are given to hold WCJ*, as opposed to rejecting it. This understanding of the sceptical argument would help to focus the real challenge of scepticism.
Thick terms and concepts in ethics somehow combine evaluation and non-evaluative description. The non-evaluative aspects of thick terms and concepts underdetermine their extensions. Many writers argue that this underdetermination point is best explained by supposing that thick terms and concepts are semantically evaluative in some way such that evaluation plays a role in determining their extensions. This paper argues that the extensions of thick terms and concepts are underdetermined by their meanings in toto, irrespective of whether their extensions are partly determined by evaluation; the underdetermination point can therefore be explained without supposing that thick terms and concepts are semantically evaluative. My argument applies general points about semantic gradability and context-sensitivity to the semantics of thick terms and concepts.
This paper examines the epistemological significance of the present situation of underdetermination in quantum mechanics. After analyzing this underdetermination at three levels—formal, ontological, and methodological—the paper considers implications for a number of variants of the thesis of scientific realism in fundamental physics and reassesses Lakatos’s characterization of progress in physical theory in light of the present situation. Next, this paper considers the implications of underdetermination for Weinberg’s ‘dream of a final theory’. Finally, the paper concludes by suggesting how one might still think of realism and progress in fundamental physics despite the possibility of persistent underdetermination in quantum mechanics.
The underdetermination of theory by data argument (UD) is traditionally construed as an argument that tells us that we ought to favour an anti-realist position over a realist position. I argue that when UD is construed as an argument saying that theory choice is to proceed between theories that are empirically equivalent and adequate to the phenomena up until now, the argument will not favour constructive empiricism over realism. A constructive empiricist cannot account for why scientists are reasonable in expecting one theory to be empirically adequate rather than another, given the criteria he suggests for theory choice.
Thomas Bonk has dedicated a book to analyzing the thesis of underdetermination of scientific theories, with a chapter exclusively devoted to the analysis of the relation between this idea and the indeterminacy of meaning. Both theses caused a revolution in the philosophical world in the sixties, generating a cascade of articles and doctoral theses. Agitation seems to have cooled down, but the point is still debated and it may be experiencing a renewed resurgence.
This paper pursues Ernan McMullin’s claim ("Virtues of a Good Theory" and related papers on theory-choice) that talk of theory virtues exposes a fault-line in philosophy of science separating "very different visions" of scientific theorizing. It argues that connections between theory virtues and virtue epistemology are substantive rather than ornamental, since both address underdetermination problems in science, helping us to understand the objectivity of theory choice and more specifically what I term the ampliative adequacy of scientific theories. The paper argues therefore that virtue epistemologies can make substantial contributions to the epistemology and methodology of the sciences, helping to bridge the gulf between realists and anti-realists, and to reinforce moderation over claims about the implications of underdetermination problems for scientific inquiry. It finally makes and develops the suggestion that virtue epistemologies, at least of the kind developed here, offer support to the position that philosophers of science know as normative naturalism.
This paper reassesses the question of whether Craig’s theorem poses a challenge to Quine's empirical underdetermination thesis. It will be demonstrated that Quine’s account of this issue in his paper “Empirically Equivalent Systems of the World” (1975) is flawed and that Quine makes too strong a concession to the Craigian challenge. It will further be pointed out that Craig’s theorem would threaten the empirical underdetermination thesis only if the set of all relevant observation conditionals could be shown to be recursively enumerable — a condition which Quine seems to overlook — and it will be argued that, at least within the framework of Quine’s philosophy, it is doubtful whether this condition is satisfiable.
In this paper, I will show that the Miracle Argument is unsound if one assumes a certain form of transient underdetermination. To this end, I will first discuss and formalize several variants of underdetermination, especially that of transient underdetermination, by means of measure theory. I will then formalize a popular and persuasive form of the Miracle Argument that is based on "use novelty", and proceed to prove that the Miracle Argument is unsound by means of a mathematical example. Finally, I will expose two hidden presuppositions of the Miracle Argument that make it so immensely though deceptively persuasive.
According to Duncan Pritchard, there are two kinds of radical sceptical problem: the closure-based problem and the underdetermination-based problem. He argues that distinguishing these two problems leads to a set of desiderata for an anti-sceptical response, and that the way to meet all of these desiderata is by supplementing a form of Wittgensteinian contextualism with disjunctivist views about factivity. I agree that an adequate response should meet most of the initial desiderata Pritchard puts forward, and that some version of Wittgensteinian contextualism shows the most promise as a starting point for this, but I argue, contra Pritchard, that the addition of disjunctivism is unnecessary and potentially counter-productive. If we draw on lessons from Michael Williams's inferential contextualism then it is both possible, and preferable, to meet the most important of Pritchard's desiderata, undercutting both closure-based and underdetermination-based sceptical problems in a unified way, without the need to resort to disjunctivism.
Anthony Brueckner has argued that claims about underdetermination of evidence are suppressed in closure-based scepticism (“The Structure of the Skeptical Argument”, Philosophy and Phenomenological Research 54:4, 1994). He also argues that these claims about underdetermination themselves lead to a paradoxical sceptical argument—the underdetermination argument—which is more fundamental than the closure argument. If Brueckner is right, the status quo focus of some predominant anti-sceptical strategies may be misguided. In this paper I focus specifically on the relationship between these two arguments. I provide support for Brueckner’s claim that the underdetermination argument is the more fundamental sceptical argument. I do so by responding to a challenge to this claim put forward by Stewart Cohen (“Two Kinds of Skeptical Argument”, Philosophy and Phenomenological Research 58:1, 1998). Cohen invokes an alternative epistemic principle which he thinks can be used to challenge Brueckner. Cohen’s principle raises interesting questions about the relationship between evidential considerations and explanatory considerations in the context of scepticism about our knowledge of the external world. I explore these questions in my defence of Brueckner.
This paper examines the underdetermination between the Ptolemaic, Copernican, and Tychonic theories of planetary motions and its attempted resolution by Kepler. I argue that past philosophical analyses of the problem of the planetary motions have not adequately grasped a method through which the underdetermination might have been resolved. This method involves a procedure of what I characterize as decomposition and identification. I show that this procedure is used by Kepler in the first half of the Astronomia Nova, where he ultimately claims to have refuted the Ptolemaic theory, thus partially overcoming the underdetermination. Finally, I compare this method with other views of scientific inference such as bootstrapping.
In his influential 1960 paper ‘The Unreasonable Effectiveness of Mathematics in the Natural Sciences’, Eugene P. Wigner raises the question of why something that was developed without concern for empirical facts—mathematics—should turn out to be so powerful in explaining facts about the natural world. Recent philosophy of science has developed ‘Wigner’s puzzle’ in two different directions: first, in relation to the supposed indispensability of mathematical facts to particular scientific explanations and, second, in connection with the idea that aesthetic criteria track theoretical desiderata such as empirical success. An important aspect of Wigner’s article has, however, been overlooked in these debates: his worries about the underdetermination of physical theories by mathematical frameworks. The present paper argues that, by restoring this aspect of Wigner’s argument to its proper place, Wigner’s puzzle may become an instructive case study for the teaching of core issues in the philosophy of science and its history.
Are theories 'underdetermined by the evidence' in any way that should worry the scientific realist? I argue that no convincing reason has been given for thinking so. A crucial distinction is drawn between data equivalence and empirical equivalence. Duhem showed that it is always possible to produce a data equivalent rival to any accepted scientific theory. But there is no reason to regard such a rival as equally well empirically supported and hence no threat to realism. Two theories are empirically equivalent if they share all consequences expressed in purely observational vocabulary. This is a much stronger requirement than has hitherto been recognised—two such 'rival' theories must in fact agree on many claims that are clearly theoretical in nature. Given this, it is unclear how much of an impact on realism a demonstration that there is always an empirically equivalent 'rival' to any accepted theory would have—even if such a demonstration could be produced. Certainly in the case of the version of realism that I defend—structural realism—such a demonstration would have precisely no impact: two empirically equivalent theories are, according to structural realism, cognitively indistinguishable.
The paper explicates unique events and investigates their epistemology. Explications of unique events as individuated, different, and emergent are philosophically uninteresting. Unique events are topics of why-questions that radically underdetermine all their potential explanations. Uniqueness that is relative to a level of scientific development is differentiated from absolute uniqueness. Science eliminates relative uniqueness by discovery of recurrence of events and properties, falsification of assumptions of why-questions, and methodological simplification, e.g. by explanatory methodological reduction. Finally, an overview of contemporary philosophical disputes that hinge on issues of uniqueness emphasizes its philosophical significance.
Newton claims to have proven the heterogeneity of light through his experimentum crucis. However, Olaf Müller has worked out in detail Goethe’s idea that one could likewise prove the heterogeneity of darkness by inverting Newton’s famous experiment. Müller concludes that this invalidates Newton’s claim of proof. Yet this conclusion only holds if the heterogeneity of light and the heterogeneity of darkness are logically incompatible. This paper shows that this is not the case. Instead, in Quine’s terms, we have two logically compatible theories based on mutually irreducible theoretical terms. From a Quinean point of view, this does no harm to the provability of the corresponding statements.
Goethe's objections to Newton's theory of light and colours are better than often acknowledged. You can accept the most important elements of these objections without disagreeing with Newton about light and colours. As I will argue, Goethe exposed a crucial weakness of Newton's methodological self-assessment. Newton believed that with the help of his prism experiments, he could prove that sunlight was composed of variously coloured rays of light. Goethe showed that this step from observation to theory is more problematic than Newton wanted to admit. By insisting that the step to theory is not forced upon us by the phenomena, Goethe revealed our own free, creative contribution to theory construction. And Goethe's insight is surprisingly significant, because he correctly claimed that all of the results of Newton's prism experiments fit a theoretical alternative equally well. If this is correct, then by suggesting an alternative to a well-established physical theory, Goethe developed the problem of underdetermination.
This paper examines the state of the field of “science and values”—particularly regarding the implications of the thesis of transient underdetermination for the ideal of value-free science, or what I call the “ideal of epistemic purity.” I do this by discussing some of the main arguments in the literature, both for and against the ideal. I examine a preliminary argument from transient underdetermination against the ideal of epistemic purity, and I discuss two different formulations of an objection to this argument—an objection that requires the strict separation of the epistemic from the practical. A secondary aim of the paper is to suggest some future directions for the field, one of which is to replace the vocabulary of values that is often employed in the literature with a more precise one.
Quantum field theory (QFT) presents a genuine example of the underdetermination of theory by empirical evidence. There are variants of QFT—for example, the standard textbook formulation and the rigorous axiomatic formulation—that are empirically indistinguishable yet support different interpretations. This case is of particular interest to philosophers of physics because, before the philosophical work of interpreting QFT can proceed, the question of which variant should be subject to interpretation must be settled. New arguments are offered for basing the interpretation of QFT on a rigorous axiomatic variant of the theory. The pivotal considerations are the roles that consistency and idealization play in this case.
Theories of verisimilitude have routinely been classified into two rival camps—the content approach and the likeness approach—and these appear to be motivated by very different sets of data and principles. The question thus naturally arises as to whether these approaches can be fruitfully combined. Recently Zwart and Franssen (Synthese 158(1):75–92, 2007) have offered precise analyses of the content and likeness approaches, and shown that given these analyses any attempt to meld content and likeness orderings violates some basic desiderata. Unfortunately their characterizations of the approaches do not embrace the paradigm examples of those approaches. I offer somewhat different characterizations of these two approaches, as well as of the consequence approach (Schurz and Weingartner, Synthese 172(3):415–436, 2010), which happily embrace their respective paradigms. Finally I prove that the three approaches are indeed compatible, but only just, and that the cost of combining them is too high. Any account which combines the strictures of what I call the strong likeness approach with the demands of either the content or the consequence approach suffers from precisely the same defect as Popper’s—namely, it entails the trivialization of truthlikeness. The downside of eschewing the strong likeness constraints and embracing the content constraints alone is the underdetermination of the concept of truthlikeness.
Advocates have sought to prove that underdetermination obtains because all theories have empirical equivalents. But algorithms for generating empirical equivalents simply exchange underdetermination for familiar philosophical chestnuts, while the few convincing examples of empirical equivalents will not support the desired sweeping conclusions. Nonetheless, underdetermination does not depend on empirical equivalents: our warrant for current theories is equally undermined by presently unconceived alternatives that are just as well confirmed by the existing evidence, so long as this transient predicament recurs for each theory and body of evidence we consider. The historical record supports the claim that this recurrent, transient underdetermination predicament is our own.
Anthony Brueckner argues for a strong connection between the closure and the underdetermination argument for scepticism. Moreover, he claims that both arguments rest on infallibilism: In order to motivate the premises of the arguments, the sceptic has to refer to an infallibility principle. If this were true, fallibilists would be right in not taking the problems posed by these sceptical arguments seriously. As many epistemologists are sympathetic to fallibilism, this would be a very interesting result. However, in this paper I will argue that Brueckner’s claims are wrong: The closure and the underdetermination argument are not as closely related as he assumes and neither rests on infallibilism. Thus even a fallibilist should take these arguments to raise serious problems that must be dealt with somehow.
The underdetermination of theory by evidence must be distinguished from holism. The latter is a doctrine about the testing of scientific hypotheses; the former is a thesis about empirically adequate logically incompatible global theories or "systems of the world". The distinction is crucial for an adequate assessment of the underdetermination thesis. The paper shows how some treatments of underdetermination are vitiated by failure to observe this distinction, and identifies some necessary conditions for the existence of multiple empirically equivalent global theories. We consider how empiricists should respond to the possibility of such systems of the world.
Earman (1993) distinguishes three notions of empirical indistinguishability and offers a rigorous framework to investigate how each of these notions relates to the problem of underdetermination of theory choice. He uses some of the results obtained in this framework to argue for a version of scientific anti-realism. In the present paper we first criticize Earman's arguments for that position. Secondly, we propose and motivate a modification of Earman's framework and establish several results concerning some of the notions of indistinguishability in this modified framework. Finally, we interpret these results in the light of the realism/anti-realism debate.
Four empirically equivalent versions of general relativity, namely standard GR, Lorentz-invariant gravitational theory, and the gravitational gauge theories of the Lorentz and translation groups, are investigated in the form of a case study for theory underdetermination. The various ontological indeterminacies (both underdetermination and inscrutability of reference) inherent in gravitational theories are analyzed in a detailed comparative study. The concept of practical underdetermination is proposed, followed by a discussion of its adequacy to describe scientific progress.
The goal of this article is to show that the structuralist approach provides a powerful framework for the analysis of certain holistic phenomena in empirical theories. We focus on two aspects of holism. The first refers to the involvement of comprehensive complexes of hypotheses in the theoretical treatment of systems regarded in isolation. By contrast, the second refers to the correlation between the theoretical descriptions of different systems. It is demonstrated how these two aspects can be analysed by making use of the structuralist notion of theory-nets, and how they are reflected by a refined version of the Ramsey sentence. Furthermore, it is argued that there exists a tight correlation between the occurrence of these two holistic phenomena, a specific form of underdetermination of terms which occur in the fundamental principles of an empirical theory, and the shaping of the theory's protective belt. After having dealt with these questions in abstracto, the relevance of these considerations for a better understanding of the dynamics of empirical theories is demonstrated in a concrete case study. It refers to the role holistic phenomena played in the investigation of the anomalous advance of Mercury's perihelion and in the various attempts to eliminate this anomaly.
The first part of this paper discusses Quine’s views on underdetermination of theory by evidence, and the indeterminacy of translation, or meaning, in relation to certain physical theories. The underdetermination thesis says different theories can be supported by the same evidence, and the indeterminacy thesis says the same component of a theory that is underdetermined by evidence is also meaning indeterminate. A few examples of underdetermination and meaning indeterminacy are given in the text. In the second part of the paper, Quine’s scientific realism is discussed briefly, along with some of the difficulties encountered when considering the ‘truth’ of different empirically equivalent theories. It is concluded that the difference between underdetermination and indeterminacy, while significant, is not as great as Quine claims. It just means that after we have chosen a framework theory, from a number of empirically equivalent ones, we still have further choices along two different dimensions.
Where there are cases of underdetermination in scientific controversies, such as the case of the molecular clock, scientists may direct the course and terms of dispute by playing off the multidimensional framework of theory evaluation. This is because assessment strategies themselves are underdetermined. Within the framework of assessment, there are a variety of trade-offs between different strategies as well as shifting emphases as specific strategies are given more or less weight in assessment situations. When a strategy is underdetermined, scientists can change the dynamics of a controversy by making assessments using different combinations of evaluation strategies and/or weighting whatever strategies are in play in different ways. Following an underdetermination strategy does not end or resolve a scientific dispute. Consequently, manipulating underdetermination is a feature of controversy dynamics and not controversy closure.
According to the thesis of semantic underdetermination, most sentences of a natural language lack a definite semantic interpretation. This thesis supports an argument against the use of natural language as an instrument of thought, based on the premise that cognition requires a semantically precise and compositional instrument. In this paper we examine several ways to construe this argument, as well as possible ways out for the cognitive view of natural language in the introspectivist version defended by Carruthers. Finally, we sketch a view of the role of language in thought as a specialized tool, showing how it avoids the consequences of semantic underdetermination.
I discuss how modern cosmology illustrates underdetermination of theoretical hypotheses by data, in ways that are different from most philosophical discussions. I confine the discussion to the history of the observable universe from about one second after the Big Bang, as described by the mainstream cosmological model: in effect, what cosmologists in the early 1970s dubbed the ‘standard model’, as elaborated since then. Or rather, the discussion is confined to a (very!) few aspects of that history. I emphasize that despite the underdetermination, a scientific realist can, and should, endorse this description.
I examine the argument that scientific theories are typically 'underdetermined' by the data, an argument which has often been used to combat scientific realism. I deal with two objections to the underdetermination argument: (i) that the argument conflicts with the holistic nature of confirmation, and (ii) that the argument rests on an untenable theory/data dualism. I discuss possible responses to both objections, and argue that in both cases the proponent of underdetermination can respond in ways which are individually plausible, but that the best response to the first objection conflicts with the best response to the second. Consequently underdetermination poses less of a problem for scientific realism than has often been thought.
Quine’s thesis of underdetermination is significantly weaker than it has been taken to be in the recent literature, for the following reasons: (i) it does not hold for all theories, but only for some global theories, (ii) it does not require the existence of empirically equivalent yet logically incompatible theories, (iii) it does not rule out the possibility that all perceived rivalry between empirically equivalent theories might be merely apparent and eliminable through translation, (iv) it is not a fundamental thesis within Quine’s philosophy, and (v) it does not carry with it the anti-realistic consequences often associated with the thesis in recent debates. The paper analyzes Quine’s views on the matter and the changes they underwent over the years. A conjecture is put forth about why Quine’s thesis has been so widely misrepresented: Quine’s writings up to 1975 tackled primarily the formulation and justification of the thesis, but afterwards were concerned mostly with the question whether empirically equivalent rivals to the theory we hold are to be considered true also. When this latter discussion is read without bearing in mind Quine’s earlier formulation and justification of the thesis, his thesis seems to have stronger epistemic consequences than it actually does. A careful reading of his later writings shows, however, that the formulation of the thesis remained unchanged after 1975, and that his mature and considered views supported only a very mitigated version of the thesis.
This paper offers a general characterization of underdetermination and gives a prima facie case for the underdetermination of the topology of the universe. A survey of several philosophical approaches to the problem fails to resolve the issue: the case involves the possibility of massive reduplication, but Strawson on massive reduplication provides no help here; it is not obvious that any of the rival theories are to be preferred on grounds of simplicity; and the usual talk of empirically equivalent theories misses the point entirely. (If the choice is underdetermined, then the theories are not empirically equivalent!) Yet the thought experiment is analogous to a live scientific possibility, and actual astronomy faces underdetermination of this kind. This paper concludes by suggesting how the matter can be resolved, either by localizing the underdetermination or by defeating it entirely.
Advocates of the "strong programme" in the sociology of knowledge have argued that, because scientific theories are "underdetermined" by data, sociological factors must be invoked to explain why scientists believe the theories they do. I examine this argument, and the responses to it by J.R. Brown (1989) and L. Laudan (1996). I distinguish between a number of different versions of the underdetermination thesis, some trivial, some substantive. I show that Brown's and Laudan's attempts to refute the sociologists' argument fail. Nonetheless, the sociologists' argument falls to a different criticism, for the version of the underdetermination thesis that the argument requires has not been shown to be true.