This paper offers a novel way of reconstructing conceptual change in empirical theories. Changes occur in terms of the structure of the dimensions—that is to say, the conceptual spaces—underlying the conceptual framework within which a given theory is formulated. Five types of changes are identified: (1) addition or deletion of special laws, (2) change in scale or metric, (3) change in the importance of dimensions, (4) change in the separability of dimensions, and (5) addition or deletion of dimensions. Given this classification, the conceptual development of empirical theories becomes more gradual and rationalizable. Only the most extreme type—replacement of dimensions—comes close to a revolution. The five types are exemplified and applied in a case study on the development within physics from the original Newtonian mechanics to special relativity theory.
Scientific realism is the position that the success of a scientific theory licenses an inference to its approximate truth. The argument from pessimistic meta-induction maintains that this inference is undermined by the existence of theories from the history of science that were successful, but false. I aim to counter pessimistic meta-induction and defend scientific realism. To do this, I adopt a notion of success that admits of degrees, and show that our current best theories enjoy far higher degrees of success than any of the successful, but refuted theories of the past.
This paper extends earlier work by its authors on formal aspects of the processes of contracting a theory to eliminate a proposition and revising a theory to introduce a proposition. In the course of the earlier work, Gärdenfors developed general postulates of a more or less equational nature for such processes, whilst Alchourrón and Makinson studied the particular case of contraction functions that are maximal, in the sense of yielding a maximal subset of the theory (or alternatively, of one of its axiomatic bases) that fails to imply the proposition being eliminated. In the present paper, the authors study a broader class, including contraction functions that may be less than maximal. Specifically, they investigate "partial meet contraction functions", which are defined to yield the intersection of some nonempty family of maximal subsets of the theory that fail to imply the proposition being eliminated. Basic properties of these functions are established: it is shown in particular that they satisfy the Gärdenfors postulates, and moreover that they are sufficiently general to provide a representation theorem for those postulates. Some special classes of partial meet contraction functions, notably those that are "relational" and "transitively relational", are studied in detail, and their connections with certain "supplementary postulates" of Gärdenfors investigated, with a further representation theorem established.
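As a formal gloss on the construction just described (standard AGM notation, assumed here rather than quoted from the paper), the remainder set and the partial meet contraction can be written as:

\[
A \bot p = \{\, B \subseteq A : B \nvdash p \text{ and } B \subset B' \subseteq A \text{ implies } B' \vdash p \,\}, \qquad A \mathbin{\dot{-}} p = \bigcap \gamma(A \bot p),
\]

where \(\gamma\) is a selection function picking out some nonempty subfamily of \(A \bot p\) whenever \(A \bot p\) is nonempty (and returning \(\{A\}\) otherwise, so that contracting by a tautology leaves the theory unchanged).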
I argue against the currently popular view that a radical change in theory affects the meaning of theoretical terms, and hence renders pre- and post-shift theories incomparable. I first show how to pose the meaning-change issue without appeal to reified meanings. I contend that arguments against theory-neutral observation languages are faulty, but that even if they were sound, there are semantic devices that allow a theory to refer to the factual basis of a competitor. This suggests a picture of science as the accumulation of truths, with each successive stage being more stable.
Value dimensions of mature theory change in science are considered. It is argued that the interaction of values across theories constitutes the major mechanism of theory change along this dimension. Examples from the history of science describing the details of the mechanism are given.
In this paper I claim that Quinean naturalist accounts of science, which deny that there are any a priori statements in scientific frameworks, cannot account for the foundational role of certain classes of statements in scientific practice. In this I follow Michael Friedman, who claims that certain a priori statements must be presupposed in order to formulate empirical hypotheses. I also show that Friedman's account, in spite of his claims to the contrary, is compatible with a type of non-Quinean naturalism that I sketch. Finally, I show that Friedman's account needs amending because it cannot provide a rational account of theory change. I accomplish this by arguing for a structural realist view of theory change. I show how this view fits well with an account like Friedman's and helps it deal with the problem of theory change while retaining its superiority over Quinean naturalism.
This paper is concerned with formal aspects of the logic of theory change, and in particular with the process of shrinking or contracting a theory to eliminate a proposition. It continues work in the area by the authors and Peter Gärdenfors. The paper defines a notion of safe contraction of a set of propositions, shows that it satisfies the Gärdenfors postulates for contraction and thus can be represented as a partial meet contraction, and studies its properties both in general and under various natural constraints.
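In the standard formulation (notation assumed here, not quoted from the paper), safe contraction is defined relative to a hierarchy, an acyclic relation \(<\) over the set \(A\): an element \(a\) is safe with respect to \(p\) iff \(a\) is not minimal under \(<\) in any minimal subset of \(A\) that implies \(p\), and the contraction keeps exactly what the safe elements secure:

\[
A / p = A \cap \mathrm{Cn}(\{\, a \in A : a \text{ is safe with respect to } p \,\}).
\]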
This paper addresses the problem that Bayesian statistical inference cannot accommodate theory change, and proposes a framework for dealing with such changes. It first presents a scheme for generating predictions from observations by means of hypotheses. An example shows how the hypotheses represent the theoretical structure underlying the scheme. This is followed by an example of a change of hypotheses. The paper then presents a general framework for hypothesis change, and proposes the minimization of the distance between hypotheses as a rationality criterion. Finally, the paper discusses the import of this for Bayesian statistical inference.
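In standard Bayesian terms (my notation, not the paper's), the prediction scheme described here amounts to predicting by averaging over a partition of hypotheses:

\[
P(e_{n+1} \mid e_1, \dots, e_n) = \sum_{h \in H} P(e_{n+1} \mid h, e_1, \dots, e_n)\, P(h \mid e_1, \dots, e_n),
\]

where \(H\) is the current set of hypotheses; a change of hypotheses replaces \(H\) by a new partition \(H'\), and the proposed rationality criterion constrains how far \(H'\) may lie from \(H\).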
Synchronic norms of theory choice, a traditional concern in scientific methodology, restrict the theories one can choose in light of given information. Diachronic norms of theory change, as studied in belief revision, restrict how one should change one's current beliefs in light of new information. Learning norms concern how best to arrive at true beliefs. In this paper, we undertake to forge some rigorous logical relations between the three topics. Concerning learning, we explicate inductive truth conduciveness in terms of optimally direct convergence to the truth, where optimal directness is explicated in terms of reversals and cycles of opinion prior to convergence. Concerning choice, we explicate Ockham's razor and related principles of choice in terms of the information topology of the empirical problem context and show that the principles are necessary for reversal or cycle optimal convergence to the truth. Concerning change, we weaken the standard principles of AGM belief revision theory in intuitive ways that are also necessary for reversal or cycle optimal convergence. Then we show that some of our weakened principles of change entail corresponding principles of choice, completing the triangle of relations between choice, change, and learning.
In this paper I consider two accounts of scientific discovery, Robert Hudson's and Peter Achinstein's. I assess their relative success and I show that while both approaches are similar in promising ways, and address experimental discoveries well, they could address the concerns of the discovery sceptic more explicitly than they do. I also explore the implications of their inability to address purely theoretical discoveries, such as those often made in mathematical physics. I do so by showing that extending Hudson's or Achinstein's account to such cases can sometimes provide a misleading analysis about who ought to be credited as a discoverer. In the final sections of the paper I work out some revisions to the Hudson/Achinstein account by drawing from a so-called structural realist view of theory change. Finally, I show how such a modified account of discovery can answer sceptical critics such as Musgrave or Woolgar without producing misleading analyses about who ought to receive credit as a discoverer in cases from the mathematical sciences. I illustrate the usefulness of this approach by providing an analysis of the case of the discovery of the Casimir effect.
What are the reasons for theory change in economics? The author tries to give a sober answer on the basis of his epistemological model, which reconstructs the internal aspects of theory change. It is conjectured that a more subtle approach, one that includes the external facets, can be provided by the concept of “communicative rationality”. Key words: economics, theory change, internalism, external factors.
This paper dwells upon formal models of changes of beliefs, or theories, which are expressed in languages containing a binary conditional connective. After defining the basic concept of a (non-trivial) belief revision model, I present a simple proof of Gärdenfors's (1986) triviality theorem. I claim that on a proper understanding of this theorem we must give up the thesis that consistent revisions (additions) are to be equated with logical expansions. If negated or 'might' conditionals are interpreted on the basis of autoepistemic omniscience, or if autoepistemic modalities (Moore) are admitted, even more severe triviality results ensue. It is argued that additions cannot be philosophically construed as parasitic (Levi) on expansions. In conclusion, I outline some logical consequences of the fact that we must not expect monotonic revisions in languages including conditionals.
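For orientation, the triviality theorem turns on combining the Ramsey test for conditionals with the preservation condition on revision (standard statements, not quoted from the paper):

\[
\text{(RT)}\quad a > b \in K \iff b \in K \ast a, \qquad \text{(P)}\quad \neg a \notin K \text{ implies } K \subseteq K \ast a;
\]

given minimal background assumptions, Gärdenfors showed that no non-trivial belief revision model can satisfy both.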
Among the six postulates for theory contraction formulated and studied by Alchourrón, Gärdenfors, and Makinson, the postulate of Recovery is the one that has provoked the most controversy. In this article we construct withdrawal functions that do not satisfy Recovery but aim to preserve minimal change, and we relate these withdrawal functions to the AGM contraction functions.
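For reference, the Recovery postulate at issue reads, in the usual AGM notation:

\[
\text{(Recovery)}\qquad K \subseteq \mathrm{Cn}\big((K \mathbin{\dot{-}} p) \cup \{p\}\big),
\]

and withdrawal functions, as the term is standardly used, are contraction-like operations that satisfy the remaining basic postulates while dropping Recovery.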
The paper explores the relativistic implications of the thesis of incommensurability. A semantic form of incommensurability due to semantic variation between theories is distinguished from a methodological form due to variation in methodological standards between theories. Two responses to the thesis of semantic incommensurability are dealt with: the first challenges the idea of untranslatability to which semantic incommensurability gives rise; the second holds that relations of referential continuity eliminate semantic incommensurability. It is then argued that methodological incommensurability poses little risk to the rationality or objectivity of science. For rational theory choice need neither be dictated by an algorithm nor governed by a binding set of rules. The upshot of the discussion is deflationary. There is little prospect for a relativistic conception of science based on inflated claims about the incommensurability of scientific theories.
The theory of theory change has contraction and revision as its central notions. Of these, contraction is the more fundamental. The best-known theory, due to Alchourrón, Gärdenfors, and Makinson, is based on a few central postulates. The most fundamental of these is the principle of recovery: if one contracts a theory with respect to a sentence, and then adds that sentence back again, one recovers the whole theory. Recovery is demonstrably false. This paper shows why, and investigates how one can nevertheless characterize contraction in a theoretically fruitful way. The theory proposed lends itself to implementation, which in turn could yield new theoretical insights. The main proposal is a ‘staining algorithm’ which identifies which sentences to reject when contracting a theory. The algorithm requires one to be clear about the structure of reasons one has for including sentences within one's theory.
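The abstract does not spell out the staining algorithm, so the following is only a hypothetical sketch: the data structure (a map from sentences to their recorded reasons) and the propagation rule are invented for illustration and are not Tennant's actual procedure.

    # Hypothetical sketch of a staining-style contraction pass.
    # A theory is a set of sentences; 'supports' maps a sentence to the
    # sets of sentences (its recorded "reasons") from which it was obtained.

    def stain_contract(theory, supports, target):
        """Remove 'target' and every sentence all of whose recorded reasons
        are undermined by the removal (illustrative only)."""
        stained = {target}
        changed = True
        while changed:
            changed = False
            for sentence in theory - stained:
                reasons = supports.get(sentence, [])
                # A reason survives only if none of its members is stained.
                if reasons and all(r & stained for r in reasons):
                    stained.add(sentence)
                    changed = True
        return theory - stained

    # Example: q is held only because p supports it; r stands on its own.
    theory = {"p", "q", "r"}
    supports = {"q": [{"p"}]}
    print(stain_contract(theory, supports, "p"))   # {'r'}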
Traditional approaches to theory structure and theory change in science do not fare well when confronted with the practice of certain fields of science. We offer an account of contemporary practice in molecular biology designed to address two questions: Is theory change in this area of science gradual or saltatory? What is the relation between molecular biology and the fields of traditional biology? Our main focus is a recent episode in molecular biology, the discovery of enzymatic RNA. We argue that our reconstruction of this episode shows that traditional approaches to theory structure and theory change need considerable refinement if they are to be defended as generally applicable.
(2013). Is scientific theory change similar to early cognitive development? Gopnik on science and childhood. Philosophical Psychology, Vol. 26, No. 1, pp. 109-128. doi: 10.1080/09515089.2011.625114.
The thesis that scientists give greater weight to novel predictions than to explanations of known facts is tested against historical cases in physical science. Several theories were accepted after successful novel predictions but there is little evidence that extra credit was given for novelty. Other theories were rejected despite, or accepted without, making successful novel predictions. No examples were found of theories that were accepted primarily because of successful novel predictions and would not have been accepted if those facts had been previously known.
This two-part article examines the competition between the clonal selection theory and the instructive theory of the immune response from 1957 to 1967. In Part I the concept of a temporally extended theory is introduced, which requires attention to the hitherto largely ignored issue of theory individuation. Factors which influence the acceptability of such an extended theory at different temporal points are also embedded in a Bayesian framework, which is shown to provide a rational account of belief change in science. In Part II these factors, as elaborated in the Bayesian framework, are applied to the case of the success of the clonal selection theory and the failure of the instructive theory.
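The Bayesian framework invoked here can be glossed, in standard form (my notation, not the article's), as comparing the two competing theories by their posterior odds:

\[
\frac{P(T_{\mathrm{CS}} \mid e)}{P(T_{\mathrm{I}} \mid e)} = \frac{P(T_{\mathrm{CS}})}{P(T_{\mathrm{I}})} \cdot \frac{P(e \mid T_{\mathrm{CS}})}{P(e \mid T_{\mathrm{I}})},
\]

with the acceptability factors discussed in Part I entering through the priors and likelihoods assigned to the temporally extended theories at each time.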
After a presentation of some relevant aspects of Chakravartty's semirealism (A Metaphysics for Scientific Realism: Knowing the Unobservable, Cambridge University Press, Cambridge, 2007), this paper addresses two difficulties that appear to be inherent to important components of his proposed metaphysics for scientific realism. First, if particulars and laws are concrete structures, namely actual groupings of causal properties as the semirealist contends, the relation between particulars and laws also becomes a relation between particulars, with some annoying consequences. This worry—and some others—are resolved if laws are taken to be statements and particulars are construed not only as groupings of properties, but as things that contain a non-conceptual ingredient which can be given in perceptual awareness. Second, on the semirealist's view of particulars it becomes difficult to defend an epistemological version of scientific realism according to which we have good reasons to believe that the same things are referred to despite the fact that successive theories may attribute different properties to them.
A comprehensible model is proposed, aimed at analyzing the reasons for theory change in science. According to the model, the origins of scientific revolutions lie not in a clash of fundamental theories with facts, but in a clash of “old” fundamental theories with each other, leading to contradictions that can only be eliminated in a more general theory. The model is illustrated with reference to physics in the early 20th century, the three “old” theories in this case being Maxwellian electrodynamics, statistical mechanics, and thermodynamics. A modern example, concerning the fusion of general relativity and quantum field theory, is also highlighted. Key words: Popper, Kuhn, Lakatos, Feyerabend, Stepin, Bransky, Mamchur, mature theory, structure, Einstein, Lorentz, Boltzmann, Planck, Hawking, De Witt.
There is a great deal of justified concern about continuity through scientific theory change. Our thesis is that, particularly in physics, such continuity can be appropriately captured at the level of conceptual frameworks using conceptual space models. Indeed, we contend that the conceptual spaces of three of our most important physical theories—Classical Mechanics (CM), Special Relativity Theory (SRT), and Quantum Mechanics (QM)—have already been so modelled as phase spaces. Working with their phase-space formulations, one can trace the conceptual changes and continuities in transitioning from CM to QM, and from CM to SRT. By offering a revised severity-ordering of changes that conceptual frameworks can undergo, we provide reasons to doubt the commonly held view that CM is conceptually closer to SRT than to QM.
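As an illustration of what a phase-space formulation provides in the classical case (standard notation, not drawn from the paper), the conceptual space of CM can be read off the space of positions and momenta together with Hamiltonian dynamics:

\[
(q, p) \in \mathbb{R}^{3N} \times \mathbb{R}^{3N}, \qquad \dot{q}_i = \frac{\partial H}{\partial p_i}, \quad \dot{p}_i = -\frac{\partial H}{\partial q_i},
\]

so that the transitions to SRT and to QM show up as structural changes in this space, its metric, or its dynamics.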
‘Changing the Theory of Theory Change: Towards a Computational Approach’ (Tennant [1994]; henceforth CTTC) claimed that the AGM postulate of recovery is false, and that AGM contractions of theories can be more than minimally mutilating. It also described an alternative, computational method for contracting theories, called the Staining Algorithm. Makinson [1995] and Hansson and Rott [1995] criticized CTTC's arguments against AGM theory and its specific proposals for an alternative, computational approach. This paper replies as comprehensively as space allows.
Prioritized bases, i.e., weakly ordered sets of sentences, have been used for specifying an agent’s ‘basic’ or ‘explicit’ beliefs, or alternatively for compactly encoding an agent’s belief state without the claim that the elements of a base are in any sense basic. This paper focuses on the second interpretation and shows how a shifting of priorities in prioritized bases can be used for a simple, constructive and intuitive way of representing a large variety of methods for the change of belief states—methods that have usually been characterized semantically by a system-of-spheres modeling. Among the methods represented are ‘radical’, ‘conservative’ and ‘moderate’ revision, ‘revision by comparison’ in its raising and lowering variants, as well as various constructions for belief expansion and contraction. Importantly, none of these methods makes any use of numbers.
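As a minimal illustration of the data structure involved (a sketch only; the tier placements below are not meant to reproduce Rott's named methods), a prioritized base can be represented as an ordered list of sets of sentences, and revision by priority shift inserts the input at a chosen level:

    # A prioritized base as a list of tiers, ordered from lowest to highest
    # priority; each tier is a set of sentences (plain strings here).
    from typing import List, Set

    Base = List[Set[str]]

    def revise_by_shift(base: Base, sentence: str, level: int) -> Base:
        """Insert the input sentence as a new tier at the given priority level;
        different choices of level correspond to different revision methods."""
        new_base = [tier.copy() for tier in base]
        new_base.insert(level, {sentence})
        return new_base

    base: Base = [{"bird(t) -> flies(t)"}, {"bird(t)"}]
    print(revise_by_shift(base, "~flies(t)", len(base)))   # new tier of highest priority
    print(revise_by_shift(base, "~flies(t)", 0))           # new tier of lowest priority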
Two views of scientific theories dominated the philosophy of science during the twentieth century, the syntactic view of the logical empiricists and the semantic view of their successors. I show that neither view is adequate to provide a proper understanding of the connections that exist between theories at different times. I outline a new approach, a hybrid of the two, that provides the right structural connection between earlier and later theories, and that takes due account of the importance of the mathematical models of a theory and of the various distinct formulations that pick out these models.
This article examines various (in my view) failed or problematic attempts to overcome the limits of logical empiricism in epistemology and philosophy of science. It focuses on Quine's influential critique of that doctrine and on subsequent critiques of Quine that challenge his appeal to the scheme/content dichotomy as a third residual 'dogma' of empiricism (Davidson) or his espousal of a radically physicalist approach that rejects the possibility of quantifying into modal contexts (Marcus). I endorse these criticisms as valid on their own terms but argue that they have arisen within a context of debate that is artificially narrowed through its failure to engage with developments in the 'other' (continental) tradition, among them Husserlian phenomenology and the critical epistemology of thinkers like Bachelard and Canguilhem. I suggest that these provide a promising alternative to some of the more extreme relativist positions adopted in post-Kuhnian philosophy of science. Above all, they offer a means of relating historical approaches concerned with the scientific 'context of discovery' to analytic approaches that typically address logical, conceptual, or procedural issues in the 'context of justification'.