Under what circumstances, if any, are we warranted to assert that a theory is true or at least approximately true? Scientific realists answer that such assertions are warranted only for those theories that enjoy explanatory and predictive success. A number of challenges to this answer have emerged, chief among them the argument from pessimistic meta-induction. According to this challenge, the history of science supplies ample evidence against realism in the form of successful theories that are now considered false. The main realist reaction to this challenge questions the legitimacy of the pessimistic meta-inductivist inference. Advocates of this approach argue that upon closer scrutiny the historical record can be reconciled with scientific realism. When a successful theory is abandoned, not all of its components are discarded but only those that are inessential or idle for the theory’s success. Their abandonment is thus inconsequential for the realist. So long as the essential components survive into the new theory there is no cause for alarm. More precisely, an outdated theory T which enjoyed some measure of success must, according to the realist, be: (i) partially true precisely because some of its theoretical claims are…
One of the first to criticize the verifiability theory of meaning embraced by the logical empiricists, Reichenbach ties the significance of scientific statements to their predictive character, which provides the condition for their testability. While identifying prediction as the task of scientific knowledge, Reichenbach assigns induction a pivotal role, and regards the theory of knowledge as a theory of prediction based on induction. Reichenbach's inductivism is grounded in the frequency notion of probability, of which he proposes a more flexible version than that of Richard von Mises. Unlike von Mises, Reichenbach attempts to account for single-case probabilities, and entertains a restricted notion of randomness, more suitable for practical purposes. Moreover, Reichenbach developed a theory of induction, absent from von Mises's perspective, and argued for the justification of induction. This article outlines the main traits of Reichenbach's inductivism, with special reference to his book Experience and Prediction.
This article suggests a ‘best alternative’ justification of induction (in the sense of Reichenbach) which is based on meta-induction. The meta-inductivist applies the principle of induction to all competing prediction methods which are accessible to her. It is demonstrated, and illustrated by computer simulations, that there exist meta-inductivist prediction strategies whose success is approximately optimal among all accessible prediction methods in arbitrary possible worlds, and which dominate the success of every non-inductive prediction strategy. The proposed justification of meta-induction is mathematically analytical. It implies, however, an a posteriori justification of object-induction based on the experiences in our world.
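The kind of simulation the abstract mentions can be sketched minimally. The world, the two object-level methods, and the "imitate the best" meta-strategy below are all invented for illustration and are far simpler than the strategies the paper actually analyses:

```python
import random

def simulate(rounds=1000, seed=0):
    rng = random.Random(seed)
    # Hypothetical world: each event is 1 with probability 0.7.
    events = [1 if rng.random() < 0.7 else 0 for _ in range(rounds)]

    # Two invented object-level methods: object-induction predicts the
    # majority value observed so far; its rival predicts the minority.
    def inductive(history):
        return 1 if 2 * sum(history) >= len(history) else 0

    def counter_inductive(history):
        return 1 - inductive(history)

    methods = [inductive, counter_inductive]
    scores = [0, 0]   # successes of each object-level method
    meta_score = 0    # successes of the meta-inductivist
    for t, event in enumerate(events):
        history = events[:t]
        predictions = [m(history) for m in methods]
        # The meta-inductivist imitates whichever accessible method
        # has the best success record so far.
        best = max(range(len(methods)), key=lambda i: scores[i])
        meta_score += predictions[best] == event
        for i, p in enumerate(predictions):
            scores[i] += p == event
    return [s / rounds for s in scores], meta_score / rounds

rates, meta_rate = simulate()
print(rates, meta_rate)
```

In the long run the meta-inductivist's success rate tracks that of the best accessible method, which is the optimality property the paper establishes in general form.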
Inductivism is understood as the explication of the degree of confirmation as conditional logical probability. Inductivism is not recommendable in the form of Carnap's λ-system, but tenable in the form of Bayesianism. Objections directed at it are either irrelevant or can be taken account of within Bayesianism.
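Carnap's λ-system mentioned here has a compact predictive rule; a minimal sketch, with the numbers chosen arbitrarily for illustration:

```python
def carnap_c(n_i, n, k, lam):
    """Carnap's lambda-continuum: predictive probability that the next
    individual falls in cell i, given that n_i of n observed individuals
    did, with k cells in the classification and parameter lam >= 0."""
    return (n_i + lam / k) / (n + lam)

# lam -> 0 recovers the straight rule n_i / n; a large lam keeps the
# prediction near the a priori value 1 / k regardless of the evidence.
print(carnap_c(8, 10, 2, 0.0))   # straight rule: 0.8
print(carnap_c(8, 10, 2, 2.0))   # (8 + 1) / 12 = 0.75
```

The single parameter λ thus indexes a whole continuum of inductive methods, which is the feature the objections to the λ-system typically target.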
Falsificationism has dominated 20th-century philosophy of science. It seemed to have eclipsed all forms of inductivism. Yet recent debates have revived a specific form of eliminative inductivism, the basic ideas of which go back to F. Bacon and J.S. Mill. These modern endorsements of eliminative inductivism claim to show that progressive problem solving is possible using induction, rather than falsification, as a method of justification. But this common ground between falsificationism and eliminative inductivism has not led to a detailed investigation into the relationship, if any, which may exist between these two methodologies. This paper reviews several versions of eliminative inductivism, establishes a natural relation between eliminative inductivism and falsificationism, which derives from the distinction between models and theories, and carries out this investigation against a case study of the construction of atom models. The result of the investigation is that falsificationism is a form of eliminative inductivism in the limit of certain constraints.
Philip Kitcher has developed a sort of inductivist-reliabilist justification for scientific realism. After distinguishing his argument from a well-known abductivist one (the "no-miracles" argument), I will argue that Kitcher's proposal cannot adequately meet the antirealist challenge. Firstly, it begs the question against the antirealists; secondly, it can hardly support a plausible, piecemeal, scientific realism. I will explore an alternative inductivist approach that exploits correlations between theoretical properties and empirical success. On my view, its prospects for avoiding the aforementioned shortcomings are better than those of Kitcher's standpoint. I dare say, however, that an inductivist strategy alone cannot satisfy the demands of scientific realism since, in the end, an abductive move may well be mandatory for grounding it.
I. I set out my view that all inference is essentially deductive and pinpoint what I take to be the major shortcomings of the induction rule.
II. The import of data depends on the probability model of the experiment, a dependence ignored by the induction rule. Inductivists admit background knowledge must be taken into account but never spell out how this is to be done. As I see it, that is the problem of induction.
III. The induction rule, far from providing a method of discovery, does not even serve to detect pattern. Knowing that there is uniformity in the universe is no help to discovering laws. A critique of Reichenbach's justification of the straight rule is constructed along these lines.
IV. The induction rule, by itself, cannot account for the varying rates at which confidence in a hypothesis mounts with data. The mathematical analysis of this salient feature of inductive reasoning requires prior probabilities. We also argue, against orthodox statisticians, that prior probabilities make a substantive contribution to the objectivity of inductive methods, viz. to the design of experiments and the selection of decision rules.
V. Carnap's general criticisms of various estimation rules, like the straight rule and the ‘impervious rule’, are seen to be misguided when the prior densities to which they correspond are taken into account.
VI. Analysis of Hempel's definition of confirmation qua formalization of the enumerative (naive) conception of instancehood. We show that, from the standpoint of the quantitative measure P(H|E)/P(H) for the degree to which E confirms H, Hempel's classificatory concept yields correct results only for sampling at large from a finite population with a two-way classification all of whose compositions are equally probable. We extend the analysis to Goodman's paradox, finding cases in which grue-like hypotheses do receive as much confirmation as their opposite numbers. We argue, moreover, the irrelevancy of entrenchment, and maintain that Goodman's paradox is no more than a straightforward counter-example to the enumerative conception of instancehood embodied in Hempel's definition.
VII. We rebut the objection that prior probabilities, qua inputs of Bayesian analysis, can only be obtained by enumerative induction (insofar as they are objective). The divergence in the prior densities of two rational agents is less a function of subjectivity, we maintain, than of vagueness.
VIII. Our concluding remarks stress that, for Bayesians, there is no problem of induction in the usual sense.
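The quantitative measure invoked in section VI, the ratio P(H|E)/P(H), can be computed directly from Bayes' theorem; the prior and likelihoods below are invented purely for illustration:

```python
# Invented example: H has prior 0.3, and E is twice as likely if H holds.
p_h = 0.3
p_e_given_h = 0.8
p_e_given_not_h = 0.4

# Law of total probability, then Bayes' theorem.
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
p_h_given_e = p_e_given_h * p_h / p_e

ratio = p_h_given_e / p_h   # > 1 exactly when E confirms H on this measure
print(round(ratio, 3))
```

Note that the ratio equals P(E|H)/P(E), so on this measure the degree of confirmation depends on how much more expected E is under H than it is overall.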
It is often noted that when someone with a tertiary degree in a scientific field promotes an anti-science-establishment, antiscience, or pseudoscience agenda, they are very often an engineer, dentist, surgeon, or medical practitioner. While this does not mean that all members of these professions or disciplines are antiscience, of course, the higher frequency of pseudoscience among them is indicative of what I call the “deductivist mindset” regarding science itself. Opposing this is the “inductivist mindset”, a view that has been deprecated among philosophers since Popper. Roughly, the deductivist mindset tends to see problems as questions that can be resolved by deduction from known theory or principle, while the inductivist sees problems as questions to be resolved by discovery. These form cognitive poles, which nobody ever purely instantiates, but a cognitive tendency to be a deductivist may explain why some people find results that conflict with prior theoretical commitments, whether scientific or not, unacceptable. The deductivist tends to be a cognitive conservative, where the inductivist tends to be a cognitive progressive, and the conservative mindset leads to a ressentiment about modernism, and hence about certain scientific results, more often, or so I shall argue in this chapter.
Georg J. W. Dorn (1991). Inductive Support. In Gerhard Schurz & Georg J. W. Dorn (eds.), Advances in Scientific Philosophy. Essays in Honour of Paul Weingartner on the Occasion of the 60th Anniversary of his Birthday. Rodopi.
I set up two axiomatic theories of inductive support within the framework of Kolmogorovian probability theory. I call these theories ‘Popperian theories of inductive support’ because I think that their specific axioms express the core meaning of the word ‘inductive support’ as used by Popper (and, presumably, by many others, including some inductivists). As is to be expected from Popperian theories of inductive support, the main theorem of each of them is an anti-induction theorem, the stronger one of them saying, in fact, that the relation of inductive support is identical with the empty relation. It seems to me that an axiomatic treatment of the idea(s) of inductive support within orthodox probability theory could be worthwhile for at least three reasons. Firstly, an axiomatic treatment demands that the builder of a theory of inductive support state clearly, in the form of specific axioms, what he means by ‘inductive support’. Perhaps the discussion of the new anti-induction proofs of Karl Popper and David Miller would have been more fruitful if they had given an explicit definition of what inductive support is or should be. Secondly, an axiomatic treatment of the idea(s) of inductive support within Kolmogorovian probability theory might be accommodating to those philosophers who do not completely trust Popperian probability theory for having theorems which orthodox Kolmogorovian probability theory lacks; a transparent derivation of anti-induction theorems within a Kolmogorovian frame might bring additional persuasive power to the original anti-induction proofs of Popper and Miller, developed within the framework of Popperian probability theory. Thirdly, one of the main advantages of the axiomatic method is that it facilitates criticism of its products: the axiomatic theories. On the one hand, it is much easier than usual to check whether those statements which have been distinguished as theorems really are theorems of the theory under examination. On the other hand, after we have convinced ourselves that these statements are indeed theorems, we can take a critical look at the axioms, especially if we have a negative attitude towards one of the theorems. Since anti-induction theorems are not popular at all, the adequacy of some of the axioms they are derived from will certainly be doubted. If doubt should lead to a search for alternative axioms, sheer negative attitudes might develop into constructive criticism and even lead to new discoveries.

I proceed as follows. In section 1, I start with a small but sufficiently strong axiomatic theory of deductive dependence, closely following Popper and Miller (1987). In section 2, I extend that starting theory to an elementary Kolmogorovian theory of unconditional probability, which I extend, in section 3, to an elementary Kolmogorovian theory of conditional probability, which in its turn gets extended, in section 4, to a standard theory of probabilistic dependence, which also gets extended, in section 5, to a standard theory of probabilistic support, the main theorem of which will be a theorem about the incompatibility of probabilistic support and deductive independence. In section 6, I extend the theory of probabilistic support to a weak Popperian theory of inductive support, which I extend, in section 7, to a strong Popperian theory of inductive support. In section 8, I reconsider Popper's anti-inductivist theses in the light of the anti-induction theorems. I conclude the paper with a short discussion of possible objections to our anti-induction theorems, paying special attention to the topic of deductive relevance, which has so far been neglected in the discussion of the anti-induction proofs of Popper and Miller.
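For orientation, the core of the Popper–Miller anti-induction argument referred to throughout can be stated in two lines. Writing $s(x,e) = p(x \mid e) - p(x)$ for the support $e$ lends $x$, and using the logical equivalence of $h$ with $(h \lor e) \land (h \lor \lnot e)$, one can verify directly that

```latex
\begin{align*}
  s(h,e) &= s(h \lor e,\, e) \;+\; s(h \lor \lnot e,\, e),\\
  s(h \lor \lnot e,\, e) &= \bigl(1 - p(h \mid e)\bigr)\bigl(p(e) - 1\bigr) \;\le\; 0.
\end{align*}
```

Since $e$ entails $h \lor e$, the first summand is purely deductive support; the remainder, the part of $h$ that outstrips $e$, is never positively supported. (This is only a compact restatement of the published Popper–Miller result, not of the specific axioms developed in the paper.)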
This paper seeks to show that Achinstein's recent attempt to establish that both parties to the wave-particle debate in 19th-century optics were Bayesian conditionalizers forces us to ignore several of the key conceptual issues in that controversy, not least the role of the vera causa principle and, more important still, the role of positive evidence in securing acceptance for the wave theory of light.
Edgar Allan Poe’s standing as a literary figure who drew on (and sometimes dabbled in) the scientific debates of his time makes him an intriguing character for any exploration of the historical interrelationship between science, literature and philosophy. His sprawling ‘prose-poem’ Eureka (1848), in particular, has sometimes been scrutinized for anticipations of later scientific developments. By contrast, the present paper argues that it should be understood as a contribution to the raging debates about scientific methodology at the time. This methodological interest, which is echoed in Poe’s ‘tales of ratiocination’, gives rise to a proposed new mode of broadly abductive inference, which Poe attributes to the hybrid figure of the ‘poet-mathematician’. Without creative imagination and intuition, science would necessarily remain incomplete, even by its own standards. This concern with imaginative (abductive) inference ties in nicely with his coherentism, which grants pride of place to the twin virtues of Simplicity and Consistency, which must constrain imagination lest it degenerate into mere fancy.
Between 1987 and 1994 I sent 20 letters to Karl Popper. Most concerned questions about his anti-induction proofs and his probability theory; some concerned the organizational and substantive preparation of a meeting with him in Kenly on 22 March 1989 (which will not be discussed here); some, finally, dealt wholly or in part with non-technical matters (likewise left aside in the present report). From Karl Popper I received 10 letters in this period. The most significant is his seventh, consisting of three parts, written on 21, 22 and 23 October 1992, in which he developed a preliminary form of the definition of probabilistic independence that he presented to the philosophy-of-science community in 1994 in the new appendix *XX of the 10th edition of his Logik der Forschung (LdF). The most touching is his last, written on 26 July 1994, in which, despite exhaustion, he describes with humour how laborious the printing of appendix *XX turned out to be. My report is organized both chronologically and systematically: the first, comparatively few letters, mostly written in 1987, deal with induction; the large remainder, centred on 1992, deals with probability theory. Chapter 1, on induction, is divided into four sections: 1.1 The Popper/Miller argument: a reconstruction; 1.2 Karl Popper's letter of 25 August 1987: deductive support; 1.3 Karl Popper's letter of 29 September 1987: more on deductive support; 1.4 Genuinely inductive support and weakening: two proofs of my own. Chapter 2, on probability, is likewise divided into four sections: 2.1 A lack of surplus laws in the Logic of Scientific Discovery; 2.2 Probabilistic independence; 2.3 Probability theory and probability semantics; 2.4 The new definition of independence in appendix *XX of LdF.
This work is in two parts. The main aim of part 1 is a systematic examination of deductive, probabilistic, inductive and purely inductive dependence relations within the framework of Kolmogorov probability semantics. The main aim of part 2 is a systematic comparison of (in all) 20 different relations of probabilistic (in)dependence within the framework of Popper probability semantics (for Kolmogorov probability semantics does not allow such a comparison). Added to this comparison is an examination of (in all) 15 purely inductive dependence relations.

Part 1 leads in an axiomatic step-by-step development from the elementary classical truth-value semantics of a sentential-logical language, called ‘L’ (chapter 1), to the elementary Kolmogorov probability semantics of L (chapter 2), which is then extended to four axiomatic semantical theories of dependence relations between the formulae of L. First the elementary Kolmogorov probability semantics of L is extended to a theory, called ‘Kdd’, of the relations of deductive dependence and deductive independence between formulae of L (chapter 3). Then Kdd is extended to a theory, called ‘Kpd1’, of the degree to which formulae of L probabilistically depend on each other in regard to a given probability distribution on the set of all formulae of L (chapter 4). Kpd1, in its turn, gets extended to a theory, called ‘Kpd2’, of the relations of probabilistic dependence and independence, relativized to unary Kolmogorov probability functions defined on L (chapter 5). Then Kpd2 is extended to a theory, called ‘Kid’, of the relations of inductive dependence and inductive independence, again relativized to unary Kolmogorov probability functions defined on L (chapter 6). Finally, Kid is extended to a theory, called ‘Kpid’, of the relations of purely inductive positive and negative dependence, relativized to unary Kolmogorov probability functions defined on L (chapter 7).
Chapter 1, which deals with the familiar notions of truth-value functions, tautologies, consequence relations and relations of logical opposition, is naturally the shortest chapter of part 1.

In chapter 2, the elementary classical semantics of L is extended to the elementary Kolmogorov probability semantics of L, i.e. to an axiomatic theory of unary and of binary Kolmogorov probability functions defined on the set of formulae of L. Because of the elementary character of this theory, chapter 2 is also rather short.

Chapter 3 introduces the first theory of dependence relations, to wit: Kdd, the theory of deductive (in)dependence between formulae of L. I follow here the well-known idea of Popper and Miller, who have used it in a famous discussion on the nature of probabilistic support for their arguments that probabilistic support is deductive, not inductive. I develop Kdd in the form of about 100 theorems, making ample use of the fact that deductive independence is nothing but subcontrary opposition, and close with a remark on the fundamental difference between deductive and logical dependence, two relations the ideas of which are all too easily mixed up.

Chapters 4 and 5 deal extensively with the traditional ideas of probabilistic (in)dependence, applied to formulae rather than to events. As always, I proceed axiomatically in a step-by-step process under systematic viewpoints and obtain about 300 theorems in this way. In the formulation of the theorems, I took special care to state clearly and expressly so-called tacit assumptions, especially those concerning the probability values of the formulae said to be dependent on each other. These assumptions are usually missing in the literature, due either to economy of writing or to sloppiness of thinking.
Presumably, both chapters contain little that is new, their value lying more in the systematic grouping and organic development of the theorems than in the newness of these.

In chapter 6, I extend the axiomatic theory of probabilistic (in)dependence elaborated in chapter 5 to an axiomatic theory of inductive (in)dependence by requiring of the relation of inductive (in)dependence that it be probabilistic (in)dependence, but not also logical implication or logical opposition. I point out the differences between probabilistic and inductive (in)dependence by means of some 60 theorems and close my examination of inductive (in)dependence by considering its relationship to the notion of support in the philosophy of science.

Finally, in chapter 7, the last of part 1, I take the step from inductive dependence to what I call ‘purely inductive dependence’ by combining the idea of inductive dependence with that of deductive independence in a way which is suggested by writings of Popper and Miller. I arrive at two noteworthy theorems. Firstly, there is indeed no purely inductive support. But secondly, and perhaps amazingly, countersupport is purely inductive.

Whereas the probabilistic framework of part 1 of the present work is Kolmogorov probability semantics, the framework of part 2 is Popper probability semantics, which is not only worth examining as a fascinating alternative to orthodox Kolmogorov probability semantics, but also allows us to examine dependence relations more deeply than Kolmogorov probability semantics does. Part 2 leads, again in an axiomatic step-by-step development, from the basic Popper probability semantics of L, called ‘Pb’ (chapter 8), via a probabilistic theory of logical attributes, called ‘Ps’ (chapter 9), to four axiomatic semantical theories of dependence relations between the formulae of L.
First, Ps is extended to a theory, called ‘Pdd’, of the relations of deductive dependence and deductive independence between formulae of L (chapter 10). Then Pdd is extended to a theory, called ‘Ppd’, of (in all) 20 relations of probabilistic (in)dependence, relativized to binary Popper probability functions defined on L (chapter 11). Ppd, in its turn, is extended to a theory, called ‘Pid’, of (in all) 10 relations of inductive dependence, again relativized to binary Popper probability functions defined on L (first part of chapter 12). Finally, Pid is extended to a theory, called ‘Ppid’, of (in all) 15 relations of purely inductive positive and negative dependence, relativized to binary Popper probability functions defined on L (second part of chapter 12).

Chapter 8, the first chapter of part 2 of the present work, is entirely preparatory. It introduces the axioms and about 180 theorems (150 of them together with their proofs) of basic Popper probability semantics in order to set this kind of semantics under way.

Then, in chapter 9, basic Popper probability semantics is extended to a probabilistic theory of logical properties of and relations between the formulae of L. Although I think that the way I did this extension is of some interest in itself, the main task of chapter 9 is again a preparatory one: to yield the indispensable lemmata (about 90 in number) for the theorems concerning probabilistic dependence relations in chapter 11 and concerning inductive dependence relations in chapter 12.

Chapter 10 brings the extension of Ps to the theory Pdd of deductive (in)dependence. Only half a dozen theorems are noted here for later use in the Pdd-extensions Ppd and Ppid. In view of the over 100 theorems already gained on this topic in the Kolmogorovian framework (cf. chapter 3), a similarly extensive elaboration of Pdd would have been superfluous.

Chapter 11 is the most important one of part 2.
It consists of a systematic comparison of 20 probabilistic (in)dependence concepts by means of about 230 theorems, obtained within the axiomatic theory Ppd, which is built up as an extension of Pdd. The main points of comparison were: differences in logical strength; reflexivity and symmetry; behaviour under the condition that the probability values of the formulae in question are extreme. It turned out that each of the examined concepts violates a strong and straightforward version of the intuitive requirement that probabilistic dependence should go with logical dependence. Whereas the corresponding chapter 5 in part 1 of the present work may not have led to new theorems, chapter 11 yields dozens of them in the process of comparison of concepts of dependence and independence which had, as far as I know, never before been treated in a single theoretical framework. With Popper probability semantics, this framework has become available, and here I have simply made full use of it.

In chapter 12, I extend the theory Ppd of probabilistic (in)dependence to the theories Pid and Ppid of inductive and purely inductive dependence, in a way very similar to that in which I have extended the theory Kpd2 to the theories Kid and Kpid in chapters 6 and 7. The first main result of Kpid (roughly: there is no purely inductive support) could be repeated for four of the five purely inductive positive dependence relations considered in chapter 12, whereas the second main result of Kpid (roughly: there is purely inductive countersupport) could be repeated for each of the five examined purely inductive negative dependence relations. Chapter 12 closes with a brief recapitulation and critical discussion of the main results.
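The first main result quoted here (no purely inductive support) can be spot-checked numerically on the Popper–Miller-style reading of "purely inductive": the component of e's support for h that goes beyond the deductive consequence h ∨ e, i.e. the support e gives to h ∨ ¬e. This reading, and the check below, are mine, not the author's formal apparatus:

```python
import random

rng = random.Random(1)
for _ in range(1000):
    # Random distribution over the four state descriptions of h and e:
    # (h & e, h & ~e, ~h & e, ~h & ~e), each with nonzero weight.
    w = [rng.random() + 1e-9 for _ in range(4)]
    total = sum(w)
    p_he, p_hne, p_nhe, p_nhne = (x / total for x in w)
    p_e = p_he + p_nhe
    p_h_or_note = p_he + p_hne + p_nhne        # p(h or not-e)
    p_h_or_note_given_e = p_he / p_e           # (h or not-e) & e  is  h & e
    # The non-deductive component of h is never positively supported by e:
    assert p_h_or_note_given_e - p_h_or_note <= 1e-12
print("no purely inductive support found in 1000 random distributions")
```

The inequality holds analytically as well, since the difference equals (1 − p(h|e))(p(e) − 1), which is never positive; the simulation merely illustrates it.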
Is there a universal set of rules for discovering and testing scientific hypotheses? Since the birth of modern science, philosophers, scientists, and other thinkers have wrestled with this fundamental question of scientific practice. Efforts to devise rigorous methods for obtaining scientific knowledge include the twenty-one rules Descartes proposed in his Rules for the Direction of the Mind and the four rules of reasoning that begin the third book of Newton's Principia, and continue today in debates over the very possibility of such rules. Bringing together key primary sources spanning almost four centuries, Science Rules introduces readers to scientific methods that have played a prominent role in the history of scientific practice. Editor Peter Achinstein includes works by scientists and philosophers of science to offer a new perspective on the nature of scientific reasoning. For each of the methods discussed, he presents the original formulation of the method, selections written by a proponent of the method together with an application to a particular scientific example, and a critical analysis of the method that draws on historical and contemporary sources. The methods included in this volume are: Cartesian rationalism, with an application to Descartes' laws of motion; Newton's inductivism and the law of gravity; two versions of hypothetico-deductivism (those of William Whewell and Karl Popper) and the nineteenth-century wave theory of light; and Paul Feyerabend's principle of proliferation and Thomas Kuhn's views on scientific values, both of which deny that there are universal rules of method, with an application to Galileo's tower argument.
Included also is a famous nineteenth-century debate about scientific reasoning between the hypothetico-deductivist William Whewell and the inductivist John Stuart Mill and an account of the realism-antirealism dispute about unobservables in science, with a consideration of Perrin's argument for the existence of molecules in the early twentieth century.
The cognitive neurosciences are based on the idea that the level of neurons or neural networks constitutes a privileged level of analysis for the explanation of mental phenomena. This paper advances several arguments to the effect that this presumption is ill-conceived and unwarranted in light of what is currently understood about the physical principles underlying mental achievements. It then scrutinizes the question of why such conceptions nevertheless currently prevail in many areas of psychology. The paper argues that corresponding conceptions are rooted in four different aspects of our common-sense conception of mental phenomena and their explanation, which are illegitimately transferred to scientific enquiry. These four aspects pertain to the notion of explanation, to conceptions about which mental phenomena are singled out for enquiry, to an inductivist epistemology, and, in the wake of behavioristic conceptions, to a bias favoring investigations of input–output relations at the expense of enquiries into internal principles. To the extent that the cognitive neurosciences methodologically adhere to these tacit assumptions, they are prone to turn into a largely atheoretical and data-driven endeavor while at the same time enhancing the prospects for receiving widespread public appreciation of their empirical findings.
This paper seeks to defend the following conclusions: The program advanced by Carnap and other necessarians for probability logic has little to recommend it except for one important point. Credal probability judgments ought to be adapted to changes in evidence or states of full belief in a principled manner, in conformity with the inquirer's confirmational commitments, except when the inquirer has good reason to modify his or her confirmational commitment. Probability logic ought to spell out the constraints on rationally coherent confirmational commitments. In the case where credal judgments are numerically determinate, confirmational commitments correspond to Carnap's credibility functions, mathematically represented by so-called confirmation functions. Serious investigation of the conditions under which confirmational commitments should be changed ought to be a prime target for critical reflection. The necessarians were mistaken in thinking that confirmational commitments are immune to legitimate modification altogether. But their personalist or subjectivist critics went too far in suggesting that we might dispense with confirmational commitments. There is room for serious reflection on conditions under which changes in confirmational commitments may be brought under critical control. Undertaking such reflection need not become embroiled in the anti-inductivism that has characterized the work of Popper, Carnap and Jeffrey and narrowed the focus of students of logical and methodological issues pertaining to inquiry.
Many scientists believe that there is a uniform, interdisciplinary method for the practice of good science. The paradigmatic examples, however, are drawn from classical experimental science. Insofar as historical hypotheses cannot be tested in controlled laboratory settings, historical research is sometimes said to be inferior to experimental research. Using examples from diverse historical disciplines, this paper demonstrates that such claims are misguided. First, the reputed superiority of experimental research is based upon accounts of scientific methodology (Baconian inductivism or falsificationism) that are deeply flawed, both logically and as accounts of the actual practices of scientists. Second, although there are fundamental differences in methodology between experimental scientists and historical scientists, they are keyed to a pervasive feature of nature, a time asymmetry of causation. As a consequence, the claim that historical science is methodologically inferior to experimental science cannot be sustained.
Francis Bacon (1561–1626) wrote that good scientists are not like ants (mindlessly gathering data) or spiders (spinning empty theories). Instead, they are like bees, transforming nature into a nourishing product. This essay examines Bacon's "middle way" by elucidating the means he proposes to turn experience and insight into understanding. The human intellect relies on "machines" to extend perceptual limits, check impulsive imaginations, and reveal nature's latent causal structure, or "forms." This constructivist interpretation is not intended to supplant inductivist or experimentalist interpretations, but is designed to explicate Bacon's account of science as a collaborative project with several interdependent methodological goals.
As climate policy decisions are decisions under uncertainty, being based on a range of future climate change scenarios, it becomes a crucial question how to set up this scenario range. Failing to comply with the precautionary principle, the scenario methodology widely used in the Third Assessment Report of the Intergovernmental Panel on Climate Change (IPCC) seems to violate international environmental law, in particular a provision of the United Nations Framework Convention on Climate Change. To place climate policy advice on a sound methodological basis would imply that climate simulations which are based on complex climate models had, in stark contrast to their current hegemony, hardly an epistemic role to play in climate scenario analysis at all. Their main function might actually consist in ‘foreseeing future ozone-holes’. In order to argue for these theses, I explain first of all the plurality of climate models used in climate science by the failure to avoid the problem of underdetermination. As a consequence, climate simulation results have to be interpreted as modal sentences, stating what is possibly true of our climate system. This indicates that climate policy decisions are decisions under uncertainty. Two general methodological principles which may guide the construction of the scenario range are formulated and contrasted with each other: modal inductivism and modal falsificationism. I argue that modal inductivism, being the methodology implicitly underlying the third IPCC report, is severely flawed. Modal falsificationism, representing the sound alternative, would in turn require an overhaul of the IPCC practice.
An approach to inference to the best explanation integrating a Popperian conception of natural laws together with a modified Hempelian account of explanation, on the one hand, and Hacking's law of likelihood (in its nomic guise), on the other, which provides a robust abductivist model of science that appears to overcome the obstacles that confront its inductivist, deductivist, and hypothetico-deductivist alternatives. This philosophy of science clarifies and illuminates some fundamental aspects of ontology and epistemology, especially concerning the relations between frequencies and propensities. Among the most important elements of this conception is the central role of degrees of nomic expectability in explanation, prediction, and inference, for which this investigation provides a theoretical defense.
The non-justificationist deductivism (or critical rationalism) of Karl Popper constitutes the only approach to human knowledge, including of course the natural and social sciences, that is capable of overcoming all the failings, and the plain contradictions, of the traditional doctrine of inductivism and of its modern incarnation, Bayesianism.
Some philosophers of science suggest that philosophical assumptions must influence historical scholarship, because history (like science) has no neutral data and because the treatment of any particular historical episode is going to be influenced to some degree by one's prior philosophical conceptions of what is important in science. However, if the history of science must be laden with philosophical assumptions, then how can the history of science be evidence for the philosophy of science? Would not an inductivist history of science confirm an inductivist philosophy of science and a conventionalist history of science confirm a conventionalist philosophy of science? I attempt to resolve this problem; essentially, I deny the claim that the history of science must be influenced by one's conception of what is important in science — one's general philosophy of science. To accomplish the task I look at a specific historical episode, together with its history, and draw some metamethodological conclusions from it. The specific historical episode I examine is Descartes' critique of Galileo's scientific methodology.
The justification of induction is of central significance for cross-cultural social epistemology. Different ‘epistemological cultures’ do not only differ in their beliefs, but also in their belief-forming methods and evaluation standards. For an objective comparison of different methods and standards, one needs (meta-)induction over past successes. A notorious obstacle to the problem of justifying induction lies in the fact that the success of object-inductive prediction methods (i.e., methods applied at the level of events) can neither be shown to be universally reliable (Hume's insight) nor to be universally optimal. My proposal towards a solution of the problem of induction is meta-induction. The meta-inductivist applies the principle of induction to all competing prediction methods that are accessible to her. By means of mathematical analysis and computer simulations of prediction games I show that there exist meta-inductive prediction strategies whose success is universally optimal among all accessible prediction strategies, modulo a small short-run loss. The proposed justification of meta-induction is mathematically analytical. It implies, however, an a posteriori justification of object-induction based on the experiences in our world. In the final section I draw conclusions about the significance of meta-induction for the social spread of knowledge and the cultural evolution of cognition, and I relate my results to other simulation results which utilize meta-inductive learning mechanisms.
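The prediction-game idea behind this abstract can be illustrated with a toy simulation. Everything below is an illustrative assumption rather than the author's actual model: the biased binary event sequence, the predictor names and guessing rates, and the simple "imitate the best record so far" meta-rule (the published strategies use attractivity-weighted variants). The sketch only shows the qualitative point that a meta-inductivist tracking past success approaches the success rate of the best accessible predictor, modulo a short-run loss:

```python
import random

random.seed(0)  # fixed seed so the toy run is reproducible

ROUNDS = 2000
# Assumed event sequence: a biased coin, event 1 with probability 0.7.
events = [1 if random.random() < 0.7 else 0 for _ in range(ROUNDS)]

# Hypothetical object-level predictors, each guessing 1 at a fixed rate.
predictors = {"p_low": 0.2, "p_mid": 0.5, "p_high": 0.8}

scores = {name: 0 for name in predictors}  # running success counts
meta_score = 0

for event in events:
    # Each predictor makes this round's guess.
    guesses = {name: (1 if random.random() < rate else 0)
               for name, rate in predictors.items()}
    # Meta-inductivist: copy the guess of the best-performing predictor so far.
    best_so_far = max(scores, key=scores.get)
    meta_guess = guesses[best_so_far]
    # Update success records.
    for name, guess in guesses.items():
        scores[name] += (guess == event)
    meta_score += (meta_guess == event)

best_rate = max(scores.values()) / ROUNDS
meta_rate = meta_score / ROUNDS
print(f"best object-level predictor: {best_rate:.3f}, meta-inductivist: {meta_rate:.3f}")
```

After a short learning phase the meta-inductivist locks onto the most successful predictor, so its overall success rate sits within a small margin of the best predictor's, which is the "universally optimal modulo a small short-run loss" phenomenon in miniature.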
Duhem first expounds the holistic thesis, according to which an experimental test always involves several hypotheses, in articles dating from the 1890s. Poincaré's analysis of a recent experiment in optics provides the incentive, but Duhem generalizes this analysis and develops a highly original methodological position. He is led to reject inductivism. I will endeavor to show the crucial role history of science comes to play in the development of Duhem's holism.
There’s a tendency to suppose that a naturalist is automatically, by virtue of her naturalism, committed to some particular view of logic. These days, for example, the classical Quinean picture is sometimes taken to be the naturalistic standard: logic lies at the center of the web of belief; remote from sense experience, but widely confirmed by its role in all our successful theorizing; a posteriori like the rest, but also the most resistant to change, given the principle of minimum mutilation; and thus apparently, or even practically, a priori. But others, at other times, have held that other views of logic followed directly from naturalism, say psychologism, or simple inductivism, or some form of linguistic conventionalism. The trouble is that ‘naturalism’ means something different in each case, or that it comes encumbered with various inessential add-ons (like holism).
The place of induction in the framing and test of scientific hypotheses is investigated. The meaning of 'induction' is first equated with generalization on the basis of case examination. Two kinds of induction are then distinguished: the inference of generals from particulars (first degree induction), and the generalization of generalizations (second degree induction). Induction is claimed to play a role in the framing of modest empirical generalizations and in the extension of every sort of generalization--not however in the invention of high-level hypotheses containing theoretical predicates. It is maintained, on the other hand, that induction by enumeration is essential in the empirical test of the lowest-level consequences of scientific theories, since it occurs in the drawing of "conclusions" from the examination of empirical evidence. But it is also held that the empirical test is insufficient, and must be supplemented with theorification, or the expansion of isolated hypotheses into theories. Refutation is not viewed as a substitute for confirmation but as its complement, since the very notion of unfavorable case is meaningful only in connection with the concept of positive instance. Although the existence of an inductive method is disclaimed, it is maintained that the various patterns of plausible reasoning (inductive inference included) are worth being investigated. It is concluded that scientific research follows neither the advice of inductivism nor the injunction of deductivism, but takes a middle course in which induction is instrumental both heuristically and methodologically, although the over-all pattern of research is hypothetico-deductive.
Despite his well‐known deductivism, in his early (unpublished) writings, Popper held an inductivist position. Up to 1929 epistemology entered Popper's reflections only insofar as the problem was that of justifying the scientific character of the fields of research he was engaged with. However, in that year, while surveying the history of non‐Euclidean geometries, Popper explicitly discussed the cognitive status of geometry without referring to psycho‐pedagogical aspects, thus turning from cognitive psychology to the logic and methodology of science. As a consequence of his reflections on the problematic relationship between geometrical‐mathematical constructions and physical reality, Popper was able to get over a too direct notion of such a relationship, cast doubts on inductive inference, and started conceiving in a new (strictly non‐inductivist) manner the relationship between theoretical and observational propositions.
Galileo's Philosophy of Science - or: Contra Feyerabend. In analyzing Galileo's methodology, philosophers of science were using, misusing, and abusing his ideas rather unashamedly to suit their own purposes. Like so many others before him, Paul Feyerabend had come to the conclusion that his methodological ideas might gain momentum by demonstrating their compatibility with those of Galileo. The reinterpretation of Galileo as a true, though disguised, anarchist was considered by Feyerabend as the most forceful, and indeed conclusive, case against rationalism in methodology which might be conceived, in view of the privileged position ascribed to Galileo by both philosophers and historians of science. The paper argues - against Feyerabend - that Galileo was not a methodological anarchist, either in theory or in practice. He had firm methodological convictions that remained basically the same throughout his entire career. In his view, essential and accidental causes of phenomena were not given by experience. Although mathematical and geometrical analysis was needed to discriminate between them, experience and experiment were considered by Galileo from his middle period on as a means to identify, among the set of explanations demonstrable "ex suppositione" as being mathematically correct, those which could in addition be applied to reality. Thus, Galileo was neither an inductivist nor a naive falsificationist, nor a Copernican zealot adapting his methodology to the needs of his presumed fight for heliocentrism, come what may. Only after the reconstruction of mechanics was in a fairly advanced stage, and after his own telescopic observations had provided independent evidence in favor of the new astronomy, was Galileo in a position to appreciate the Copernican system as a most forceful ally in his fight for the recognition of his physical achievements. Through the end of his life, his view of the heliocentric system remained rather traditional in adhering firmly to the principles of epicyclic and circular motion, as far as the heavens were concerned.
In this paper I argue that neuroscience has been harmed by the widespread adoption of seriously inadequate methodologies or philosophies of science - most notably inductivism and falsificationism. I argue that neuroscience, in seeking to understand the human brain and mind, needs to follow in the footsteps of evolution.
Naturalized epistemology—the recent attempt to transform the theory of knowledge into a branch of natural science—is often criticized for dispensing with the distinctively philosophical content of epistemology. In this dissertation, I argue that epistemologists are correct to reject naturalism, but that new arguments are needed to show why this is so. I establish my thesis first by evaluating two prominent varieties of naturalism—optimistic and pessimistic—and then by offering a proposal for how a new version of non-naturalistic epistemology must move forward. Optimistic naturalism attempts to use scientific methods to give positive answers to traditional epistemological questions. Epistemologists, for example, are urged to draw on psychology and evolutionary biology in order to show our beliefs are justified. I argue that this project fails. First, the naturalist’s thesis that theory is underdetermined by evidence poses difficulties for the optimist’s attempt to show that our beliefs are justified, even according to naturalized standards. Second, while critics usually contest naturalists’ logical right to use the concept of normative justification, I suggest that a deeper problem is with the naturalists’ use of the concept of belief. Naturalistic philosophy of mind, while perhaps acceptable for other purposes, does not deliver a concept of “belief” consistent with the constraints and needs of naturalized epistemology. Pessimistic naturalism—Quine’s project—takes it for granted that “belief” is problematic and logical justification elusive, and instead offers a pragmatic account of the development of our theory of the world. This project, while deeply unsatisfactory to the traditional epistemologist, also faces the challenge of privileging scientific discourse over other pragmatically successful modes of discourse. Whatever its merits, we can undermine its motivation by challenging the underdetermination thesis it rests on. We can do this by appealing to facts about scientific practice that undermine the conception of confirmation driving the thesis, and by challenging some philosophical preconceptions, in order to make room for a new brand of inductivist foundationalism.
A study of the development of Mill's thought through successive editions of _A System of Logic_. His view of the genesis of most scientific laws, it is argued, progressively shifted from inductivism to hypothetico-deductivism. Mill's analysis of hypotheses and of methods for their assessment is considered in detail. New light is shed on relations between Mill's metascience and that of William Whewell.
The question whether attempts to vindicate induction should be abandoned in favor of (other) problems of rationality is pressing and difficult. How may we decide rationally when standards for rationality are at issue? It may be useful to first know how we have decided in the past. Whewell's philosophy of science and the reaction to it are discussed. Whewell's contemporaries mistakenly thought that only an inductivist research program could produce an adequate theory of rationality. But this very move violated their own standards of rationality. We should now avoid making the same mistake again. We should return to Whewell's rejected proposal to make the philosophy of science historical and seek thereby to improve rational practice.
This volume brings together eleven essays by the distinguished philosopher of science, Peter Achinstein. The unifying theme is the nature of the philosophical problems surrounding the postulation of unobservable entities such as light waves, molecules, and electrons. How, if at all, is it possible to confirm scientific hypotheses about "unobservables"? Achinstein examines this question as it arose in actual scientific practice in three nineteenth-century episodes: the debate between particle and wave theorists of light, Maxwell's kinetic theory of gases, and J.J. Thomson's discovery of the electron. The book contains three parts, each devoted to one of these topics, beginning with an essay presenting the historical background of the episode and an introduction to the philosophical issues. There is an illuminating evaluation of various scientific methodologies, including hypothetico-deductivism, inductivism, and the method of independent warrant which combines features of the first two. Achinstein assesses the philosophical validity of both nineteenth-century and modern answers to questions about unobservables, and presents and defends his own solutions.
In a recent Philosophy of Science article Gerhard Schurz proposes meta-inductivistic prediction strategies as a new approach to Hume's problem of induction. This comment examines the limitations of Schurz's approach. It can be proven that the meta-inductivist approach no longer works if the meta-inductivists have to face an infinite number of alternative predictors. Given this limitation, it remains doubtful whether the meta-inductivist can provide a full solution to the problem of induction.