This book comes to the rescue of scientific realism, showing that reports of its death have been greatly exaggerated. Philosophical realism holds that the aim of a particular discourse is to make true statements about its subject matter. Ilkka Niiniluoto surveys different kinds of realism in various areas of philosophy and then sets out his own critical realist philosophy of science.
This book examines the philosophical conception of abductive reasoning as developed by Charles S. Peirce, the founder of American pragmatism. It explores the historical and systematic connections of Peirce's original ideas and debates about their interpretations. Abduction is understood in a broad sense which covers the discovery and pursuit of hypotheses and inference to the best explanation. The analysis presents fresh insights into this notion of reasoning, which proceeds from effects to causes or from surprising observations to explanatory theories. The author outlines some logical and AI approaches to abduction and studies various kinds of inverse problems in astronomy, physics, medicine, biology, and the human sciences to provide examples of retroductions and abductions. The discussion also covers everyday examples, including the application of this notion in detective stories, one of Peirce's own favorite themes. The author uses Bayesian probabilities to argue that explanatory abduction is a method of confirmation, and he uses his own account of truth approximation to reformulate abduction as an inference which leads to the truthlikeness of its conclusion. This allows a powerful abductive defense of scientific realism. This up-to-date survey and defense of the Peircean view of abduction may well help researchers, students, and philosophers better understand the logic of truth-seeking.
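As a rough illustration of the Bayesian point mentioned above (a minimal sketch, not the book's own formal treatment), the following snippet shows why explanatory success confirms: if a hypothesis h makes an antecedently surprising piece of evidence e probable, conditionalizing on e raises the probability of h. All numerical values are hypothetical.

```python
# Hedged sketch: a Bayesian reading of explanatory abduction as confirmation.
# If hypothesis h renders surprising evidence e likely, then P(h|e) > P(h)
# whenever P(e|h) > P(e). The numbers below are illustrative assumptions only.

def posterior(prior_h: float, likelihood_e_given_h: float, prob_e: float) -> float:
    """Bayes' theorem: P(h|e) = P(e|h) * P(h) / P(e)."""
    return likelihood_e_given_h * prior_h / prob_e

prior_h = 0.1        # modest prior credence in the explanatory hypothesis
p_e_given_h = 0.9    # h makes the surprising observation e highly probable
p_e = 0.2            # e is antecedently surprising (low prior probability)

print(posterior(prior_h, p_e_given_h, p_e))  # 0.45 > 0.1, so e confirms h
```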
According to the foundationalist picture, shared by many rationalists and positivist empiricists, science makes cognitive progress by accumulating justified truths. Fallibilists, who point out that complete certainty cannot be achieved in empirical science, can still argue that even successions of false theories may progress toward the truth. This proposal was supported by Karl Popper with his notion of truthlikeness or verisimilitude. Popper's own technical definition failed, but the idea that scientific progress means increasing truthlikeness can be expressed by defining degrees of truthlikeness in terms of similarities between states of affairs. This paper defends the verisimilitude approach against Alexander Bird, who argues that the "semantic" definition is not sufficient to define progress and that the "epistemic" definition, referring to justification and knowledge, is more adequate. Here Bird ignores the crucial distinction between real progress and estimated progress, explicated by the difference between absolute degrees of truthlikeness and their evidence-relative expected values. Further, it is argued that Bird's idea of returning to the cumulative model of growth requires an implausible trick of transforming past false theories into true ones.
The modern history of verisimilitude can be divided into three periods. The first began in 1960, when Karl Popper proposed his qualitative definition of what it is for one theory to be more truthlike than another theory, and lasted until 1974, when David Miller and Pavel Tichý published their refutation of Popper's definition. The second period started immediately with the attempt to explicate truthlikeness by means of relations of similarity or resemblance between states of affairs (or their linguistic representations); the work within this similarity approach was summarized in the books of Graham Oddie [1986] and Ilkka Niiniluoto [1987]. During the subsequent third period, studies in verisimilitude have been actively continued, and interesting results and applications have been achieved, but not many dramatic novelties. While it is now obsolete to claim that truthlikeness with reasonable properties cannot be defined at all, there is still a lot of controversy about the best and least arbitrary approach to doing this.
Scientific realists use the "no miracle argument" to show that the empirical and pragmatic success of science is an indicator of the ability of scientific theories to give true or truthlike representations of unobservable reality. While antirealists define scientific progress in terms of empirical success or practical problem-solving, realists characterize progress by using some truth-related criteria. This paper defends the definition of scientific progress as increasing truthlikeness or verisimilitude. Antirealists have tried to rebut realism with the "pessimistic metainduction", but critical realists turn this argument into an optimistic view about progressive science.
Charles S. Peirce argued that, besides deduction and induction, there is a third mode of inference which he called "hypothesis" or "abduction." He characterized abduction as reasoning "from effect to cause," and as "the operation of adopting an explanatory hypothesis." Peirce's ideas about abduction, which are related also to historically earlier accounts of heuristic reasoning, have been seen as providing a logic of scientific discovery. Alternatively, abduction is interpreted as giving reasons for pursuing a hypothesis. Inference to the best explanation has also been regarded as an important mode of justification, in everyday life, in detective stories, and in science. In particular, scientific realism has been defended by an abductive no-miracle argument, while the critics of realism have attempted to show that this appeal to abduction is question-begging, circular, or incoherent. This paper approaches these issues by distinguishing weaker and stronger forms of abduction, and by showing how these types of inferences can be given Peircean and Bayesian probabilistic reconstructions.
The modern discussion on the concept of truthlikeness was started in 1960. In his influential Word and Object, W. V. O. Quine argued that Charles Peirce's definition of truth as the limit of inquiry is faulty for the reason that the notion 'nearer than' is only "defined for numbers and not for theories". In his contribution to the 1960 International Congress for Logic, Methodology, and Philosophy of Science at Stanford, Karl Popper defended the opposite view by defining a comparative notion of verisimilitude for theories. The concept of verisimilitude was originally introduced by the Ancient sceptics to moderate their radical thesis of the inaccessibility of truth. But soon verisimilitudo, indicating likeness to the truth, was confused with probabilitas, which expresses an opinionative attitude weaker than full certainty. The idea of truthlikeness fell into disrepute also as a result of the careless, often confused and metaphysically loaded way in which many philosophers used - and still use - such concepts as 'degree of truth', 'approximate truth', 'partial truth', and 'approach to the truth'. Popper's great achievement was his insight that the criticism against truthlikeness - by those who urge that it is meaningless to speak about 'closeness to truth' - is based more on prejudice than on argument.
Conceptual change and its connection to the development of new scientific theories has recently become an intensively discussed topic in the philosophical literature. Even if the inductive aspects related to conceptual change have already been discussed to some extent, there has so far existed no systematic treatment of inductive change due to conceptual enrichment. This is what we attempt to accomplish in this work, although most of our technical results are restricted to the framework of monadic languages. We extend Hintikka's system of inductive logic to apply to situations in which new concepts are introduced to the original language. By interpreting them as theoretical concepts, it is possible to discuss a number of currently debated philosophical and methodological problems which have previously escaped systematic and exact treatment. For instance, the role which scientific theories employing theoretical concepts may play within inductive inference can be studied within this framework. From the viewpoint of scientific realism, such a study gives outlines for a theory of what we call hypothetico-inductive inference. Some parts of this work which are based on Hintikka's system of inductive logic are fairly technical. However, no previous knowledge of this system is required; in general, acquaintance with the basic ideas of elementary logic and probability theory is sufficient. This work is part of a project, originated by Professors Jaakko Hintikka and Raimo Tuomela, concerning the role of theoretical concepts in science.
The notion of truthlikeness or verisimilitude has been a topic of intensive discussion ever since the definition proposed by Karl Popper was refuted in 1974. This paper gives an analysis of old and new debates about this notion. There is fairly broad agreement about the truthlikeness ordering of conjunctive theories, but the main rival approaches differ especially about false disjunctive theories. Continuing the debate between Niiniluoto's min-sum measure and Schurz's relevant consequence measure, the paper also gives a critical assessment of Oddie's new defense of the average measure and Kuipers' refined definition of truth approximation.
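To indicate the kind of comparison at issue, here is a simplified sketch (my illustration, not the paper's own formulation) of a min-sum style measure for a toy propositional setting: constituents are bit-vectors, distance is normalized Hamming distance, and the weights, normalization, and example theories are all assumptions.

```python
# Hedged sketch of a min-sum style truthlikeness measure in the similarity
# approach: a theory is identified with the set of constituents (bit-vectors)
# it allows; its distance from the true constituent combines the minimum
# distance with a normalized sum of distances; truthlikeness is 1 minus that
# distance. Weights, normalization, and the toy theories are assumptions.

def hamming(c1, c2):
    """Normalized Hamming distance between two constituents (bit-vectors)."""
    return sum(a != b for a, b in zip(c1, c2)) / len(c1)

def min_sum_distance(theory, truth, gamma=0.5, gamma_prime=0.5):
    """Weighted combination of the minimum and the normalized sum of distances."""
    distances = [hamming(c, truth) for c in theory]
    d_min = min(distances)
    d_sum = sum(distances) / 2 ** len(truth)   # crude normalization over all constituents
    return gamma * d_min + gamma_prime * d_sum

def truthlikeness(theory, truth):
    return 1 - min_sum_distance(theory, truth)

truth = (1, 1, 1)                                 # the true constituent
conjunctive = [(1, 1, 1), (1, 1, 0)]              # strong theory close to the truth
disjunctive = [(1, 1, 1), (0, 0, 0), (0, 1, 0)]   # weak, badly scattered disjunctive theory

print(truthlikeness(conjunctive, truth))   # about 0.98
print(truthlikeness(disjunctive, truth))   # about 0.90 -> lower, as expected
```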
The distinction between basic and applied research is notoriously vague, despite its frequent use in science studies and in science policy. In most cases it is based on such pragmatic factors as the knowledge and intentions of the investigator or the type of research institute. Sometimes the validity of the distinction is denied altogether. This paper suggests that there are two ways of distinguishing systematically between basic and applied research: (i) in terms of the utilities that define the aims of inquiry, and (ii) by reference to the structure of the relevant knowledge claims. An important type of applied research aims at results that are expressed by technical norms (in von Wright's sense): if you wish to achieve A, and you believe you are in a situation B, then you should do X. This conception of design sciences allows us to re-evaluate many issues in the history, philosophy, and ethics of science.
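The schema of a technical norm can be displayed with a small sketch (an illustration of the conditional form only, not of any result in the paper); the goal, situation, and action strings are hypothetical placeholders.

```python
# Hedged sketch of the form of a technical norm in von Wright's sense:
# "If you wish to achieve A, and you believe you are in situation B,
#  then you should do X." The goal/situation/action below are placeholders.

from dataclasses import dataclass
from typing import Optional

@dataclass
class TechnicalNorm:
    goal: str       # A: the end the agent wishes to achieve
    situation: str  # B: the situation the agent believes to obtain
    action: str     # X: the recommended means

    def recommend(self, wants: str, believes: str) -> Optional[str]:
        """Issue the conditional recommendation only when both antecedents hold."""
        if wants == self.goal and believes == self.situation:
            return f"You should {self.action}."
        return None

norm = TechnicalNorm(goal="make the hut habitable",
                     situation="the hut is cold",
                     action="heat the hut")
print(norm.recommend("make the hut habitable", "the hut is cold"))
```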
Belief revision (BR) and truthlikeness (TL) emerged independently as two research programmes in formal methodology in the 1970s. A natural way of connecting BR and TL is to ask under what conditions the revision of a belief system by new input information leads the system towards the truth. It turns out that, for the AGM model of belief revision, the only safe case is the expansion of true beliefs by true input, but this is not very interesting or realistic as a model of theory change in science. The new accounts of non-prioritized belief revision do not seem more promising in this respect, and the alternative BR account of updating by imaging leads to other problems. Still, positive results about increasing truthlikeness by belief revision may be sought by restricting attention to special kinds of theories. Another approach is to link truthlikeness to epistemic matters by an estimation function which calculates expected degrees of truthlikeness relative to evidence. Then we can study how the expected truthlikeness of a theory changes when probabilities are revised by conditionalization or imaging. Again, we can ask under what conditions such changes lead our best theories towards the truth.
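The estimation idea can be displayed schematically: relative to evidence, the expected truthlikeness of a theory T is the probability-weighted average of its truthlikeness values over the candidate "true" states, and revising the probabilities (e.g. by conditionalization) changes this estimate. The following is a toy sketch under assumed numbers, not the paper's own model.

```python
# Hedged sketch: expected truthlikeness of a theory relative to evidence,
# ver(T | e) = sum over candidate states C_i of P(C_i | e) * Tr(T, C_i).
# The truthlikeness values and probabilities below are assumed toy numbers.

def expected_truthlikeness(tr_values, probs):
    """Probability-weighted average of truthlikeness over candidate states."""
    return sum(p * tr for p, tr in zip(probs, tr_values))

# Truthlikeness of theory T if state C1, C2, or C3 were the true one (assumed):
tr_T = [0.9, 0.5, 0.2]

prior = [1/3, 1/3, 1/3]        # probabilities of the states before evidence
posterior = [0.7, 0.2, 0.1]    # probabilities after conditionalizing on evidence e

print(expected_truthlikeness(tr_T, prior))      # about 0.53
print(expected_truthlikeness(tr_T, posterior))  # 0.75 -> e raises the estimate
```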
In a seminar with the title “Deduction and Induction in the Sciences”, it is intriguing to ask the following questions: Is there a third type of inference besides deduction and induction? Does this third type of inference play a significant role within scientific inquiry? A positive answer to both of these questions was advocated by Charles S. Peirce throughout his career, even though his opinions changed in important ways during the fifty years between 1865 and 1914. Peirce called the third kind of inference “hypothesis”, “abduction”, or “retroduction”. In this paper, I shall follow Peirce’s steps in discussing abduction by analyzing its logical form, its role in science, and the grounds of its validity. We shall see that Peirce’s discussion is more insightful than many recent attempts to analyze abductive inference. Still, recently some progress has been made in the treatment of abduction within the Bayesian theory of epistemic probability and truth-approximation. The results of this work support the view of scientific realism: abduction, or inference to the best explanation, combined with empirical and experimental testing of scientific theories, is the best method of seeking informative truths in science.
From its inception in 1987, social epistemology has been divided into analytic and critical approaches (ASE and CSE), represented by Alvin I. Goldman and Steve Fuller, respectively. In this paper, the agendas and some basic ideas of ASE and CSE are compared and assessed, bringing into the discussion also other participants in the debates on the social aspects of scientific knowledge, among them Raimo Tuomela, Philip Kitcher, and Helen Longino. The six topics analyzed are individual and collective epistemic agents; the notion of scientific community; realism and constructivism; truth-seeking communities; epistemic and social values; and science, experts, and democracy.
The use of idealized scientific theories in explanations of empirical facts and regularities is problematic in two ways: such theories do not satisfy the condition that the explanans is true, and they may fail to entail the explanandum. An attempt to deal with the latter problem was proposed by Hempel and Popper with their notion of approximate explanation. A more systematic perspective on idealized explanations was developed with the method of idealization and concretization by the Poznań school in the 1970s. If idealizational laws are treated as counterfactual conditionals, they can be true or truthlike, and the concretizations of such laws may increase their degree of truthlikeness. By replacing Hempel's truth requirement with the condition that an explanatory theory is truthlike, one can distinguish several important types of approximate, corrective, and contrastive explanations by idealized theories. The conclusions have important consequences for the debates about scientific realism and anti-realism.
In 1958, to refute the argument known as the theoretician's dilemma, Hempel suggested that theoretical terms might be logically indispensable for the inductive systematization of observational statements. This thesis, in some form or another, has later been supported by Scheffler, Lehrer, and Tuomela, and opposed by Bohnert, Hooker, Stegmüller, and Cornman. In this paper, a critical survey of this discussion is given. Several different putative definitions of the crucial notion of inductive systematization achieved by a theory are discussed by reference to the properties of inductive inference. The consequences of the following differences between deductive and inductive inference are emphasized: the lack of simple transitivity properties (even in a modified sense) of inductive inference, and the failure of the inductive analogue of the converse of the Deduction Theorem. The main conclusions are: (i) Hempel's original thesis may very well be right, but his argument for it is unsatisfactory; (ii) theoretical terms can be logically indispensable for a non-Hempelian kind of inductive systematization, relative to both Craigian and Ramseyan elimination; (iii) Lehrer's attempt to prove the indispensability of theoretical terms for inductive-probabilistic systematization is, as a modification of the Hempelian kind of inductive-deterministic systematization, unsatisfactory; and (iv) there does not seem to be much hope of escaping conclusion (ii), if it is true, by extending the Craigian replacement programme along the lines suggested by Cornman.
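One of the emphasized differences can be made vivid with a small numerical illustration (mine, not the paper's): high conditional probability does not chain the way deductive entailment does. The finite probability space and the events below are arbitrary assumptions.

```python
# Hedged sketch: inductive (probabilistic) support lacks simple transitivity.
# Even if P(B|A) and P(C|B) are both high, P(C|A) can be zero.
# The finite equiprobable space and the events below are arbitrary assumptions.

from fractions import Fraction

A = set(range(1, 11))     # {1,...,10}
B = set(range(2, 102))    # {2,...,101}
C = set(range(11, 102))   # {11,...,101}

def cond_prob(X, Y):
    """P(X | Y) on a finite space with equiprobable points."""
    return Fraction(len(X & Y), len(Y))

print(cond_prob(B, A))   # 9/10   -> A strongly supports B
print(cond_prob(C, B))   # 91/100 -> B strongly supports C
print(cond_prob(C, A))   # 0      -> yet A gives no support at all to C
```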
This paper employs possible worlds semantics to develop a systematic framework for studying the syntax and the semantics of imagination sentences. Following Hintikka's treatment of propositional attitudes like knowledge and perception, the propositional construction “a imagines that p” is taken as the basic form to which other sentences (such as “a imagines b”, “a imagines an F”, “a imagines b as an F”) are reduced through quantifiers ranging over ‘world lines’, i.e., functions picking out individuals from the relevant possible worlds or scenes. This intensional analysis is compared and contrasted with Barwise and Perry's situation semantics. It is also suggested that the logic of imagination helps us to understand some peculiarities of fictional discourse. For example, acts of imagination can be directed towards fictional entities (e.g. Donald Duck, Anna Karenina) as well as real ones. Further, fictional texts, like novels, can be thought of as occurring within the scope of an imagination operator, relative to the author or the reader. The author of a fictional text T can be viewed as performing an illocutionary act of recommendation of the form: Let us imagine that T.
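The Hintikka-style truth condition for the basic construction can be sketched as follows (a minimal illustration, without the world lines the paper uses for the other constructions): "a imagines that p" holds at a world w just in case p holds in every world compatible with what a imagines at w. The worlds and the alternative relation below are made-up examples.

```python
# Hedged sketch of a Hintikka-style possible-worlds clause for imagination:
# "a imagines that p" is true at w  iff  p is true in every world that is an
# imagination-alternative for a at w. Worlds and relations below are made up.

# Imagination-alternatives: the worlds compatible with what agent a imagines at w.
imagination_alt = {
    ("a", "w0"): {"w1", "w2"},
}

# A proposition is modelled as the set of worlds where it is true.
p = {"w1", "w2"}   # true in both alternatives
q = {"w1"}         # true in only one of them

def imagines(agent, world, proposition):
    """Truth condition: the proposition holds in all imagination-alternatives."""
    return imagination_alt.get((agent, world), set()) <= proposition

print(imagines("a", "w0", p))  # True:  a imagines that p
print(imagines("a", "w0", q))  # False: a does not imagine that q
```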
A holistic account of the meaning of theoretical terms leads scientific realism into serious trouble. Alternative methods of reference fixing are needed by a realist who wishes to show how reference invariance is possible in spite of meaning variance. This paper argues that the similarity theory of truthlikeness and approximate truth, developed by logicians since the mid-1970s, helps to make precise the idea of charitable theoretical reference. Comparisons to the recent proposals by Kitcher and Psillos are given. This argument helps to undermine the sceptical meta-induction about theories, and thereby to re-evaluate Laudan's alleged confutation of scientific realism.
Following Herbert Simon’s idea of “the sciences of the artificial”, one may contrast descriptive sciences and design sciences: the former are concerned with “how things are”, while the latter tell us “how things ought to be in order to attain goals, and to function”. Typical results of design sciences are thus expressions about means-ends relations, or technical norms in G. H. von Wright’s sense. Theorizing and modeling are important methods of giving a value-free epistemic justification for such technical norms. The values of design sciences are not criteria for the acceptance of theories or models, but rather antecedents of conditional recommendations of actions. Design sciences are thus value-neutral and value-laden at the same time.
In a recent article [4], Fred I. Dretske has proposed a new analysis of natural laws. Dretske rejects the more or less standard view which says that laws are universal truths with a special function or status in science. As an alternative account, he suggests that laws are expressed by singular statements describing the relationship between universal properties and magnitudes: the statement 'It is a law that F's are G's' is to be analysed as 'F-ness ↦ G-ness'. I shall argue, however, that Dretske's reasons for rejecting the standard account are not conclusive. Moreover, his own proposal seems to be either inadequate or reducible to that version of the standard account which takes laws to be nomically necessary generalizations.
This paper studies the interplay between two notions which are important for the project of defending scientific realism: abduction and truthlikeness. The main focus is the generalization of abduction to cases where the conclusion states that the best theory is truthlike or approximately true. After reconstructing the recent proposals of Theo Kuipers within the framework of monadic predicate logic, I apply my own notion of truthlikeness. It turns out that a theory with higher truthlikeness does not always have greater empirical success than its less truthlike rivals. It is further shown that the notion of expected truthlikeness provides a fallible link from the approximate explanatory success of a theory to its truthlikeness. This treatment can also be applied in cases where even the best potential theory is an idealization that is known to be false.
The most elaborate recent accounts of truthlikeness (verisimilitude) apply this notion primarily to generalizations in first-order languages with qualitative predicates. This paper outlines a new approach to the definition of truthlikeness for quantitative statements, including singular statements (point estimation), interval statements (interval estimation), and quantitative laws. In the case of laws, the basic issue is reduced to the topological problem of measuring the distance between two real-valued functions. The solution of this problem makes it possible also to define the notion of approximate truth for quantitative laws.
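For the quantitative case, the basic topological idea can be illustrated with standard function metrics (the particular metrics below are my examples; the paper's own choice may differ): a sup-distance and a discretized L2-type distance between a candidate law and the "true" law over a common domain. The laws and the grid are illustrative assumptions.

```python
# Hedged sketch: measuring the distance between two real-valued functions,
# as needed for the truthlikeness of quantitative laws. The sup-metric and the
# L2-type metric are standard illustrations, not the paper's own definitions.

import math

def sup_distance(f, g, xs):
    """Maximum of |f(x) - g(x)| over a finite grid of sample points."""
    return max(abs(f(x) - g(x)) for x in xs)

def l2_distance(f, g, xs):
    """Discretized L2 distance over the same grid."""
    step = xs[1] - xs[0]
    return math.sqrt(sum((f(x) - g(x)) ** 2 * step for x in xs))

true_law = lambda x: 9.81 * x     # the "true" quantitative law (illustrative)
candidate = lambda x: 9.8 * x     # a slightly false candidate law

grid = [i / 100 for i in range(101)]             # sample points on [0, 1]
print(sup_distance(true_law, candidate, grid))   # 0.01, attained at x = 1
print(l2_distance(true_law, candidate, grid))    # small L2-type distance
```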
In a series of carefully argued and stimulating papers on realism, Uskali Mäki has pointed out that economic theories typically are unrealistic in two senses: by violating "the-whole-truth" and "nothing-but-the-truth" (Mäki 1989, 1992b, 1994b). He suggests that realism in economics can still be rescued by regarding theories as partially true descriptions of essences and as lawlike statements about tendencies. In this chapter, I defend realism by an alternative strategy: idealizational (or "isolational") statements are counterfactual conditionals (Niiniluoto 1986), and the concepts of truth and truthlikeness can be applied to them (Niiniluoto 1987). Further, false but sufficiently truthlike theories may be taken to refer to real entities in the world (Niiniluoto 1997, 1999).
This paper gives a critical evaluation of the philosophical presuppositions and implications of two current schools in the sociology of knowledge: the Strong Programme of Bloor and Barnes; and the Constructivism of Latour and Knorr-Cetina. Bloor's arguments for his externalist symmetry thesis (i.e., scientific beliefs must always be explained by social factors) are found to be incoherent or inconclusive. At best, they suggest a Weak Programme of the sociology of science: when theoretical preferences in a scientific community, SC, are first internally explained by appealing to the evidence, e, and the standards or values, V, accepted in SC, then a sociologist may sometimes step in to explain why e and V were accepted in SC. Latour's story about the social construction of facts in scientific laboratories is found to be misleading or incredible. The idea that scientific reality is an artifact turns out to have some interesting affinities with classical pragmatism, instrumentalism, phenomenology, and internal realism. However, the constructivist account of theoretical entities in terms of negotiation and social consensus is less plausible than the alternative realist story which explains consensus by the preexistence of mind-independent real entities. The author concludes that critical scientific realism, developed with the concept of truthlikeness, is compatible with the thesis that scientific beliefs or knowledge claims may be relative to various types of cognitive and practical interests. However, the realist denies, with good reasons, the stronger type of relativism which takes reality and truth to be relative to persons, groups, or social interests.
The twenty-eight essays in this Handbook, all by leading experts in the field, provide the most extensive treatment of various epistemological problems, ...
Popper distinguishes the problems of theoretical and pragmatic preference between rival theories, but he claims that there is a common non-inductive solution to both of them, viz. the "best-tested theory", or the theory with the highest degree of corroboration. He further suggests that the degrees of corroboration serve as indicators of verisimilitude. One may therefore raise the question whether the recent theory of verisimilitude gives a general non-inductive solution to the problem of theoretical preference. This paper argues that this is not the case: the theory of verisimilitude is applicable to this problem if and only if there is an independent solution to the problem of induction. Moreover, the solutions to the theoretical and pragmatic problems of preference coincide only in some special cases.
The concept of verisimilitude is an indispensable tool for a fallibilist and realist epistemology. Part of the argument for this thesis consists in the important applications of this notion within the history and philosophy of science. But perhaps the harder part is to convince a sceptical reader of the existence of this concept. A general programme for defining and estimating degrees of truthlikeness for various kinds of scientific statements is outlined in some detail. Ten years after Miller's and Tichý's refutation of Popper's attempted definition, this paper reviews recent developments and debates, and concludes that the treatment of the Popperian problem with Carnapian logical tools leads to a new synthesis which turns out to formally contain Levi's theory of epistemic utilities as a special case.