This monograph presents new ideas in nomic truth approximation. It features original and revised papers from a philosopher of science who has studied the concept for more than 35 years. Over the course of time, the author's initial ideas evolved. He discovered a way to generalize his first theory of nomic truth approximation, viz. by dropping an unnecessarily strong assumption. In particular, he initially believed he had to assume that theories are maximally specific, in the sense that they not only exclude certain conceptual possibilities but also claim that all non-excluded possibilities are in fact nomically possible. Now, he argues that the exclusion claim alone, or for that matter the inclusion claim alone, is sufficient to motivate the formal definition of being closer to the nomic truth. The papers collected here detail this generalized view of nomic truthlikeness or verisimilitude. Besides this, the book presents, in adapted form, the relation with several other topics, such as domain revision, aesthetic progress, abduction, inference to the best explanation, pragmatic aspects, probabilistic methods, belief revision and epistemological positions, notably constructive realism. Overall, the volume presents profound insight into nomic truth approximation, i.e. into determining how one theory can be closer to, or more similar to, the truth about what is nomically (e.g. physically, chemically, biologically) possible than another theory. Approaching this truth is the ultimate goal of theory-oriented empirical science. Theo Kuipers is the author of Studies in Inductive Probability and Rational Expectation, From Instrumentalism to Constructive Realism and Structures in Science. He is the volume-editor of the Handbook on General Philosophy of Science. In 2005 two volumes of Essays in Debate with Theo Kuipers appeared, entitled Confirmation, Empirical Progress, and Truth Approximation and Cognitive Structures in Scientific Inquiry.
INTRODUCTION When Karl Popper published his definition of closer-to-the-truth, this was an important intellectual event, but not a shocking one. ...
The naive structuralist definition of truthlikeness is an idealization in the sense that it assumes that all mistaken models of a theory are equally bad. The natural concretization is a refined definition based on an underlying notion of structurelikeness. In Section 1 the naive definition of truthlikeness of theories is presented, using a new conceptual justification in terms of instantial and explanatory mistakes.
In this article I give a naturalistic-cum-formal analysis of the relation between beauty, empirical success, and truth. The analysis is based on the one hand on a hypothetical variant of the so-called 'mere-exposure effect' which has been more or less established in experimental psychology regarding exposure-affect relationships in general and aesthetic appreciation in particular (Zajonc 1968; Temme 1983; Bornstein 1989; Ye 2000). On the other hand it is based on the formal theory of truthlikeness and truth approximation as presented in my From Instrumentalism to Constructive Realism (2000). The analysis supports the findings of James McAllister in his beautiful Beauty and Revolution in Science (1996), by explaining and justifying them. First, scientists are essentially right in regarding aesthetic criteria as useful for empirical progress and even for truth approximation, provided they conceive of them as less hard than empirical criteria. Second, the aesthetic criteria of the time, the 'aesthetic canon', may well be based on 'aesthetic induction' regarding nonempirical features of paradigms of successful theories which scientists have come to appreciate as beautiful. Third, aesthetic criteria can play a crucial, schismatic role in scientific revolutions. Since they may well be wrong, they may, in the hands of aesthetic conservatives, retard empirical progress and hence truth approximation, but this does not happen in the hands of aesthetically flexible, 'revolutionary' scientists. "We may find totally opposite things beautiful: a simple mathematical principle as well as a series of unrepeatable complex contingencies. It is a matter of psychology." (Stephen Jay Gould, translated passage from Kayzer 2000, 30)
In this paper, we address the problem of truth approximation through theory change, asking whether revising our theories by newly acquired data leads us closer to the truth about a given domain. More particularly, we focus on “nomic conjunctive theories”, i.e., theories expressed as conjunctions of logically independent statements concerning the physical or, more generally, nomic possibilities and impossibilities of the domain under inquiry. We define both a comparative and a quantitative notion of the verisimilitude of such theories, and identify suitable conditions concerning the (partial) correctness of acquired data, under which revising our theories by data leads us closer to “the nomic truth”, construed as the target of scientific inquiry. We conclude by indicating some further developments, generalizations, and open issues arising from our results.
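As a quick illustration of the quantitative idea, here is a minimal Python sketch of a contrast-style verisimilitude measure for conjunctive theories, in the spirit of the Cevolani–Crupi–Festa approach this line of work draws on; the function name and the normalization are illustrative assumptions, not the paper's exact definitions.

```python
# Hypothetical sketch: a conjunctive theory is a set of basic claims,
# each assigning a truth value to an atomic statement; its verisimilitude
# is the normalized balance of correct over incorrect basic claims.

def verisimilitude(theory: dict, truth: dict) -> float:
    """theory, truth: maps from atomic statement to claimed truth value.
    Returns (matches - mistakes) / size of the complete truth."""
    matches = sum(1 for a, v in theory.items() if truth.get(a) == v)
    mistakes = sum(1 for a, v in theory.items() if a in truth and truth[a] != v)
    return (matches - mistakes) / len(truth)

truth = {"p": True, "q": True, "r": False}   # the complete nomic truth
weak = {"p": True}                           # true but says little
bold = {"p": True, "q": True, "r": True}     # says a lot, one mistake
print(verisimilitude(weak, truth))           # 1/3
print(verisimilitude(bold, truth))           # (2 - 1)/3 = 1/3
```

On this toy measure the cautious true theory and the bold partly false theory tie, which is exactly the kind of trade-off between content and correctness that the comparative and quantitative definitions are designed to regiment.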
This paper primarily deals with the conceptual prospects for generalizing the aim of abduction from the standard one of explaining surprising or anomalous observations to that of empirical progress or even truth approximation. It turns out that the main abduction task then becomes the instrumentalist task of theory revision aiming at an empirically more successful theory, relative to the available data, but not necessarily compatible with them. The rest, that is, genuine empirical progress as well as observational, referential and theoretical truth approximation, is a matter of evaluation and selection, and possibly new generation tasks for further improvement. The paper concludes with a survey of possible points of departure, in AI and logic, for computational treatment of the instrumentalist task guided by the 'comparative evaluation matrix'.
In section I the notions of logical and inductive probability will be discussed as well as two explicanda, viz. degree of confirmation, the base for inductive probability, and degree of evidential support, Popper's favourite explicandum. In section II it will be argued that Popper's paradox of ideal evidence is no paradox at all; however, it will also be shown that Popper's way out has its own merits.
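A toy Bayesian illustration of why the "paradox" dissolves — not the paper's own argument, just a standard way to see that ideally balanced evidence can leave the first-order probability fixed while still doing epistemic work; the Beta-Binomial model is an assumption made for the example.

```python
# Coin with unknown bias; uniform Beta(1,1) prior over the bias.
# "Ideal evidence" (500 heads in 1000 tosses) leaves the probability
# of heads at 1/2 but drastically shrinks second-order uncertainty.

def beta_mean_var(a: float, b: float):
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    return mean, var

print(beta_mean_var(1, 1))        # prior: mean 0.5, variance ~0.083
print(beta_mean_var(501, 501))    # posterior: mean 0.5, variance ~0.00025
```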
The philosophy of science has lost its self-confidence, witness the lack of advanced textbooks in contrast to the abundance of elementary textbooks. Structures in Science is an advanced textbook that explicates, updates, accommodates, and integrates the best insights of logical empiricism and its main critics. This `neo-classical approach' aims at providing heuristic patterns for research. The book introduces four ideal types of research programs and reanimates the distinction between observational laws and proper theories. It explicates various patterns of explanation by subsumption and specification, as well as structures in reductive and other types of interlevel research. Its analysis of theory evaluation leads to new characterizations of confirmation, empirical progress, and pseudoscience. Partial analogies between progress in nomological research and progress in explicative and design research emerge. Finally, special chapters are devoted to design research programs, computational philosophy of science, the structuralist approach to theories, and research ethics.
... in philosophy, and therefore in metaphilosophy, cannot be based on rules that avoid spending time on pseudo-problems. Of course, this implies that, if one succeeds in demonstrating convincingly the pseudo-character of a problem by giving its 'solution', the time spent on it need not be seen as wasted. We conclude this section with a brief statement of the criteria for concept explication as they have been formulated in several places by Carnap, Hempel and Stegmüller. Hempel's account is still very adequate for a detailed introduction. The process of explication starts with the identification of one or more vague and, perhaps, ambiguous concepts, the so-called explicanda. Next, one tries to disentangle the ambiguities. This, however, need not be possible at once. Ultimately the explicanda are to be replaced by certain counterparts, the so-called explicata, which have to conform to four requirements. They have to be as precise as possible and as simple as possible. In addition, they have to be useful in the sense that they give rise to the formulation of theories and the solution of problems. The three requirements of preciseness, simplicity and usefulness have, of course, to be pursued in all concept formation.
Straightforward theory revision, taking into account as effectively as possible the established nomic possibilities and the empirical laws induced on their basis, is conducive to (unstratified) nomic truth approximation. The question this paper asks is: is it possible to reconstruct the relevant theory revision steps, on the basis of incoming evidence, in AGM terms? A positive answer will be given in two rounds, first for the case in which the initial theory is compatible with the established empirical laws, then for the case in which it is incompatible with at least one such law.
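For readers unfamiliar with the AGM idiom, here is a minimal semantic sketch of the two rounds, with theories and laws modeled as sets of possibilities; the bit-tuple worlds, the Hamming distance, and the `revise` function are illustrative choices, not the paper's reconstruction.

```python
# Toy AGM-style revision on sets of possibilities (Grove-sphere flavour):
# compatible case = plain expansion (intersection); incompatible case =
# keep the law-possibilities closest to the theory.

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

def revise(theory: set, law: set) -> set:
    if theory & law:
        return theory & law           # round 1: theory compatible with law
    # round 2: incompatible; fall back on the nearest law-possibilities
    dist = {w: min(hamming(w, v) for v in theory) for w in law}
    best = min(dist.values())
    return {w for w, d in dist.items() if d == best}

theory = {(0, 0), (0, 1)}
law = {(1, 1)}                        # incompatible with the theory
print(revise(theory, law))            # {(1, 1)}
```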
The qualitative theory of nomic truth approximation presented in Kuipers' earlier work, in which ‘the truth’ concerns the distinction between nomic, e.g. physical, possibilities and impossibilities, rests on a very restrictive assumption, viz. that theories always claim to characterize the boundary between nomic possibilities and impossibilities. Fully recognizing two different functions of theories, viz. excluding and representing, this paper drops this assumption by conceiving theories in development as tuples of postulates and models, where the postulates claim to exclude nomic impossibilities and the models claim to represent nomic possibilities. Revising theories then becomes a matter of adding or revising models and/or postulates in the light of increasing evidence, captured by a special kind of theories, viz. ‘data-theories’. Under the assumption that the data-theory is true, achieving empirical progress in this way provides good reasons for the abductive conclusion that truth approximation has been achieved as well. Here, the notions of truth approximation and empirical progress are formally direct generalizations of the earlier ones. However, truth approximation is now explicitly defined in terms of increasing truth-content and decreasing falsity-content of theories, whereas empirical progress is defined in terms of lasting increased accepted and decreased rejected content in the light of increasing evidence. These definitions are strongly inspired by a paper of Gustavo Cevolani, Vincenzo Crupi and Roberto Festa, viz. “Verisimilitude and belief change for conjunctive theories” (Erkenntnis 75(2):183–222, 2011).
In my From Instrumentalism to Constructive Realism I have shown how an instrumentalist account of empirical progress can be related to nomic truth approximation. However, it was assumed that a strong notion of nomic theories was needed for that analysis. In this paper it is shown, in terms of truth and falsity content, that the analysis already applies when, in line with scientific common sense, nomic theories are merely assumed to exclude certain conceptual possibilities as nomic possibilities.
While the special volumes of the series of Handbooks of the Philosophy of Science address topics relative to a specific discipline, this general volume deals ...
Can some evidence confirm a conjunction of two hypotheses more than it confirms either of the hypotheses separately? We show that it can, and moreover under conditions that are the same for ten different measures of confirmation. Further, we demonstrate that it is even possible for the conjunction of two disconfirmed hypotheses to be confirmed by the same evidence.
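A quick numerical check that both phenomena are possible; the joint distribution below is invented purely for illustration and is not taken from the paper.

```python
# P(H1, H2) and P(E | H1, H2) over the four hypothesis cells (made up):
cells = {
    (True, True):   (0.10, 0.50),
    (True, False):  (0.40, 0.10),
    (False, True):  (0.40, 0.10),
    (False, False): (0.10, 0.90),
}

p_e = sum(p * like for p, like in cells.values())

def prior(pred):
    return sum(p for (h1, h2), (p, _) in cells.items() if pred(h1, h2))

def posterior(pred):
    return sum(p * like for (h1, h2), (p, like) in cells.items()
               if pred(h1, h2)) / p_e

for name, pred in [("H1", lambda a, b: a),
                   ("H2", lambda a, b: b),
                   ("H1&H2", lambda a, b: a and b)]:
    print(name, prior(pred), "->", round(posterior(pred), 3))
# H1:    0.5 -> 0.409  (disconfirmed)
# H2:    0.5 -> 0.409  (disconfirmed)
# H1&H2: 0.1 -> 0.227  (confirmed)
```

So, in the qualitative probability-raising sense, E here confirms the conjunction while disconfirming each conjunct.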
An interesting consequence of the structuralist theory of truth approximation, as developed in my From Instrumentalism to Constructive Realism, henceforth ICR, concerns so-called ‘inference to the best explanation’. It will be argued that this popular rule among scientific realists can better be replaced by, various kinds of, ‘inference to the best theory’.
In earlier publications of the first author it was shown that intentional explanation of actions, functional explanation of biological traits and causal explanation of abnormal events share a common structure. They are called explanation by specification (of a goal, a biological function, an abnormal causal factor, respectively) as opposed to explanation by subsumption under a law. Explanation by specification is guided by a schematic train of thought, of which the argumentative steps not concerning questions were already shown to be logically valid (elementary) arguments. Independently, the second author developed a new, inferential approach to erotetic logic, the logic of questions. In this approach arguments resulting in questions, with declarative sentences and/or other questions as premises, are analyzed, and validity of such arguments is defined.
Introduction and Overview. Theo Kuipers (Faculty of Philosophy, University of Groningen, The Netherlands) and Gerhard Schurz (Department of Philosophy, University of Duesseldorf, Germany). Erkenntnis 75(2): 151–163. DOI 10.1007/s10670-011-9288-9.
Standard accounts of the micro-reduction of phenomenological to kinetic thermostatics, based on the postulate relating empirical absolute temperature to mean kinetic energy ū=(3/2)kT, face two problems. The standard postulate also allows 'reduction' in the other direction and it can be criticized from the point of view that reduction postulates need to be ontological identities. This paper presents a detailed account of the reduction, based on the postulate that thermal equilibrium is ontologically identical to having equal mean kinetic energy. In particular, it is shown that this postulate enables reduction only in the appropriate direction, but leaves room for 'evidence transport' in the other. Moreover, it makes possible the derivation (explanation) of the standard postulate, using the existential kinetic hypothesis and phenomenological laws with which it turns out to be laden.
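For concreteness, the bridge law quoted in the abstract, ū = (3/2)kT, evaluated numerically; the temperature value is just an example.

```python
# The standard reduction postulate: mean molecular kinetic energy
# u-bar equals (3/2) k T, with k the Boltzmann constant.

k = 1.380649e-23           # Boltzmann constant, J/K
T = 300.0                  # example: room temperature, K
u_bar = 1.5 * k * T        # mean kinetic energy per molecule
print(f"{u_bar:.3e} J")    # ~6.213e-21 J
```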
This paper supplies a structuralist reconstruction of the Modigliani-Miller theory and shows that the economic literature following their results reports on research with an implicit strategy to come "closer-to-the-truth" in the modern technical sense in philosophy of science.
Three related intuitions are explicated in this paper. The first is the idea that there must be some kind of probabilistic version of the HD-method, a ‘Hypothetico-Probabilistic (HP-) method’, in terms of something like probabilistic consequences, instead of deductive consequences. According to the second intuition, the comparative application of this method should also be functional for some probabilistic kind of empirical progress, and according to the third intuition this should be functional for something like probabilistic truth approximation. In all three cases, the guiding idea is to explicate these intuitions by explicating the crucial notions as appropriate ‘concretizations’ of their deductive analogs, being ‘idealizations’. It turns out that the comparative version of the proposed HP-method amounts to the likelihood comparison (LC-) method applied to the cumulated evidence. This method turns out to be not only functional for probabilistic empirical progress but also for probabilistic truth approximation. The latter is based on a probabilistic threshold theorem constituting for this reason the analog of the deductive success theorem.
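A bare-bones sketch of likelihood comparison on cumulated evidence; the i.i.d. Bernoulli setting and the two rival "theories" are invented for illustration and greatly simplify the HP-method itself.

```python
import math

def log_likelihood(p_one: float, evidence) -> float:
    # p_one: probability the theory assigns to outcome 1 on each trial
    return sum(math.log(p_one if e == 1 else 1 - p_one) for e in evidence)

evidence = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]   # cumulated data: 8 of 10 ones
x, y = 0.8, 0.5                             # two rival probabilistic theories
ratio = math.exp(log_likelihood(x, evidence) - log_likelihood(y, evidence))
print(round(ratio, 2))   # ~6.87 > 1: x more successful on this evidence
```

The threshold theorem mentioned in the abstract would then, roughly, license conclusions once such a ratio stays above a suitable bound as evidence accumulates.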
Assuming that the target of theory-oriented empirical science in general, and of nomic truth approximation in particular, is to characterize the boundary or demarcation between nomic possibilities and nomic impossibilities, I have presented, in my article entitled “Models, postulates, and generalized nomic truth approximation” (Synthese 193:3057–3077, 2016, doi 10.1007/s11229-015-0916-9), the ‘basic’ version of generalized nomic truth approximation, starting from ‘two-sided’ theories. Its main claim is that nomic truth approximation can perfectly be achieved by combining two prima facie opposing views on theories: the traditional view (theories are postulates that exclude certain possibilities from being realizable, enabling explanation and prediction) and the model view (theories are sets of models that claim to represent certain realizable possibilities). Nomic truth approximation, i.e. increasing truth-content and decreasing falsity-content, becomes in this way revising theories by revising their models and/or their postulates in the face of increasing evidence. The basic version of generalized nomic truth approximation is in many respects as simple as possible. Among other things, it does not take into account that one conceptual possibility may be more similar to another than a third one. However, for example, one theory may include a possibility that is more similar to a wrongly not included possibility than anything another theory can offer; similarly for wrongly not excluded possibilities. In this article it will be shown that such ‘refined’ considerations can be taken into account by adapted clauses based on a ternary similarity relation between possibilities. This allows again abductive conclusions about refined truth approximation if a theory is persistently more successful in the refined sense than another. It will also be indicated and illustrated that this refined approach enables a specification to the effect that refined truth approximation can be obtained by the method of idealization and subsequent concretization. Finally, the basic and the refined approach will be evaluated with regard to some general principles and objections that have been discussed in the literature.
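A compact set-theoretic rendering of the basic two-sided picture, offered as a plausible sketch of the dominance clauses rather than the article's exact definitions; the example universe and theories are made up.

```python
# Two-sided theory (M, P) over a universe U of conceptual possibilities:
# models M claim to represent nomic possibilities; postulates P claim
# that everything outside P is nomically impossible. T is the nomic truth.

def at_least_as_close(M2, P2, M1, P1, T, U):
    # model side: more correctly included, fewer wrongly included
    model_ok = (M1 & T) <= (M2 & T) and (M2 - T) <= (M1 - T)
    # postulate side: more correctly excluded, fewer wrongly excluded
    post_ok = ((U - P1) - T) <= ((U - P2) - T) and (T - P2) <= (T - P1)
    return model_ok and post_ok

U = set(range(8))
T = {0, 1, 2, 3}                       # the nomic possibilities
M1, P1 = {0, 4}, {0, 1, 2, 3, 4, 5}    # old theory
M2, P2 = {0, 1}, {0, 1, 2, 3, 4}       # revised theory
print(at_least_as_close(M2, P2, M1, P1, T, U))   # True
```

The refined version discussed in the abstract would replace these crisp subset clauses with comparisons weighted by a ternary similarity relation between possibilities.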
The structuralist theory of truth approximation essentially deals with truth approximation by theory revision for a fixed domain. However, variable domains can also be taken into account, where the main changes concern domain extensions and restrictions. In this paper I will present a coherent set of definitions of “more truthlikeness”, “empirical progress” and “truth approximation” due to a revision of the domain of intended applications. This set of definitions seems to be the natural counterpart of the basic definitions of similar notions as far as theory revision is concerned. The formal aspects of theory revision strongly suggest an analogy between truth approximation and design research, for example, drug research. Whereas a new drug may be better for a certain disease than an old one, a certain drug may be better for another disease than for the original target disease, a phenomenon which was nicely captured by the title of a study by Rein Vos [1991]: Drugs Looking for Diseases. Similarly, truth approximation may not only take the shape of theory revision but also of domain revision, naturally suggesting the phenomenon of “theories looking for domains”. However, whereas Vos documented his title with a number of examples, so far, apart from plausible cases of “truth accumulation by domain extension”, I have not found clear-cut empirical instantiations of the analogy, only (as such very interesting) non-empirical examples.
The main formal notion involved in qualitative truth approximation by the HD-method, viz. ‘more truthlike’, is shown to not only have, by its definition, an intuitively appealing ‘model foundation’, but also, at least partially, a conceptually plausible ‘consequence foundation’. Moreover, combining the relevant parts of both leads to a very appealing ‘dual foundation’, the more so since the relevant methodological notions, viz. ‘more successful’ and its ingredients provided by the HD-method, can be given a similar dual foundation. According to the resulting dual foundation of ‘naive truth approximation’, the HD-method provides successes (established true consequences) and counterexamples (established wrongly missing models) of theories. Such HD-results may support the tentative conclusion that one theory seems to remain more successful than another in the naive sense of having more successes and fewer counterexamples. If so, this provides good reasons for believing that the more successful theory is also more truthlike in the naive sense of having more correct models and more true consequences.
Can some evidence confirm a conjunction of two hypotheses more than it confirms either of the hypotheses separately? We show that it can, and moreover under conditions that are the same for nine different measures of confirmation. Further, we demonstrate that it is even possible for the conjunction of two disconfirmed hypotheses to be confirmed by the same evidence.
Design research programs attempt to bring together the properties of available materials and the demands derived from intended applications. The logic of problem states and state transitions in such programs, including assessment criteria and heuristic principles, is described in set-theoretic terms, starting with a naive model comprising an intended profile and the operational profile of a prototype. In a first concretization the useful distinction between structural and functional properties is built into the model. In two further concretizations the inclusion of potential applications is motivated and described for the case of drug research, as well as the inclusion of potential realizations for the case of complex products. Next, another line of concretization of the naive model, the incorporation of potentially relevant properties, is sketched. Then the partial analogy between product- and truth-approximation is indicated. We conclude with some remarks about the usefulness of our models for products reaching the market, in comparison to the so-called social construction of technology approach.
Table of Contents: Andrzej KLAWITER, Krzysztof ŁASTOWSKI: Introduction: Originality, Courage and Responsibility; List of Books by Leszek Nowak; Selected Bibliography of Leszek Nowak's Writings; Science and Idealization; Theo A.F. KUIPERS: On Two ...
In this paper it is shown that there is a natural way of dealing with analogy by similarity in inductive systems by extending intuitive ways of introduction of systems without analogy. This procedure leads to Carnap-like systems, with zero probability for contingent generalizations, satisfying a general principle of so-called virtual analogy. This new principle is different from, but compatible with, Carnap's principle. It will be shown that the latter principle is satisfied, and should only be satisfied, if the underlying distance function is such that all predicates have the same "predicate-environment". Finally, the claim that the new systems have the property of order indifference only with respect to the past will be defended.
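For orientation, the analogy-free baseline such systems extend is Carnap's λ-continuum; a minimal sketch follows, with the analogy-sensitive refinement only gestured at in a comment (the function and numbers are illustrative).

```python
# Carnap's lambda-continuum: probability that the next individual has
# predicate i, given counts n_i over k mutually exclusive predicates.

def carnap_next(counts, i, lam=2.0):
    k = len(counts)
    n = sum(counts)
    return (counts[i] + lam / k) / (n + lam)

counts = [6, 2, 0]                   # observed instances of predicates 0..2
print(carnap_next(counts, 0))        # (6 + 2/3) / 10 ~ 0.667
print(carnap_next(counts, 2))        # (0 + 2/3) / 10 ~ 0.067
# An analogy-by-similarity system would let predicate 2 also profit from
# observations of predicates in its "predicate-environment", via an
# underlying distance function between predicates.
```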
This collection of 17 articles offers an overview of the philosophical activities of a group of philosophers (who have been) working at the University of Groningen. The meta-methodological assumption which unifies the research of this group holds that there is a way to do philosophy which steers a middle course between abstract normative philosophy of science and descriptive social studies of science. On the one hand it is argued, with social studies of science, that philosophy should take notice of what scientists actually do. On the other hand, however, it is claimed that philosophy can and should aim to reveal cognitive patterns in the processes and products of scientific and common-sense knowledge. Since it is thought that those patterns can function as guidelines in new research and/or in research in other disciplines, philosophy can nevertheless hold on to the normative aim which is characteristic of 'classical' philosophy of science. Alongside this common assumption, there is a diversity of subjects. Some papers deal with general problems of science, knowledge, cognition and argumentation, others with topics relating to foundational problems of particular sciences. Therefore this volume is of interest to philosophers of science, to philosophers of knowledge and argumentation in general, and to philosophers of mind, as well as to scientists working in the physical and applied sciences, biology, psychology and economics who are interested in the foundations of their disciplines. After a foreword by Leszek Nowak and a general introduction by the editors, the book is divided into four parts, with special introductions. - I: Conceptual Analysis in Service of Various Research Programmes (Henk Zandvoort, Rein Vos, Rick Looijen, Gerben Stavenga, Renée Dalitz); - II: The Logic of the Evaluation of Arguments, Hypotheses, Default Rules, and Interesting Theorems (Erik Krabbe, Theo Kuipers, Alfons Keupink, Maarten Janssen/Yao-Hua Tan, Bert Hamminga); - III: Three Challenges to the Truth Approximation Programme (Sjoerd Zwart, Hinne Hettema/Theo Kuipers, Roberto Festa); - IV: Explicating Psychological Intuitions (Anne-Ruth Mackor, Jeanne Peijnenburg, Lex Guichard, Michel ter Hark). The Groningen research group was recently qualified, by an official international assessment committee, as one of the best philosophy research groups in the Netherlands.