In this paper, we investigate the problem of truth approximation via belief merging, i.e., we ask whether, and under what conditions, a group of inquirers merging their beliefs makes progress toward the truth about the underlying domain. We answer this question by proving some formal results on how belief merging operators perform with respect to the task of truth approximation, construed as increasing verisimilitude or truthlikeness. Our results shed new light on the issue of how rational (dis)agreement affects the inquirers’ quest for truth. In particular, they vindicate the intuition that scientific inquiry, and rational discussion in general, benefits from some heterogeneity in opinion and interaction among different viewpoints. The links between our approach and related analyses of truth tracking, judgment aggregation, and opinion dynamics are also highlighted.
The qualitative theory of nomic truth approximation presented by Kuipers, in which ‘the truth’ concerns the distinction between nomic, e.g. physical, possibilities and impossibilities, rests on a very restrictive assumption, viz. that theories always claim to characterize the boundary between nomic possibilities and impossibilities. Fully recognizing two different functions of theories, viz. excluding and representing, this paper drops this assumption by conceiving theories in development as tuples of postulates and models, where the postulates claim to exclude nomic impossibilities and the models claim to represent nomic possibilities. Revising theories then becomes a matter of adding or revising models and/or postulates in the light of increasing evidence, captured by a special kind of theories, viz. ‘data-theories’. Under the assumption that the data-theory is true, achieving empirical progress in this way provides good reasons for the abductive conclusion that truth approximation has been achieved as well. Here, the notions of truth approximation and empirical progress are direct formal generalizations of the earlier ones. However, truth approximation is now explicitly defined in terms of increasing truth-content and decreasing falsity-content of theories, whereas empirical progress is defined in terms of lastingly increased accepted and decreased rejected content in the light of increasing evidence. These definitions are strongly inspired by a paper of Gustavo Cevolani, Vincenzo Crupi and Roberto Festa, viz. “Verisimilitude and belief change for conjunctive theories” (2011, 183–222).
This paper primarily deals with the conceptual prospects for generalizing the aim of abduction from the standard one of explaining surprising or anomalous observations to that of empirical progress or even truth approximation. It turns out that the main abduction task then becomes the instrumentalist task of theory revision aiming at an empirically more successful theory, relative to the available data, but not necessarily compatible with them. The rest, that is, genuine empirical progress as well as observational, referential and theoretical truth approximation, is a matter of evaluation and selection, and possibly of new generation tasks for further improvement. The paper concludes with a survey of possible points of departure, in AI and logic, for computational treatment of the instrumentalist task guided by the ‘comparative evaluation matrix’.
We investigate the logical and conceptual connections between abductive reasoning construed as a process of belief change, on the one hand, and truth approximation, construed as increasing (estimated) verisimilitude, on the other. We introduce the notion of ‘(verisimilitude-guided) abductive belief change’ and discuss under what conditions abductively changing our theories or beliefs does lead them closer to the truth, and hence tracks truth approximation conceived as the main aim of inquiry. The consequences of our analysis for some recent discussions concerning belief revision aiming at truth approximation and inference to the best explanation are also highlighted.
Straightforward theory revision, taking into account as effectively as possible the established nomic possibilities and the empirical laws induced on their basis, is conducive to (unstratified) nomic truth approximation. The question this paper asks is: is it possible to reconstruct the relevant theory revision steps, on the basis of incoming evidence, in AGM terms? A positive answer will be given in two rounds: first for the case in which the initial theory is compatible with the established empirical laws, then for the case in which it is incompatible with at least one such law.
This paper highlights some connections between work on truth approximation and work in social epistemology, in particular work on peer disagreement. In some of the literature on truth approximation, questions have been addressed concerning the efficiency of research strategies for approximating the truth. So far, social aspects of research strategies have not received any attention in this context. Recent findings in the field of opinion dynamics suggest that this is a mistake. How scientists exchange and take into account information about each other’s beliefs may greatly influence the accuracy and speed with which the scientific community as a whole approximates the truth. On the other hand, social epistemologists concerned with peer disagreement have so far neglected the question of how practices of responding to disagreements with peers fare with respect to the goal of approximating the truth. Again, work on opinion dynamics shows that this may be a mistake, and that how we ought to respond to disagreements with our peers may depend on the specific purposes of our investigations.
In my From Instrumentalism to Constructive Realism I have shown how an instrumentalist account of empirical progress can be related to nomic truth approximation. However, it was assumed that a strong notion of nomic theories was needed for that analysis. In this paper it is shown, in terms of truth and falsity content, that the analysis already applies when, in line with scientific common sense, nomic theories are merely assumed to exclude certain conceptual possibilities as nomic possibilities.
Theo A. F. Kuipers, “The Threefold Evaluation of Theories”: a synopsis of From Instrumentalism to Constructive Realism. On Some Relations between Confirmation, Empirical Progress, and Truth Approximation (2000).
The structuralist theory of truth approximation essentially deals with truth approximation by theory revision for a fixed domain. However, variable domains can also be taken into account, where the main changes concern domain extensions and restrictions. In this paper I will present a coherent set of definitions of “more truthlikeness”, “empirical progress” and “truth approximation” due to a revision of the domain of intended applications. This set of definitions seems to be the natural counterpart of the basic definitions of similar notions as far as theory revision is concerned. The formal aspects of theory revision strongly suggest an analogy between truth approximation and design research, for example, drug research. Whereas a new drug may be better for a certain disease than an old one, a certain drug may be better for another disease than for the original target disease, a phenomenon which was nicely captured by the title of a study by Rein Vos: Drugs Looking for Diseases. Similarly, truth approximation may not only take the shape of theory revision but also of domain revision, naturally suggesting the phenomenon of “theories looking for domains”. However, whereas Vos documented his title with a number of examples, so far, apart from plausible cases of “truth accumulation by domain extension”, I have not found clear-cut empirical instantiations of the analogy, only some, as such very interesting, non-empirical examples.
Surprisingly enough, modified versions of the confirmation theory of Carnap and Hempel and the truth approximation theory of Popper turn out to be smoothly synthesizable. The glue between confirmation and truth approximation appears to be the instrumentalist methodology, rather than the falsificationist one. By evaluating theories separately and comparatively in terms of their successes and problems (hence even if they are already falsified), the instrumentalist methodology provides – both in theory and in practice – the straight route for short-term empirical progress in science in the spirit of Laudan. However, it is argued that such progress is also functional for all kinds of truth approximation: observational, referential, and theoretical. This sheds new light on the long-term dynamics of science and hence on the relation between the main epistemological positions, viz., instrumentalism (Toulmin, Laudan), constructive empiricism (van Fraassen), referential realism (Hacking and Cartwright), and theory realism of a non-essentialist nature (Popper), here called constructive realism. In From Instrumentalism to Constructive Realism (2000) the above story is presented in great detail. The present synopsis highlights the main ways of theory evaluation presented in that book, viz. evaluation in terms of confirmation (or falsification), empirical progress, and truth approximation.
Three related intuitions are explicated in this paper. The first is the idea that there must be some kind of probabilistic version of the HD-method, a ‘Hypothetico-Probabilistic (HP-) method’, in terms of something like probabilistic consequences, instead of deductive consequences. According to the second intuition, the comparative application of this method should also be functional for some probabilistic kind of empirical progress, and according to the third intuition this should be functional for something like probabilistic truth approximation. In all three cases, the guiding idea is to explicate these intuitions by explicating the crucial notions as appropriate ‘concretizations’ of their deductive analogs, being ‘idealizations’. It turns out that the comparative version of the proposed HP-method amounts to the likelihood comparison (LC-) method applied to the cumulated evidence. This method turns out to be not only functional for probabilistic empirical progress but also for probabilistic truth approximation. The latter is based on a probabilistic threshold theorem, constituting for this reason the analog of the deductive success theorem.
In this paper it is argued that the theory of truth approximation should be pursued in the framework of some kind of geometry of logic. More specifically it is shown that the theory of interval structures provides a general framework for dealing with matters of truth approximation. The qualitative and the quantitative accounts of truthlikeness turn out to be special cases of the interval account. This suggests that there is no principled gap between the qualitative and quantitative approaches. Rather, there is a connected spectrum of ways of measuring truthlikeness depending on the specifics of the context in which it takes place.
The main formal notion involved in qualitative truth approximation by the HD-method, viz. ‘more truthlike’, is shown to have not only, by its definition, an intuitively appealing ‘model foundation’, but also, at least partially, a conceptually plausible ‘consequence foundation’. Moreover, combining the relevant parts of both leads to a very appealing ‘dual foundation’, the more so since the relevant methodological notions, viz. ‘more successful’ and its ingredients provided by the HD-method, can be given a similar dual foundation. According to the resulting dual foundation of ‘naive truth approximation’, the HD-method provides successes (established true consequences) and counterexamples (established wrongly missing models) of theories. Such HD-results may support the tentative conclusion that one theory seems to remain more successful than another in the naive sense of having more successes and fewer counterexamples. If so, this provides good reasons for believing that the more successful theory is also more truthlike in the naive sense of having more correct models and more true consequences.
I sketch the most important epistemological positions in the instrumentalism-realism debate, viz., instrumentalism, constructive empiricism, referential realism, and theory realism. I order them according to their answers to a number of successive leading questions, where every next question presupposes an affirmative answer to the foregoing one. I include the answers to questions concerning truth, as well as the most plausible answers to questions concerning truth approximation. Restricting my survey to the natural sciences and hence to the natural world, I indicate the implications of the results of the study of empirical progress and truth approximation for the way these epistemological positions are related. I conclude that there are good reasons for the instrumentalist to become a constructive empiricist; that, in order to give deeper explanations of success differences, the constructive empiricist is forced to become a referential realist; and that there are good reasons for the referential realist to become a theory realist of a non-essentialist nature, here called a constructive realist.
Assuming that the target of theory oriented empirical science in general and of nomic truth approximation in particular is to characterize the boundary or demarcation between nomic possibilities and nomic impossibilities, I have presented, in my article entitled “Models, postulates, and generalized nomic truth approximation”, the ‘basic’ version of generalized nomic truth approximation, starting from ‘two-sided’ theories. Its main claim is that nomic truth approximation can perfectly well be achieved by combining two prima facie opposing views on theories: the traditional view, on which theories are postulates that exclude certain possibilities from being realizable, enabling explanation and prediction, and the model view, on which theories are sets of models that claim to represent certain realizable possibilities. Nomic truth approximation, i.e. increasing truth-content and decreasing falsity-content, becomes in this way a matter of revising theories by revising their models and/or their postulates in the face of increasing evidence. The basic version of generalized nomic truth approximation is in many respects as simple as possible. Among other things, it does not take into account that one conceptual possibility may be more similar to another than a third one. However, for example, one theory may include a possibility that is more similar to a wrongly not included possibility than anything another theory can offer; similarly for wrongly not excluded possibilities. In this article it will be shown that such ‘refined’ considerations can be taken into account by adapted clauses based on a ternary similarity relation between possibilities. This again allows abductive conclusions about refined truth approximation if a theory is persistently more successful in the refined sense than another. It will also be indicated and illustrated that this refined approach enables a specification to the effect that refined truth approximation can be obtained by the method of idealization and subsequent concretization.
Finally, the basic and the refined approach will be evaluated with regard to some general principles and objections that have been discussed in the literature.
This book is the first of two volumes devoted to the work of Theo Kuipers, a leading Dutch philosopher of science. Philosophers and scientists from all over the world, thirty-seven in all, comment on Kuipers' philosophy, and each of their commentaries is followed by a reply from Kuipers. The present volume focuses on Kuipers' views on confirmation, empirical progress, and truth approximation, as laid down in his From Instrumentalism to Constructive Realism. In this book, Kuipers offered a synthesis of Carnap's and Hempel's confirmation theory on the one hand, and Popper's theory of truth approximation on the other. The key element of this synthesis is a sophisticated methodology, which enables the evaluation of theories in terms of their problems and successes, and which also fits well with the claim that one theory is closer to the truth than another. Ilkka Niiniluoto, Patrick Maher, John Welch, Gerhard Schurz, Igor Douven, Bert Hamminga, David Miller, Johan van Benthem, Sjoerd Zwart, Thomas Mormann, Jesús Zamora Bonilla, Isabella Burger & Johannes Heidema, Joke Meheus, Hans Mooij, and Diderik Batens comment on these ideas of Kuipers, and many present their own account. The present book also contains a synopsis of From Instrumentalism to Constructive Realism. It can be read independently of the second volume of Essays in Debate with Theo Kuipers, which is devoted to Kuipers' Structures in Science.
This paper supplies a structuralist reconstruction of the Modigliani-Miller theory and shows that the economic literature following their results reports on research with an implicit strategy to come "closer-to-the-truth" in the modern technical sense in philosophy of science.
The naive structuralist definition of truthlikeness is an idealization in the sense that it assumes that all mistaken models of a theory are equally bad. The natural concretization is a refined definition based on an underlying notion of structurelikeness. In Section 1 the naive definition of truthlikeness of theories is presented, using a new conceptual justification in terms of instantial and explanatory mistakes.
In this paper we provide a compact presentation of the verisimilitudinarian approach to scientific progress (VS, for short) and defend it against the sustained attack recently mounted by Alexander Bird (2007). Advocated by such authors as Ilkka Niiniluoto and Theo Kuipers, VS is the view that progress can be explained in terms of the increasing verisimilitude (or, equivalently, truthlikeness, or approximation to the truth) of scientific theories. According to Bird, VS overlooks the central issue of the appropriate grounding of scientific beliefs in the evidence, and it is therefore unable (a) to reconstruct in a satisfactory way some hypothetical cases of scientific progress, and (b) to provide an explanation of the aversion to falsity that characterizes scientific practice. We rebut both of these criticisms and argue that they reveal a misunderstanding of some key concepts underlying VS.
In a recent paper in this journal, entitled ‘Scientific Progress: Why Getting Closer to Truth is Not Enough’, Moti Mizrahi argues that the view of progress as approximation to the truth or increasing verisimilitude is plainly false. The key premise of his argument is that on such a view of progress, in order to get closer to the truth one only needs to arbitrarily add a true disjunct to a hypothesis or theory. Since quite clearly scientific progress is not a matter of adding true disjuncts to theories, the argument goes, the view of progress as approximation to the truth is untenable. We show that the key premise of Mizrahi’s argument is false: according to verisimilitude-based accounts of progress, adding arbitrary true disjuncts to existing theories is just not enough to get closer to the truth.
There are several conceptions of truth, such as the classical correspondence conception, the coherence conception and the pragmatic conception. The classical correspondence conception, or Aristotelian conception, received a mathematical treatment in the hands of Tarski (cf. Tarski), which was the starting point of great progress in logic and in mathematics. In effect, Tarski's semantic ideas, especially his semantic characterization of truth, have exerted a major influence on various disciplines besides logic and mathematics; for instance, linguistics, the philosophy of science, and the theory of knowledge. The importance of the Tarskian investigations derives, among other things, from the fact that they constitute a mathematical, formal mark to serve as a reference for the philosophical (informal) conceptions of truth. Today the philosopher knows that the classical conception can be developed and that it is free from paradoxes and other difficulties, if certain precautions are taken. We believe it is not an exaggeration to assert that Tarski's theory should be considered one of the greatest accomplishments of logic and mathematics of our time, an accomplishment which is also of extraordinary relevance to philosophy, as we have already remarked. In this paper we show that the pragmatic conception of truth, at least in one of its possible interpretations, also has a mathematical formulation, similar in spirit to that given by Tarski to the classical correspondence conception.
We provide a 'verisimilitudinarian' analysis of the well-known Linda paradox or conjunction fallacy, i.e., the fact that most people judge the conjunctive statement "Linda is a bank teller and is active in the feminist movement" (B & F) as more probable than the isolated statement "Linda is a bank teller" (B), contrary to an uncontroversial principle of probability theory. The basic idea is that experimental participants may judge B & F a better hypothesis about Linda as compared to B because they evaluate B & F as more verisimilar than B. In fact, the hypothesis "feminist bank teller", while less likely to be true than "bank teller", may well be a better approximation to the truth about Linda.
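The "uncontroversial principle of probability theory" at stake here is the conjunction rule: for any distribution, the conjunction B & F can never be more probable than B alone, since the event B & F is a subset of the event B. A minimal numerical sketch (the joint probabilities below are purely illustrative, not from the paper):

```python
# Conjunction rule behind the Linda paradox: P(B and F) <= P(B),
# because the event "B and F" is contained in the event "B".
# The joint distribution below is hypothetical, chosen only for illustration.
joint = {
    ("B", "F"): 0.05,        # bank teller and feminist
    ("B", "not-F"): 0.10,    # bank teller, not feminist
    ("not-B", "F"): 0.45,    # feminist, not bank teller
    ("not-B", "not-F"): 0.40,
}

p_b = joint[("B", "F")] + joint[("B", "not-F")]  # marginal P(B)
p_b_and_f = joint[("B", "F")]                    # P(B and F)

# The conjunction can never exceed either conjunct, whatever the numbers.
assert p_b_and_f <= p_b
```

Whatever values the four cells take, the inequality holds; the paper's point is that participants' judgments may track verisimilitude rather than this probabilistic ordering.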
Several authors argue that truth cannot be science’s sole epistemic goal, for it would fail to do justice to several scientific practices that advance understanding. I challenge these arguments, but only after making a small concession: science’s sole epistemic goal is not truth as such; rather, its goal is finding true answers to relevant questions. Using examples from the natural and social sciences, I then show that scientific understanding’s epistemically valuable features are either true answers to relevant questions or a means to them.
In this paper, three theories of progress and the aim of science are discussed: the theory of progress as increasing explanatory power, advocated by Popper in The logic of scientific discovery; the theory of progress as approximation to the truth, introduced by Popper in Conjectures and refutations; and the theory of progress as a steady increase of competing alternatives, which Feyerabend put forward in the essay “Reply to criticism. Comments on Smart, Sellars and Putnam” and defended as late as the last edition of Against method. It is argued that, contrary to what Feyerabend scholars have predominantly assumed, Feyerabend's changing attitude towards falsificationism—which he often advocated at the beginning of his career, and vociferously attacked in the 1970s and 1980s—must be explained by taking into account not only Feyerabend's very peculiar view of the aim of science, but also Popper's changing account of progress.
This discussion note aims to contribute to the ongoing debate over the nature of scientific progress. I argue against the semantic view of scientific progress, according to which scientific progress consists in approximation to truth or increasing verisimilitude. If the semantic view of scientific progress were correct, then scientists would make scientific progress simply by arbitrarily adding true disjuncts to their hypotheses or theories. Given that it is not the case that scientists could make scientific progress simply by arbitrarily adding true disjuncts to their hypotheses or theories, it follows that the semantic view of scientific progress is incorrect.
Psychological research on people’s understanding of natural language connectives has traditionally used truth table tasks, in which participants evaluate the truth or falsity of a compound sentence given the truth or falsity of its components in the framework of propositional logic. One perplexing result concerned the indicative conditional if A then C, which was often evaluated as true when A and C are true, false when A is true and C is false, but “irrelevant” (devoid of value) when A is false (whatever the value of C). This was called the “psychological defective table of the conditional.” Here we show that, far from being anomalous, the “defective” table pattern reveals a coherent semantics for the basic connectives of natural language in a trivalent framework. This was done by establishing participants’ truth tables for negation, conjunction, disjunction, conditional, and biconditional, when they were presented with statements that could be certainly true, certainly false, or neither. We review systems of three-valued tables from logic, linguistics, foundations of quantum mechanics, philosophical logic, and artificial intelligence, to see whether one of these systems adequately describes people’s interpretations of natural language connectives. We find that de Finetti’s (1936/1995) three-valued system is the best approximation to participants’ truth tables.
In this article I give a naturalistic-cum-formal analysis of the relation between beauty, empirical success, and truth. The analysis is based on the one hand on a hypothetical variant of the so-called ‘mere-exposure effect’ which has been more or less established in experimental psychology regarding exposure-affect relationships in general and aesthetic appreciation in particular (Zajonc 1968; Temme 1983; Bornstein 1989; Ye 2000). On the other hand it is based on the formal theory of truthlikeness and truth approximation as presented in my From Instrumentalism to Constructive Realism (2000). The analysis supports the findings of James McAllister in his beautiful Beauty and Revolution in Science (1996), by explaining and justifying them. First, scientists are essentially right in regarding aesthetic criteria useful for empirical progress and even for truth approximation, provided they conceive of them as less hard than empirical criteria. Second, the aesthetic criteria of the time, the ‘aesthetic canon’, may well be based on ‘aesthetic induction’ regarding nonempirical features of paradigms of successful theories which scientists have come to appreciate as beautiful. Third, aesthetic criteria can play a crucial, schismatic role in scientific revolutions. Since they may well be wrong, they may, in the hands of aesthetic conservatives, retard empirical progress and hence truth approximation, but this does not happen in the hands of aesthetically flexible, ‘revolutionary’ scientists. We may find totally opposite things beautiful: a simple mathematical principle as well as a series of unrepeatable complex contingencies. It is a matter of psychology. (Stephen Jay Gould, translated passage from Kayzer 2000, 30.)
In a seminar with the title “Deduction and Induction in the Sciences”, it is intriguing to ask the following questions: Is there a third type of inference besides deduction and induction? Does this third type of inference play a significant role within scientific inquiry? A positive answer to both of these questions was advocated by Charles S. Peirce throughout his career, even though his opinions changed in important ways during the fifty years between 1865 and 1914. Peirce called the third kind of inference “hypothesis”, “abduction”, or “retroduction”. In this paper, I shall follow Peirce’s steps in discussing abduction by analyzing its logical form, its role in science, and the grounds of its validity. We shall see that Peirce’s discussion is more insightful than many recent attempts to analyze abductive inference. Still, recently some progress has been made in the treatment of abduction within the Bayesian theory of epistemic probability and truth approximation. The results of this work support the view of scientific realism: abduction or inference to the best explanation, combined with empirical and experimental testing of scientific theories, is the best method of seeking informative truths in science.
Calling into service the theory of truth approximation of his (1997) and (2000), Kuipers defends the view that "beauty can be a road to the truth" and endorses the general conclusions of McAllister (1996) that aesthetic criteria reasonably play a role in theory selection in science. My comments pertain first to the general adequacy of Kuipers's theory of truth approximation; secondly to its methodological aspects; thirdly to the etiolated role that aesthetic factors turn out to play in his account; and fourthly to the question before us, with a remark on McAllister's doctrine that scientific revolutions are characterized above all by novelty of aesthetic judgement.