In the first part of this paper we investigate how scientific theories can be represented by frames. Different kinds of scientific theories can be distinguished in terms of the systematic power of their frames. In the second part we outline the central questions and goals of our research project. In the third and final part of this paper we show that frame-representation is a useful tool in the comparison of the theories of phlogiston and oxygen, despite those theories being traditionally conceived as incommensurable. The frame-theoretic representation reveals common attributes, values and ultimately structural correspondence relations between the two theories. In our view this outcome lends credence to a structural realist view of science.
We investigate a lattice of conditional logics described by a Kripke-type semantics suggested by Chellas and Segerberg – Chellas–Segerberg (CS) semantics – plus 30 further principles. We (i) present a non-trivial frame-based completeness result, (ii) a translation procedure which yields corresponding trivial frame conditions for arbitrary formula schemata, and (iii) non-trivial frame conditions in CS semantics which correspond to the 30 principles.
In this paper we discuss the new Tweety puzzle. The original Tweety puzzle was addressed by approaches in non-monotonic logic, which aim to adequately represent the Tweety case, namely that Tweety is a penguin and, thus, an exceptional bird, which cannot fly, although in general birds can fly. The new Tweety puzzle is intended as a challenge for probabilistic theories of epistemic states. In the first part of the paper we argue against monistic Bayesians, who assume that epistemic states can at any given time be adequately described by a single subjective probability function. We show that monistic Bayesians cannot provide an adequate solution to the new Tweety puzzle, because this requires one to refer to a frequency-based probability function. We conclude that monistic Bayesianism cannot be a fully adequate theory of epistemic states. In the second part we describe an empirical study, which provides support for the thesis that monistic Bayesianism is also inadequate as a descriptive theory of cognitive states. In the final part of the paper we criticize Bayesian approaches in cognitive science, insofar as their monistic tendency cannot adequately address the new Tweety puzzle. We, further, argue against monistic Bayesianism in cognitive science by means of a case study. In this case study we show that Oaksford and Chater’s (2007, 2008) model of conditional inference—contrary to the authors’ theoretical position—has to refer also to a frequency-based probability function.
For a better understanding of causality. Essay review, Metascience (2012), pp. 1–6, DOI 10.1007/s11016-012-9648-3. Alexander Gebharter and Gerhard Schurz, Department of Philosophy, Heinrich-Heine-University Düsseldorf, Universitätsstraße 1, 40225 Düsseldorf, Germany.
Indicators of the reliability of informants are essential for social learning in a society that is initially dominated by ignorance or superstition. Such reliability indicators should be based on meta-induction over records of truth-success. This is the major claim of this paper, and it is supported in two steps. (1) One needs a non-circular justification of the method of meta-induction, as compared to other (non-inductive) learning methods. An approach to this problem (a variant of Hume's problem) has been developed in earlier papers and is reported in section 2. It is based on the predictive optimality of meta-inductive learning, under the assumption that objective success records are globally available. (2) The rest of the paper develops an extension of this approach, so-called local meta-induction. Here individuals can access only success records of individuals in their immediate epistemic neighborhood. It is shown that local meta-inductive learning can spread reliable information over the entire population, and has clear advantages compared to success-independent social learning methods such as peer-imitation and authority-imitation.
Systems of logico-probabilistic (LP) reasoning characterize inference from conditional assertions that express high conditional probabilities. In this paper we investigate four prominent LP systems, the systems O, P, Z, and QC. These systems differ in the number of inferences they licence (O ⊂ P ⊂ Z ⊂ QC). LP systems that license more inferences enjoy the possible reward of deriving more true and informative conclusions, but with this possible reward comes the risk of drawing more false or uninformative conclusions. In the first part of the paper, we present the four systems and extend each of them by theorems that allow one to compute almost-tight lower-probability-bounds for the conclusion of an inference, given lower-probability-bounds for its premises. In the second part of the paper, we investigate by means of computer simulations which of the four systems provides the best balance of reward versus risk. Our results suggest that system Z offers the best balance.
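The flavor of such bound-propagation theorems can be illustrated with a standard result from probability logic: if P(B|A) ≥ b1 and P(C|A) ≥ b2, then P(B∧C|A) ≥ max(0, b1 + b2 − 1). A minimal Python sketch follows; the function names are ours, and this is only the simplest such bound, not the paper's almost-tight theorems:

```python
from functools import reduce

def and_rule_lower_bound(b1: float, b2: float) -> float:
    """Lower bound for P(B and C | A), given P(B|A) >= b1 and P(C|A) >= b2.
    In the worst case the uncertainties (1 - b1) and (1 - b2) add up."""
    return max(0.0, b1 + b2 - 1.0)

def chain_bounds(premise_bounds):
    """Propagate lower bounds through repeated conjunction steps:
    each step combines the running bound with the next premise's bound."""
    return reduce(and_rule_lower_bound, premise_bounds)
```

For instance, two premises each holding with conditional probability at least 0.99 guarantee their conjunction only with probability at least 0.98, and the guarantee degrades further with each additional premise, which is why the tightness of such bounds matters for comparing LP systems.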
Meta-induction, in its various forms, is an imitative prediction method, where the prediction methods and the predictions of other agents are imitated to the extent that those methods or agents have proven successful in the past. In past work, Schurz demonstrated the optimality of meta-induction as a method for predicting unknown events and quantities. However, much recent discussion, along with formal and empirical work, on the Wisdom of Crowds has extolled the virtue of diverse and independent judgment as essential to maintenance of 'wise crowds'. This suggests that meta-inductive prediction methods could undermine the wisdom of the crowd inasmuch as these methods recommend that agents imitate the predictions of other agents. In this article, we evaluate meta-inductive methods with a focus on the impact on a group's performance that may result from including meta-inductivists among its members. In addition to considering cases of global accessibility (i.e., cases where the judgments of all members of the group are available to all of the group's members), we consider cases where agents only have access to the judgments of other agents within their own local neighborhoods.
Laws of nature take center stage in philosophy of science. Laws are usually believed to stand in a tight conceptual relation to many important key concepts such as causation, explanation, confirmation, determinism, counterfactuals etc. Traditionally, philosophers of science have focused on physical laws, which were taken to be at least true, universal statements that support counterfactual claims. But, although this claim about laws might be true with respect to physics, laws in the special sciences (such as biology, psychology, economics etc.) appear to have—maybe not surprisingly—different features than the laws of physics. Special science laws—for instance, the economic law “Under the condition of perfect competition, an increase of demand of a commodity leads to an increase of price, given that the quantity of the supplied commodity remains constant” and, in biology, Mendel's Laws—are usually taken to “have exceptions”, to be “non-universal” or “to be ceteris paribus laws”. How and whether the laws of physics and the laws of the special sciences differ is one of the crucial questions motivating the debate on ceteris paribus laws. Another major, controversial question concerns the determination of the precise meaning of “ceteris paribus”. Philosophers have attempted to explicate the meaning of ceteris paribus clauses in different ways. The question of meaning is connected to the problem of empirical content, i.e., the question whether ceteris paribus laws have non-trivial and empirically testable content. Since many philosophers have argued that ceteris paribus laws lack empirically testable content, this problem constitutes a major challenge to a theory of ceteris paribus laws.
Introduction and Overview. Introduction, Erkenntnis 75(2) (2011), pp. 151–163, DOI 10.1007/s10670-011-9288-9. Theo Kuipers, Faculty of Philosophy, University of Groningen, The Netherlands; Gerhard Schurz, Department of Philosophy, University of Duesseldorf, Germany.
This paper utilizes a logical correspondence theorem (which has been proved elsewhere) for the justification of weak conceptions of scientific realism and convergence to truth which do not presuppose Putnam's no-miracles-argument (NMA). After presenting arguments against the reliability of the unrestricted NMA in Sect. 1, the correspondence theorem is explained in Sect. 2. In Sect. 3, historical illustrations of the correspondence theorem are given, and its ontological consequences are worked out. Based on the transitivity of the concept of correspondence, a correspondence-based notion of convergence to truth is developed in Sect. 4. In the final Sect. 5 it is argued that the correspondence theorem together with the assumption of 'minimal realism' yields a justification of a weak version of scientific realism, which is then compared to metaphysical realism and to instrumentalism.
This paper elaborates on the following correspondence theorem (which has been defended and formally proved elsewhere): if theory T has been empirically successful in a domain of applications A, but was superseded later on by a different theory T* which was likewise successful in A, then under natural conditions T contains theoretical expressions which were responsible for T’s success and correspond (in A) to certain theoretical expressions of T*. I illustrate this theorem with the phlogiston versus oxygen theories of combustion, and the classical versus relativistic theories of mass. The ontological consequences of the theorem are worked out in terms of indirect reference and partial truth. The final section explains how the correspondence theorem may justify a weak version of scientific realism without presupposing the no-miracles argument.
Although I agree with Elqayam & Evans' (E&E's) criticisms of is-ought and ought-is fallacies, I criticize their rejection of normativism on two grounds: (1) Contrary to E&E's assumption, not every normative system of reasoning consists of formal rules. (2) E&E assume that norms of reasoning are grounded on intuition or authority, whereas in contemporary epistemology they have to be justified, primarily by their truth-conduciveness.
The expansion or revision of false theories by true evidence does not always increase their verisimilitude. After a comparison of different notions of verisimilitude, the relation between verisimilitude and belief expansion or revision is investigated within the framework of the relevant element account. We identify interesting conditions under which both the expansion and the revision of theories by true evidence are guaranteed to increase their verisimilitude.
Zwart and Franssen’s impossibility theorem reveals a conflict between the possible-world-based content-definition and the possible-world-based likeness-definition of verisimilitude. In Sect. 2 we show that the possible-world-based content-definition violates four basic intuitions of Popper’s consequence-based content-account of verisimilitude, and therefore cannot be said to be in the spirit of Popper’s account, although this is the opinion of some prominent authors. In Sect. 3 we argue that in consequence-accounts, content-aspects and likeness-aspects of verisimilitude are not in conflict with each other, but in agreement. We explain this fact by pointing to the deep difference between possible-world-accounts and consequence-accounts, which does not lie in the difference between syntactic (object-language) versus semantic (meta-language) formulations, but in the difference between ‘disjunction-of-possible-worlds’ versus ‘conjunction-of-parts’ representations of theories. Drawing on earlier work, we explain in Sect. 4 how the shortcomings of Popper’s original definition can be repaired by what we call the relevant element approach. We propose a quantitative likeness-definition of verisimilitude based on relevant elements which provably agrees with the qualitative relevant content-definition of verisimilitude on all pairs of comparable theories. We conclude the paper with a plea for consequence-accounts and a brief analysis of the problem of language-dependence (Sect. 6).
Starting from a brief recapitulation of the contemporary debate on scientific realism, this paper argues for the following thesis: Assume a theory T has been empirically successful in a domain of application A, but was superseded later on by a superior theory T*, which was likewise successful in A but has an arbitrarily different theoretical superstructure. Then under natural conditions T contains certain theoretical expressions, which yielded T's empirical success, such that these T-expressions correspond (in A) to certain theoretical expressions of T*, and given T* is true, they refer indirectly to the entities denoted by these expressions of T*. The thesis is first motivated by a study of the phlogiston–oxygen example. Then the thesis is proved in the form of a logical theorem, and illustrated by further examples. The final sections explain how the correspondence theorem justifies scientific realism and work out the advantages of the suggested account. Contents: Introduction: Pessimistic Meta-induction vs. Structural Correspondence; The Case of the Phlogiston Theory; Steps Towards a Systematic Correspondence Theorem; The Correspondence Theorem and Its Ontological Interpretation; Further Historical Applications; Discussion of the Correspondence Theorem: Objections and Replies; Consequences for Scientific Realism and Comparison with Other Positions; 7.1 Comparison with constructive empiricism; 7.2 Major difference from standard scientific realism; 7.3 From minimal realism and correspondence to scientific realism; 7.4 Comparison with particular realistic positions.
The justification of induction is of central significance for cross-cultural social epistemology. Different ‘epistemological cultures’ differ not only in their beliefs, but also in their belief-forming methods and evaluation standards. For an objective comparison of different methods and standards, one needs (meta-)induction over past successes. A notorious obstacle to the problem of justifying induction lies in the fact that the success of object-inductive prediction methods (i.e., methods applied at the level of events) can neither be shown to be universally reliable (Hume's insight) nor to be universally optimal. My proposal towards a solution of the problem of induction is meta-induction. The meta-inductivist applies the principle of induction to all competing prediction methods that are accessible to her. By means of mathematical analysis and computer simulations of prediction games I show that there exist meta-inductive prediction strategies whose success is universally optimal among all accessible prediction strategies, modulo a small short-run loss. The proposed justification of meta-induction is mathematically analytical. It implies, however, an a posteriori justification of object-induction based on the experiences in our world. In the final section I draw conclusions about the significance of meta-induction for the social spread of knowledge and the cultural evolution of cognition, and I relate my results to other simulation results which utilize meta-inductive learning mechanisms.
In sec. 1.1 I emphasize the meliorative purpose of epistemology, and I characterize Goldman's epistemology as reliabilistic, cognitive, social, and meliorative. In sec. 1.2 I point out that Goldman's weak notion of knowledge is in conflict with our ordinary usage of 'knowledge'. In sec. 2 I argue for an externalist-internalist hybrid conception of justification which adds reliability-indicators to externalist knowledge. Reliability-indicators produce a veritistic surplus value for the social spread of knowledge. In sec. 3 I analyze some particular meliorative rules which have been proposed by Goldman. I prove that obedience to the rule of maximally specific evidence increases expected veritistic value (sec. 3.1), and I argue that rule-circular arguments are epistemically worthless (sec. 3.2). In the final sec. 3.3 I report a non-circular justification of meta-induction which has been developed elsewhere.
In this paper I present a game-theoretical approach to the problem of induction. I investigate the comparative success of prediction methods by mathematical analysis and computer programming. Hume's problem lies in the fact that although the success of object-inductive prediction strategies is quite robust, they cannot be universally optimal. My proposal towards a solution of the problem of induction is meta-induction. I show that there exist meta-inductive prediction strategies whose success is universally optimal, modulo short-run losses which are upper-bounded. I then turn to the implications of my approach for the evolution of cognition. In the final section I suggest a revision of the paradigm of bounded rationality by introducing the distinction between local, general and universal prediction strategies.
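A toy version of such a prediction game can be sketched with an exponentially weighted forecaster, a standard regret-minimizing strategy that is related to, but not identical with, the attractivity-weighted meta-induction studied in this line of work; the event frequency, learning rate, and method names below are all illustrative assumptions:

```python
import math
import random

def prediction_game(rounds=2000, seed=0):
    """Toy prediction game: a meta-inductivist weights object-level methods
    by exp(eta * past_success) and predicts their weighted average."""
    rng = random.Random(seed)
    # Binary event sequence with frequency 0.7 (illustrative choice).
    events = [1.0 if rng.random() < 0.7 else 0.0 for _ in range(rounds)]
    # Two simple object-level prediction methods (illustrative choices).
    methods = {"always0": lambda: 0.0, "always1": lambda: 1.0}
    scores = {name: 0.0 for name in methods}  # cumulative success, 1 - |error|
    meta_score = 0.0
    eta = 0.5  # learning rate, an illustrative assumption
    for e in events:
        # Shift scores by the maximum before exponentiating, to avoid overflow.
        top = max(scores.values())
        weights = {n: math.exp(eta * (scores[n] - top)) for n in methods}
        total = sum(weights.values())
        preds = {n: f() for n, f in methods.items()}
        meta_pred = sum(weights[n] * preds[n] for n in methods) / total
        meta_score += 1.0 - abs(meta_pred - e)
        for n in methods:
            scores[n] += 1.0 - abs(preds[n] - e)
    best_rate = max(scores.values()) / rounds
    return meta_score / rounds, best_rate
```

In this toy setting the meta-inductivist's success rate quickly approaches that of the best accessible method (here "always1"), paying only a small short-run loss while the weights concentrate, which is the qualitative pattern behind the upper-bounded short-run losses mentioned above.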
This paper starts with an examination of the major problems of foundation-oriented epistemology in Sect. 2. Then, in Sects. 3–4, it is argued that the externalistic re-definition of knowledge deprives this concept of useful applications to human epistemic practice. From the viewpoint of cultural evolution, the condition of justification is the most important ingredient of knowledge. An alternative foundation-oriented conception of knowledge called third-person internalism is developed in Sect. 2 and Sect. 5. It combines insights of externalism with the requirement of second-order justification. The application of third-person internalism to contextualistic positions leads to an important constraint on contextualism (Sect. 6). The final section (Sect. 7) sketches new prospects for a foundation-oriented epistemology which are based on epistemic optimality arguments.
This article suggests a ‘best alternative’ justification of induction (in the sense of Reichenbach) which is based on meta-induction. The meta-inductivist applies the principle of induction to all competing prediction methods which are accessible to her. It is demonstrated, and illustrated by computer simulations, that there exist meta-inductivistic prediction strategies whose success is approximately optimal among all accessible prediction methods in arbitrary possible worlds, and which dominate the success of every noninductive prediction strategy. The proposed justification of meta-induction is mathematically analytical. It implies, however, an a posteriori justification of object-induction based on the experiences in our world. (Received November 2005; revised March 2008.)
In this paper it is shown that, in spite of their intuitive starting points, Kuipers' accounts lead to counterintuitive consequences. The counterintuitive results of Kuipers' account of H-D confirmation stem from the fact that Kuipers explicates a concept of partial (as opposed to full) confirmation. It is shown that Schurz-Weingartner's relevant-element approach as well as Gemes' content-part approach provide an account of full confirmation that does not lead to these counterintuitive results. One of the unwelcome results of Kuipers' account of nomic truthlikeness is the consequence that a theory Y, in order to be more truthlike than a theory X (where Y and X are incompatible), must imply the entire nomic truth. It is shown how the relevant-element approach to truthlikeness avoids this result.
In the first part I argue that normic laws are the phenomenological laws of evolutionary systems. If this is true, then intuitive human reasoning should be well adapted to reasoning from normic laws. In the second part I show that system P is a tool for reasoning with normic laws which satisfies two important evolutionary standards: it is probabilistically reliable, and it has rules of low complexity. In the third part I finally report results of an experimental study which demonstrate that intuitive human reasoning accords well with basic argument patterns of system P.
Normic laws have the form "if A, then normally B". This paper attempts to show that if a philosophical analysis of normic laws (§§1, 4) is combined with certain developments in nonmonotonic logic (§§2, 3), the following problems in philosophy of science can be seen in a new perspective which, at least in many cases, allows us to improve on their received analysis: explanation and individual case understanding in the humanities (§§1, 2), an evolution-theoretic foundation of normic laws which explains their omnipresence and establishes the connection between prototypical and statistical normality (§4), ceteris paribus laws (§5), differences between physical versus non-physical sciences (§6) and finally, theory-protection through auxiliary hypotheses (§7).
It has not been sufficiently considered in philosophical discussions of ceteris paribus (CP) laws that distinct kinds of CP-laws exist in science with rather different meanings. I distinguish between (1.) comparative CP-laws and (2.) exclusive CP-laws. There exist also mixed CP-laws, which contain a comparative and an exclusive CP-clause. Exclusive CP-laws may be either (2.1) definite, (2.2) indefinite or (2.3) normic. While CP-laws of kind (2.1) and (2.2) exhibit deductivistic behaviour, CP-laws of kind (2.3) require a probabilistic or non-monotonic reconstruction. CP-laws of kind (1) may be either deductivistic or probabilistic. All these kinds of CP-laws have empirical content by which they are testable, except CP-laws of kind (2.2), which are almost vacuous. Typically, CP-laws of kind (1) express invariant correlations, CP-laws of kind (2.1) express closed system laws of physical sciences, and CP-laws of kind (2.3) express normic laws of non-physical sciences based on evolution-theoretic stability properties.
Normic Laws and the Significance of Nonmonotonic Reasoning for Philosophy of Science. Normic laws have the form ‘if A then normally B’. They were discovered in the explanation debate, but were considered to be empirically vacuous (§1). I argue that the prototypical (or ideal) normality of normic laws implies statistical normality (§2), whence normic laws have empirical content. In §§3–4 I explain why reasoning from normic laws is nonmonotonic, and why the understanding of the individual case is so important here. After sketching some foundations of nonmonotonic reasoning as developed by AI researchers (§5), I argue that normic laws are also the best way to understand ceteris paribus laws (§6). §7 deals with the difference between physical and non-physical disciplines and §9 with the difference between normicity and approximation. In §8 it is shown how nonmonotonic reasoning provides a new understanding of the protection of theories against falsification by auxiliary hypotheses. §10, finally, gives a system- and evolution-theoretical explanation of the deeper reason for the omnipresence of normic laws in practice and science, and for the connection between ideal and statistical normality.
Pietroski and Rey suggested a reconstruction of ceteris paribus (CP)-laws, which — as they claim — saves CP-laws from vacuity. This discussion note is intended to show that, although Pietroski and Rey's reconstruction is an improvement in comparison to previous suggestions, it cannot avoid the result that CP-laws are almost vacuous. It is proved that if Cx is an arbitrary (nomological) event-type which has independently identifiable deterministic causes, then for every other (nomological) event-type Ax which is not strictly connected with Cx or with ¬Cx, ‘CP if Ax then Cx’ satisfies the conditions of Pietroski and Rey for CP-laws. It is also shown that Pietroski and Rey's reconstruction presupposes the assumption of determinism. The conclusion points towards some alternatives to Pietroski and Rey's reconstruction.
Normic laws have the form "if A, then normally B." They are omnipresent in everyday life and non-physical 'life' sciences such as biology, psychology, social sciences, and humanities. They differ significantly from ceteris-paribus laws in physics. While several authors have doubted that normic laws are genuine laws at all, others have argued that normic laws express a certain kind of prototypical normality which is independent of statistical majority. This paper presents a foundation for normic laws which is based on generalized evolution theory and explains their omnipresence, lawlikeness, and reliability. It is argued that the fact that normic laws are a product of evolution establishes a systematic connection between prototypical and statistical normality.
Reasoning about change is a central issue in research on human and robot planning. We study an approach to reasoning about action and change in a dynamic logic setting and provide a solution to problems which are related to the frame problem. Unlike most work on the frame problem, the logic described in this paper is monotonic. It (implicitly) allows for the occurrence of actions of multiple agents by introducing non-stationary notions of waiting and test. The need to state a large number of frame axioms is alleviated by introducing a concept of chronological preservation to dynamic logic. As a side effect, this concept permits the encoding of temporal properties in a natural way. We compare the relative merits of our approach and non-monotonic approaches as regards different aspects of the frame problem. Technically, we show that the resulting extended systems of propositional dynamic logic preserve (weak) completeness, finite model property and decidability.
This paper describes the development of theories of scientific explanation since Hempel's earliest models in the 1940s. It focuses on deductive and probabilistic why-explanations and their main problems: lawlikeness, explanation-prediction asymmetries, causality, deductive and probabilistic relevance, maximal specificity and homogeneity, and the height of the probability value. For each of these topics the paper explains the most important approaches as well as their criticisms, including the author's own accounts. Three main theses of this paper are: (1) Both deductive and probabilistic explanations are important in science, not reducible to each other. (2) One must distinguish between (cause giving) explanations and (reason giving) justifications and predictions. (3) The adequacy of deductive as well as probabilistic explanations is relative to a pragmatically given background knowledge, which does not exclude, however, the possibility of purely semantic models.