This introduction consists of two parts. In the first part, the special issue editors introduce inductive metaphysics from a historical as well as from a systematic point of view and discuss what distinguishes it from other modern approaches to metaphysics. In the second part, they give a brief summary of the individual articles in this special issue.
Certain hypotheses cannot be directly confirmed for theoretical, practical, or moral reasons. For some of these hypotheses, however, there might be a workaround: confirmation based on analogical reasoning. In this paper we take up Dardashti, Hartmann, Thébault, and Winsberg’s (in press) idea of analyzing confirmation based on analogical inference Bayesian style. We identify three types of confirmation by analogy and show that Dardashti et al.’s approach can cover two of them. We then highlight possible problems with their model as a general approach to analogical inference and argue that these problems can be avoided by supplementing Bayesian update with Jeffrey conditionalization.
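Jeffrey conditionalization, mentioned in the abstract above, updates a credence when the evidence itself is learned only with some probability rather than with certainty. A minimal sketch, with illustrative numbers and a helper name of our own choosing (nothing here is taken from the paper):

```python
def jeffrey_update(p_h_given_e, p_h_given_not_e, q):
    """Jeffrey conditionalization on a partition {E, ~E}:
    P_new(H) = q * P(H|E) + (1 - q) * P(H|~E),
    where q = P_new(E) is the new, possibly uncertain, probability of E."""
    return q * p_h_given_e + (1 - q) * p_h_given_not_e

# Illustration: analogical evidence raises confidence in E only to 0.8,
# not to 1, so strict Bayesian conditionalization does not apply.
print(jeffrey_update(p_h_given_e=0.9, p_h_given_not_e=0.2, q=0.8))  # 0.76
```

Note that strict conditionalization falls out as the special case q = 1, which returns P(H|E).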
Schurz proposed a justification of creative abduction on the basis of the Reichenbachian principle of the common cause. In this paper we take up the idea of combining creative abduction with causal principles and model instances of successful creative abduction within a Bayes net framework. We identify necessary conditions for such inferences and investigate their unificatory power. We also sketch several interesting applications of modeling creative abduction Bayesian style. In particular, we discuss use-novel predictions, confirmation, and the problem of underdetermination in the context of abductive inferences.
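The common-cause structure underlying the abstract above can be illustrated with a minimal Bayes net C → E1, C → E2 over binary variables. The numbers below are purely illustrative, not taken from the paper:

```python
# Minimal common-cause Bayes net: C -> E1 and C -> E2, binary variables.
p_c = 0.3                        # P(C = true)
p_e1 = {True: 0.9, False: 0.1}   # P(E1 = true | C)
p_e2 = {True: 0.8, False: 0.2}   # P(E2 = true | C)

def joint_e1_e2():
    """P(E1 = true, E2 = true), marginalizing over the common cause C."""
    return sum(
        (p_c if c else 1 - p_c) * p_e1[c] * p_e2[c]
        for c in (True, False)
    )

marg_e1 = p_c * p_e1[True] + (1 - p_c) * p_e1[False]   # 0.34
marg_e2 = p_c * p_e2[True] + (1 - p_c) * p_e2[False]   # 0.38
# The common cause induces a correlation between its two effects:
print(joint_e1_e2() > marg_e1 * marg_e2)               # True
```

The point of the toy example is only that a common cause screens off and correlates its effects, which is the structural feature the Reichenbachian principle exploits.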
There are several proposals to resolve the problem of epistemic peer disagreement which concentrate on the question of how to incorporate evidence of such a disagreement. The main positions in this field are the equal weight view, the steadfast view, and the total evidence view. In this paper we present a new argument in favour of the equal weight view. As we will show, this view results from a general approach of forming epistemic attitudes in an optimal way. In this way, the argument for equal weighting can be massively strengthened, from reasoning via epistemic indifference to reasoning from optimality.
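The equal weight view's "splitting the difference" amounts to straight averaging of peer credences. A toy sketch (the function name is ours, for illustration only):

```python
def equal_weight(credences):
    """'Split the difference': adopt the straight average of the
    credences of all epistemic peers (one's own included)."""
    return sum(credences) / len(credences)

# Two peers with credences 0.7 and 0.3 both move to 0.5:
print(equal_weight([0.7, 0.3]))  # 0.5
```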
Generalized Darwinism models cultural development as an evolutionary process in which traits evolve through variation, selection, and inheritance. Inheritance is modeled either as the transmission of discrete units or as a blending of traits. In this article, we compare classical models of cultural evolution and generalized population dynamics with respect to blending inheritance. We identify problems of these models and introduce our own model, which combines relevant features of both. Blending is implemented as success-based social learning, which can be shown to be an optimal strategy.
We discuss two influential views of unification: mutual information unification (MIU) and common origin unification (COU). We propose a simple probabilistic measure for COU and compare it with Myrvold’s (2003, 2017) probabilistic measure for MIU. We then explore how well these two measures perform in simple causal settings. After highlighting several deficiencies, we propose causal constraints for both measures. A comparison with explanatory power shows that the causal version of COU is one step ahead in simple causal settings. However, slightly increasing the complexity of the underlying causal structure shows that both measures can easily disagree with explanatory power. The upshot of this is that even sophisticated causally constrained measures for unification ultimately fail to track explanatory relevance. This shows that unification and explanation are not as closely related as many philosophers thought.
In 2009, an earthquake struck the city of L’Aquila, causing more than 300 deaths and leading to a trial which lasted almost four years and in which scientists – though cleared on appeal – ...
Eleonore Stump claims in her book 'Wandering in Darkness' that the problem of evil can best be solved with the help of narratives. In this review, her argument for this claim is explicated.
In this paper it is argued that there are relevant similarities between Aristotle's account of definition and Carnap's account of explication. To show this, first, Aristotle's conditions of adequacy for definitions are provided and an outline of the main critique put forward against Aristotle's account of definition is given. Subsequently, Carnap's conditions of adequacy for explications are presented and discussed. It is shown that Aristotle's conditions of extensional correctness can be interpreted against the backdrop of Carnap's condition of similarity once one skips Aristotelian essentialism and takes a Carnapian and more pragmatic stance. Finally, it is argued that, in general, a complementary rational reconstruction of both approaches allows for resolving problems of interpretational underdetermination.
Ever since proposals for generalizing the theory of natural evolution have been put forward, the aims and ambitions of both proponents and critics have differed widely. Some regard such proposals as mere metaphors, some as analogies, some aim at a real generalization and unification, and some have even proposed to work out full reductions. In this paper it is argued that these different forms of generalizing the theory of evolution can be systematically re-framed as different approaches for transferring justification from the natural to the cultural realm, and that their differences are basically a matter of degree. With the help of such a classification it should become clearer what to expect, but also what not to expect, from the different approaches.
According to one view on analyticity in formal languages, a definition of 'analytic' can be given by semantic notions alone. In this contribution we are going to show that a purely semantic conception of analyticity is inadequate. To do so, we provide a method for transforming theories with a synthetic empirical basis into logically equivalent theories with an analytic "empirical" basis. We draw the conclusion that a definition of analyticity is adequate only if it is a pragmatic one.
Mereology, the theory of parts and wholes, is sometimes used as a framework for categorisation because it is regarded as ontologically innocent in the sense that the mereological fusion of some entities is nothing over and above the entities. In this paper it is argued that an adequate answer to the question of whether the thesis of the ontological innocence of mereology holds relies crucially on the underlying theory of reference. It is then shown that upholding the thesis comes at high costs, viz. the cost of a quite strong logical background theory or of paradoxical ways of predicating and counting.
This introduction provides a detailed summary of all papers of the special issue on the second conference of the German Society for Philosophy of Science: GWP.2016.
In the debate on epistemic peer disagreement, the equal-weight view suggests splitting the difference between one's own and one's peer's opinions. An argument in favour of this view---which is prominently held by Adam Elga---is that by such difference-splitting one may make some use of a so-called wise-crowd effect. In this paper it is argued that such a view faces two main problems: first, the standards for making use of a wise-crowd effect are quite low; and second, following the equal-weight view decreases such effects, thereby defeating the argument's own basis. We therefore come to the conclusion that an argument for the equal-weight view with the help of wise-crowd effects, as provided more or less explicitly by Elga, does not succeed.
The Newtonian research program consists of the core axioms of the Principia Mathematica, a sequence of force laws and auxiliary hypotheses, and a set of methodological rules. The latter underwent several changes, and so it is sometimes claimed that, historically speaking, Newton and the Newtonians added methodological rules post constructione in order to further support their research agenda. An argument of Duhem, Feyerabend, and Lakatos aims to provide a theoretical reason why Newton could not have come up with his theory of the Principia in accordance with his own methodology: since Newton’s starting point, Kepler’s laws, contradict the law of universal gravitation, he could not have applied the so-called method of analysis and synthesis. In this paper, this argument is examined with reference to the Principia’s several editions. Newton’s method is characterized, and necessary general background assumptions of the argument are made explicit. Finally, the argument is criticized from a contemporary philosophy of science point of view.
In this paper, it is argued that there are relevant similarities between Aristotle’s account of definition and Carnap’s account of explication. To show this, first, Aristotle’s conditions of adequacy for definitions are provided and an outline of the main critique put forward against Aristotle’s account of definition is given. Subsequently, Carnap’s conditions of adequacy for explications are presented and discussed. It is shown that Aristotle’s conditions of extensional correctness can be interpreted against the backdrop of Carnap’s condition of similarity once one skips Aristotelian essentialism and takes a Carnapian and more pragmatic stance. Finally, it is argued that, in general, a complementary rational reconstruction of both approaches allows for resolving problems of interpretational underdetermination.
The standard approach to solving prediction tasks is to apply inductive methods such as the straight rule. Such methods are proven to be access-optimal in specific prediction settings, but not in all. Within the optimality approach of meta-induction, success-based weighted prediction methods are proven to be access-optimal in all possible continuous prediction settings. However, meta-induction fails to be access-optimal in so-called demonic discrete prediction environments, where the predicted value is inversely correlated with the true outcome. In this paper the problem of discrete prediction environments is addressed by embedding them into a synchronised prediction setting. In such a setting the task consists in providing both a discrete and a continuous prediction. It is shown that synchronisation constraints exclude the possibility of demonic environments.
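The success-based weighting at the core of meta-induction can be sketched as follows. This is a simplified illustration with a naming scheme of our own, not the specific weighting scheme of the optimality approach:

```python
def weighted_meta_prediction(predictions, successes):
    """Success-based weighted prediction: each accessible method's
    prediction is weighted by its cumulative past success score."""
    total = sum(successes)
    if total == 0:
        # No track record yet: fall back to the plain average.
        return sum(predictions) / len(predictions)
    return sum(s * p for s, p in zip(successes, predictions)) / total

# A method with success score 3 dominates one with score 1:
# (3 * 0.9 + 1 * 0.1) / 4 = 0.7
print(weighted_meta_prediction([0.9, 0.1], [3, 1]))  # 0.7
```

In a demonic discrete environment of the kind the abstract describes, the true outcome is chosen inversely to such a weighted prediction, which is why the continuous optimality result does not transfer directly to the discrete case.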