This Open Access book addresses the age-old problem of infinite regresses in epistemology. How can we ever come to know something if knowing requires having good reasons, and reasons can only be good if they are backed by good reasons in turn? The problem has puzzled philosophers ever since antiquity, giving rise to what is often called Agrippa's Trilemma. The current volume approaches the old problem in a provocative and thoroughly contemporary way. Taking seriously the idea that good reasons are typically probabilistic in character, it develops and defends a new solution that challenges venerable philosophical intuitions and explains why they were mistakenly held. Key to the new solution is the phenomenon of fading foundations, according to which distant reasons are less important than those that are nearby. The phenomenon takes the sting out of Agrippa's Trilemma; moreover, since the theory that describes it is general and abstract, it is readily applicable outside epistemology, notably to debates on infinite regresses in metaphysics.
A major objection to epistemic infinitism is that it seems to make justification impossible. For if there is an infinite chain of reasons, each receiving its justification from its neighbour, then there is no justification to inherit in the first place. Some have argued that the objection arises from misunderstanding the character of justification. Justification is not something that one reason inherits from another; rather, it gradually emerges from the chain as a whole. Nowhere, however, is it made clear what exactly is meant by emergence. The aim of this paper is to fill that lacuna: we describe a detailed procedure for the emergence of justification that enables us to see exactly how justification surfaces from a chain of reasons.
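To make the emergence idea concrete, here is a minimal numerical sketch of one way a probability can surface from a chain; the recursion and the values of alpha and beta are our own illustrative assumptions, not the paper's procedure.

```python
# Minimal sketch: the probability of q0 emerging from a chain of reasons.
# Each link contributes alpha = P(q_n | q_{n+1}) and beta = P(q_n | ~q_{n+1});
# the numbers are illustrative, not taken from the paper.

def emergent_probability(alpha, beta, depth, seed):
    """Compute P(q0) from a chain truncated at `depth`, starting from an
    arbitrary `seed` probability for the remotest proposition."""
    p = seed
    for _ in range(depth):
        # law of total probability, one link of the chain
        p = alpha * p + beta * (1.0 - p)
    return p

alpha, beta = 0.9, 0.2
for depth in (1, 5, 10, 20, 50):
    # two extreme seeds: remotest link certainly true vs. certainly false
    hi = emergent_probability(alpha, beta, depth, 1.0)
    lo = emergent_probability(alpha, beta, depth, 0.0)
    print(f"depth {depth:3d}: P(q0) between {lo:.6f} and {hi:.6f}")
# Both bounds converge to beta / (1 - alpha + beta) ~ 0.6667 as depth grows.
```

The point of the sketch is that the computed value of P(q0) is fixed by the chain itself: the seed assigned to the remotest link washes out as the chain lengthens.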
A characteristic of contemporary analytic philosophy is its ample use of thought experiments. We formulate two features that can lead one to suspect that a given thought experiment is a poor one. Although these features are especially in evidence within the philosophy of mind, they can, surprisingly enough, also be discerned in some celebrated scientific thought experiments. Yet in the latter case the consequences appear to be less disastrous. We conclude that the use of thought experiments is more successful in science than in philosophy.
Today it is generally assumed that epistemic justification comes in degrees. The consequences, however, have not been adequately appreciated. In this paper we show that the assumption invalidates some venerable attacks on infinitism: once we accept that epistemic justification is gradual, an infinitist stance makes perfect sense. It is only without the assumption that infinitism runs into difficulties.
Galileo claimed inconsistency in the Aristotelian dogma concerning falling bodies and stated that all bodies must fall at the same rate. However, there is an empirical situation where the speeds of falling bodies are proportional to their weights; and even in vacuo not all bodies fall at the same rate under terrestrial conditions. The reason for the deficiency of Galileo’s reasoning is analyzed, and various physical scenarios are described in which Aristotle’s claim is closer to the truth than is Galileo’s. The purpose is not to reinstate Aristotelian physics at the expense of Galileo and Newton, but rather to provide evidence in support of the verdict that empirical knowledge does not come from prior philosophy. Keywords: Aristotle; Galileo; thought experiments; falling bodies.
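The empirical situation in which speed is proportional to weight is presumably fall at terminal velocity in a resisting medium; a textbook case (our illustration, not necessarily the paper's own example) is linear Stokes drag:

\[
m\dot{v} \;=\; mg - bv \quad\Longrightarrow\quad v_{\mathrm{t}} \;=\; \frac{mg}{b},
\]

so that at terminal velocity the speed is indeed proportional to the weight mg; only when drag is negligible do all bodies fall at (nearly) the same rate.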
We discuss two objections that foundationalists have raised against infinite chains of probabilistic justification. We demonstrate that neither of the objections can be maintained.
Like many discussions on the pros and cons of epistemic foundationalism, the debate between C. I. Lewis and H. Reichenbach dealt with three concerns: the existence of basic beliefs, their nature, and the way in which beliefs are related. In this paper we concentrate on the third matter, especially on Lewis’s assertion that a probability relation must depend on something that is certain, and Reichenbach’s claim that certainty is never needed. We note that Lewis’s assertion is prima facie ambiguous, but argue that this ambiguity is only apparent if probability theory is viewed within a modal logic. Although there are empirical situations where Reichenbach is right, and others where Lewis’s reasoning seems to be more appropriate, it will become clear that Reichenbach’s stance is the generic one. We conclude that this constitutes a threat to epistemic foundationalism. Keywords: Epistemic foundationalism; Probability; Clarence Irving Lewis; Hans Reichenbach.
From 1929 onwards, C. I. Lewis defended the foundationalist claim that judgements of the form 'x is probable' only make sense if one assumes there to be a ground y that is certain (where x and y may be beliefs, propositions, or events). Without this assumption, Lewis argues, the probability of x could not be anything other than zero. Hans Reichenbach repeatedly contested Lewis's idea, calling it "a remnant of rationalism". The last move in this debate was a challenge by Lewis, defying Reichenbach to produce a regress of probability values that yields a number other than zero. Reichenbach never took up the challenge, but we will meet it on his behalf, as it were. By presenting a series converging to a limit, we demonstrate that x can have a definite and computable probability, even if its justification consists of an infinite number of steps. Next we show the invalidity of a recent riposte of foundationalists that this limit of the series can be the ground of justification. Finally we discuss the question where justification can come from if not from a ground.
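In the simplest uniform case (constant conditional probabilities α = P(q_n | q_{n+1}) and β = P(q_n | ¬q_{n+1}); our illustrative rendering of the kind of series involved), the regress sums explicitly:

\[
P(q_0) \;=\; \beta\sum_{k=0}^{n-1}(\alpha-\beta)^{k} \;+\; (\alpha-\beta)^{n}\,P(q_n)
\;\xrightarrow[\;n\to\infty\;]{}\; \frac{\beta}{1-\alpha+\beta},
\]

since |α − β| < 1. The limit is a definite, computable number that owes nothing to any ground P(q_n), which is the sort of answer Lewis's challenge demands.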
The notion of probabilistic support is beset by well-known problems. In this paper we add a new one to the list: the problem of transitivity. Tomoji Shogenji has shown that positive probabilistic support, or confirmation, is transitive under the condition of screening off. However, under that same condition negative probabilistic support, or disconfirmation, is intransitive. Since there are many situations in which disconfirmation is transitive, this illustrates, but now in a different way, that the screening-off condition is too restrictive. We therefore weaken this condition to what we call ‘partial’ screening off. We show that the domain defined by partial screening off comprises two mutually exclusive subdomains. In one subdomain disconfirmation is indeed transitive, but confirmation is then intransitive. In the other, confirmation is transitive, but here disconfirmation is once more intransitive.
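A quick numerical check of the screening-off condition for the positive case; the Markov-style distribution below is a toy example of our own, not one from the paper.

```python
import itertools

# Toy joint distribution with the Markov structure e -> m -> h, so that
# m screens off e from h by construction. All numbers are illustrative.
P_e = 0.5
P_m_given_e = {True: 0.8, False: 0.3}   # P(m | e) and P(m | ~e)
P_h_given_m = {True: 0.7, False: 0.2}   # P(h | m) and P(h | ~m)

def joint(e, m, h):
    pe = P_e if e else 1 - P_e
    pm = P_m_given_e[e] if m else 1 - P_m_given_e[e]
    ph = P_h_given_m[m] if h else 1 - P_h_given_m[m]
    return pe * pm * ph

def prob(pred):
    worlds = itertools.product([True, False], repeat=3)
    return sum(joint(*w) for w in worlds if pred(*w))

def cond(target, given):
    both = prob(lambda e, m, h: target(e, m, h) and given(e, m, h))
    return both / prob(given)

# e confirms m, and m confirms h:
print(cond(lambda e, m, h: m, lambda e, m, h: e), ">", prob(lambda e, m, h: m))  # 0.8 > 0.55
print(cond(lambda e, m, h: h, lambda e, m, h: m), ">", prob(lambda e, m, h: h))  # 0.7 > 0.475
# ... and, thanks to screening off, e confirms h as well:
print(cond(lambda e, m, h: h, lambda e, m, h: e), ">", prob(lambda e, m, h: h))  # 0.6 > 0.475
```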
Various arguments have been put forward to show that Zeno-like paradoxes are still with us. A particularly interesting one involves a cube composed of colored slabs that geometrically decrease in thickness. We first point out that this argument has already been nullified by Paul Benacerraf. Then we show that nevertheless a further problem remains, one that withstands Benacerraf's critique. We explain that the new problem is isomorphic to two other Zeno-like predicaments: a problem described by Alper and Bridger in 1998 and a modified version of the problem that Benardete introduced in 1964. Finally, we present a solution to the three isomorphic problems.
Can some evidence confirm a conjunction of two hypotheses more than it confirms either of the hypotheses separately? We show that it can, moreover under conditions that are the same for ten different measures of confirmation. Further we demonstrate that it is even possible for the conjunction of two disconfirmed hypotheses to be confirmed by the same evidence.
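Here is one way such a case can arise, using a toy example of our own (a fair die) rather than the paper's construction; both the difference and the ratio measures rank the conjunction as more confirmed than either conjunct.

```python
from fractions import Fraction

# Toy example of our own, not taken from the paper. One roll of a fair die:
# h1: outcome in {1,2,3}; h2: outcome in {1,4,5}; evidence e: outcome is 1.
outcomes = range(1, 7)
p = Fraction(1, 6)

def prob(event):
    return sum(p for o in outcomes if event(o))

def prob_given_e(event):
    # conditioning on e narrows the space to the single outcome 1
    return Fraction(1) if event(1) else Fraction(0)

h1 = lambda o: o in {1, 2, 3}
h2 = lambda o: o in {1, 4, 5}
conj = lambda o: h1(o) and h2(o)

for name, h in [("h1", h1), ("h2", h2), ("h1&h2", conj)]:
    prior, post = prob(h), prob_given_e(h)
    print(name, "difference:", post - prior, " ratio:", post / prior)
# h1 and h2 each rise by 1/2 (ratio 2), but the conjunction rises
# by 5/6 (ratio 6): e confirms the conjunction more than either conjunct.
```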
We have earlier shown by construction that a proposition can have a well-defined nonzero probability, even if it is justified by an infinite probabilistic regress. We thought this to be an adequate rebuttal of foundationalist claims that probabilistic regresses must lead either to an indeterminate, or to a determinate but zero probability. In a comment, Frederik Herzberg has argued that our counterexamples are of a special kind, being what he calls ‘solvable’. In the present reaction we investigate what Herzberg means by solvability. We discuss the advantages and disadvantages of making solvability a sine qua non, and we ventilate our misgivings about Herzberg’s suggestion that the notion of solvability might help the foundationalist.
In an earlier paper I argued that there are cases in which an infinite probabilistic chain can be completed. According to Jeremy Gwiazda, however, I have merely shown that the chain in question can be computed, not that it can be completed. Gwiazda thereby discriminates between two terms that I used as synonyms. In the present paper I discuss to what extent computability and completability can be meaningfully distinguished.
Some series can go on indefinitely, others cannot, and epistemologists want to know in which class to place epistemic chains. Is it sensible or nonsensical to speak of a proposition or belief that is justified by another proposition or belief, ad infinitum? In large part the answer depends on what we mean by “justification.” Epistemologists have failed to find a definition on which everybody agrees, and some have even advised us to stop looking altogether. In spite of this, the present essay submits a few candidate definitions. It argues that, although not giving the final word, these candidates tell us something about the possibility of infinite epistemic chains. And it shows that they can short-circuit a debate about doxastic justification.
In an earlier paper we have shown that a proposition can have a well-defined probability value, even if its justification consists of an infinite linear chain. In the present paper we demonstrate that the same holds if the justification takes the form of a closed loop. Moreover, in the limit that the size of the loop tends to infinity, the probability value of the justified proposition is always well-defined, whereas this is not always so for the infinite linear chain. This suggests that infinitism sits more comfortably with a coherentist view of justification than with an approach in which justification is portrayed as a linear process.
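For the smallest loop, a proposition supporting itself, with uniform α and β as in the linear case (again our own illustration), the well-defined value is just the fixed point of one updating step:

\[
p \;=\; \alpha p + \beta(1-p) \quad\Longrightarrow\quad p \;=\; \frac{\beta}{1-\alpha+\beta},
\]

which coincides with the limit of the uniform infinite chain; and since composing identical affine updates preserves the fixed point, a uniform loop of any size delivers the same value.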
It is widely held that the paradox of Achilles and the Tortoise, introduced by Zeno of Elea around 460 B.C., was solved by mathematical advances in the nineteenth century. The techniques of Weierstrass, Dedekind and Cantor made it clear, according to this view, that Achilles’ difficulty in traversing an infinite number of intervals while trying to catch up with the tortoise does not involve a contradiction, let alone a logical absurdity. Yet ever since the nineteenth century there have been dissidents claiming that the apparatus of Weierstrass et al. has not resolved the paradox, and that serious problems remain. It seems that these claims have received unexpected support from recent developments in mathematical physics. This support has however remained largely unnoticed by historians of philosophy, presumably because the relevant debates are cast in mathematical-technical terms that are only accessible to people with the relevant training. That is unfortunate, since the debates in question might well profit from input by philosophers in general and historians of philosophy in particular. Below we will first recall the Achilles paradox, and describe the way in which nineteenth-century mathematics supposedly solved it. Then we discuss recent work that contests this solution, reiterating the dissident dogma that no mathematical approach whatsoever can even come close to solving the original Achilles. We shall argue that this dissatisfaction with a mathematical solution is inadequate as it stands, but that it can perhaps be reformulated in the light of new developments in mathematical physics.
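In its textbook form (simplified here), the nineteenth-century resolution is just the convergence of a geometric series: if the first interval takes Achilles a time t_0 and each later interval a fixed fraction r < 1 of the previous time, then

\[
\sum_{n=0}^{\infty} t_0\, r^{\,n} \;=\; \frac{t_0}{1-r} \;<\; \infty,
\]

so infinitely many intervals are traversed in a finite time.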
Could some evidence confirm a conjunction of two hypotheses more than it confirms either of the hypotheses separately? We show that it might, moreover under conditions that are the same for ten different measures of confirmation. Further, we demonstrate that it is even possible for the conjunction of two disconfirmed hypotheses to be confirmed by the same evidence.
As is well known, implication is transitive but probabilistic support is not. Eells and Sober, followed by Shogenji, showed that screening off is a sufficient constraint for the transitivity of probabilistic support. Moreover, this screening off condition can be weakened without sacrificing transitivity, as was demonstrated by Suppes and later by Roche. In this paper we introduce an even weaker sufficient condition for the transitivity of probabilistic support, in fact one that can be made as weak as one wishes. We explain that this condition has an interesting property: it shows that transitivity is retained even though the Simpson paradox reigns. We further show that by adding a certain restriction the condition can be turned into one that is both sufficient and necessary for transitivity.
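The opening observation is easy to verify with a toy counterexample of our own devising (one roll of a fair die):

```python
from fractions import Fraction

# Three events showing that probabilistic support is not transitive.
outcomes = range(1, 7)
e = {1, 2}
m = {2, 3}
h = {3, 4}

def prob(event, given=None):
    space = [o for o in outcomes if given is None or o in given]
    return Fraction(sum(1 for o in space if o in event), len(space))

print(prob(m, e), ">", prob(m))   # 1/2 > 1/3: e supports m
print(prob(h, m), ">", prob(h))   # 1/2 > 1/3: m supports h
print(prob(h, e), "<", prob(h))   # 0   < 1/3: yet e undermines h
```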
Theo A. F. Kuipers, "The Threefold Evaluation of Theories": a synopsis of From Instrumentalism to Constructive Realism: On Some Relations between Confirmation, Empirical Progress, and Truth Approximation (2000).
Heisenberg’s uncertainty principle is a milestone of twentieth-century physics. We sketch the history that led to the formulation of the principle, and we recall the objections of Grete Hermann and Niels Bohr. Then we explain that there are in fact two uncertainty principles. One was published by Heisenberg in the Zeitschrift für Physik of March 1927 and subsequently targeted by Bohr and Hermann. The other one was introduced by Earle Kennard in the same journal a couple of months later. While Kennard’s principle remains untarnished, the principle of Heisenberg has recently been criticized in a way that is very different from the objections by Bohr and Hermann: there are reasons to believe that Heisenberg’s formula is not valid. Experimental results seem to support this claim.
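In the notation that has become standard in this recent debate (our summary, which postdates the 1927 papers): Kennard's relation bounds the spreads in any prepared state, whereas the relation attributed to Heisenberg concerns measurement error and disturbance,

\[
\sigma_q\,\sigma_p \;\ge\; \frac{\hbar}{2} \quad \text{(Kennard)}, \qquad
\varepsilon(q)\,\eta(p) \;\ge\; \frac{\hbar}{2} \quad \text{(Heisenberg)},
\]

where ε(q) is the error of an approximate position measurement and η(p) the disturbance it imparts to the momentum. The first is a theorem of the quantum formalism; it is the second that the recent theoretical and experimental work calls into question.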
It is argued that probability should be defined implicitly by the distributions of possible measurement values characteristic of a theory. These distributions are tested by, but not defined in terms of, relative frequencies of occurrences of events of a specified kind. The adoption of an a priori probability in an empirical investigation constitutes part of the formulation of a theory. In particular, an assumption of equiprobability in a given situation is merely one hypothesis inter alia, which can be tested, like any other assumption. Probability in relation to some theories – for example quantum mechanics – need not satisfy the Kolmogorov axioms. To illustrate how two theories about the same system can generate quite different probability concepts, and not just different probabilistic predictions, a team game for three players is described. If only classical methods are allowed, a 75% success rate at best can be achieved. Nevertheless, a quantum strategy exists that gives a 100% probability of winning.
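The game is presumably of the GHZ type; the brute-force check below, for one standard version of such a game (an assumption on our part, not the paper's own statement of the rules), confirms the 75% classical ceiling.

```python
from itertools import product

# One standard GHZ-type game: players receive bits (x, y, z) with
# x ^ y ^ z == 0, answer bits (a, b, c), and win iff a ^ b ^ c == x | y | z.
questions = [(x, y, z) for x, y, z in product([0, 1], repeat=3) if x ^ y ^ z == 0]

def wins(sa, sb, sc, q):
    x, y, z = q
    return (sa[x] ^ sb[y] ^ sc[z]) == (x | y | z)

# A deterministic local strategy for one player is a map bit -> bit,
# encoded as a tuple (output_on_0, output_on_1).
strategies = list(product([0, 1], repeat=2))

best = max(
    sum(wins(sa, sb, sc, q) for q in questions)
    for sa, sb, sc in product(strategies, repeat=3)
)
print(f"best classical success rate: {best}/{len(questions)}")  # 3/4
# Players sharing a GHZ state can win all four question triples.
```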
We have never entirely agreed with Daniel Cohnitz on the status and rôle of thought experiments. Several years ago, enjoying a splendid lunch together in the city of Ghent, we cheerfully agreed to disagree on the matter; and now that Cohnitz has published his considered opinion of our views, we are glad that we have the opportunity to write a rejoinder and to explicate some of our disagreements. We choose not to deal here with all the issues that Cohnitz raises, but rather to restrict ourselves to three specific points.
Can some evidence confirm a conjunction of two hypotheses more than it confirms either of the hypotheses separately? We show that it can, moreover under conditions that are the same for nine different measures of confirmation. Further we demonstrate that it is even possible for the conjunction of two disconfirmed hypotheses to be confirmed by the same evidence.
Tom Stoneham put forward an argument purporting to show that coherentists are, under certain conditions, committed to the conjunction fallacy. Stoneham considers this argument a reductio ad absurdum of any coherence theory of justification. I argue that Stoneham neglects the distinction between degrees of confirmation and degrees of probability. Once the distinction is in place, it becomes clear that no conjunction fallacy has been committed.
Some philosophers have claimed that it is meaningless or paradoxical to consider the probability of a probability. Others have however argued that second-order probabilities do not pose any particular problem. We side with the latter group. On condition that the relevant distinctions are taken into account, second-order probabilities can be shown to be perfectly consistent. May the same be said of an infinite hierarchy of higher-order probabilities? Is it consistent to speak of a probability of a probability, and of a probability of a probability of a probability, and so on, ad infinitum? We argue that it is, for it can be shown that there exists an infinite system of probabilities that has a model. In particular, we define a regress of higher-order probabilities that leads to a convergent series which determines an infinite-order probability value. We demonstrate the consistency of the regress by constructing a model based on coin-making machines.
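A rough Monte Carlo sketch of the flavour of such a model; the tower below, with its made-up parameters, is our own toy version rather than the construction in the paper.

```python
import random

# Toy coin-making tower, depth N: the outcome of the coin at level n+1
# fixes the bias of the coin at level n (alpha if heads, beta if tails).
# Parameters, seed biases, and trial counts are illustrative assumptions.
def simulate(depth, seed_bias, alpha=0.9, beta=0.2, trials=100_000):
    heads = 0
    for _ in range(trials):
        up = random.random() < seed_bias          # top of the tower
        for _ in range(depth):
            bias = alpha if up else beta
            up = random.random() < bias           # one level down
        heads += up
    return heads / trials

random.seed(0)
for depth in (1, 5, 20):
    print(depth, simulate(depth, 1.0), simulate(depth, 0.0))
# Both columns approach beta / (1 - alpha + beta) ~ 0.667 as the tower
# deepens: the hierarchy determines a definite probability on its own.
```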
This paper is the third and final one in a sequence of three. All three papers emphasize that a proposition can be justified by an infinite regress, on condition that epistemic justification is interpreted probabilistically. The first two papers showed this for one-dimensional chains and for one-dimensional loops of propositions, each proposition being justified probabilistically by its precursor. In the present paper we consider the more complicated case of two-dimensional nets, where each "child" proposition is probabilistically justified by two "parent" propositions. Surprisingly, it turns out that probabilistic justification in two dimensions takes on the form of Mandelbrot's iteration. Like so many patterns in nature, probabilistic reasoning might in the end be fractal in character.
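To see schematically where a quadratic iteration can come from (a reconstruction under simplifying assumptions of our own, not the paper's derivation): if both parents of a child are independent, each with probability p_{n+1}, and the conditional probabilities a, b, d of the child given two, one, or no true parents are uniform across the net, the law of total probability gives

\[
p_n \;=\; a\,p_{n+1}^{2} \;+\; 2b\,p_{n+1}(1-p_{n+1}) \;+\; d\,(1-p_{n+1})^{2},
\]

so descending through the net iterates a quadratic map, the family to which Mandelbrot's iteration z ↦ z² + c belongs.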
It is argued that the recent revival of the akrasia problem in the philosophy of mind is a direct, albeit unforeseen, result of the debate on action explanation in the philosophy of science. A solution of the problem is put forward that takes account of the intimate links between the problem of akrasia and this debate. This solution is based on the idea that beliefs and desires have degrees of strength, and it suggests a way of giving a precise meaning to that idea. Finally, it is pointed out that the solution captures certain intuitions of both Socrates and Aristotle.
A distinction is made between imagination in the narrow sense and in the broad sense. Narrow imagination is characterised as the ability to "see" pictures in the mind's eye or to "hear" melodies in the head. Broad imagination is taken to be the faculty of creating, either in the strict sense of making something ex nihilo or in the looser sense of seeing patterns in some data. The article focuses on a particular sort of broad imagination, the kind that has to do with creating, not a work of art, a scientific theory or a political vision but one's own life. We shape our lives through our actions, and these actions not only influence our future—a commonplace—but also determine our past, which is a new and more controversial perspective.
Suppose q is some proposition, and let

P(q) = v0   (1)

be the proposition that the probability of q is v0. How can one know that (1) is true? One cannot know it for sure, for all that may be asserted is a further probabilistic statement like

P(P(q) = v0) = v1,   (2)

which states that the probability that (1) is true is v1. But the claim (2) is also subject to some further statement of an even higher probability:

P(P(P(q) = v0) = v1) = v2,   (3)

and so on. Thus, an infinite regress emerges of probabilities of probabilities, and the question arises as to whether this regress is vicious or harmless. Radical probabilists would like to claim that it is harmless, but Nicholas Rescher (2010), in his scholarly and very stimulating Infinite Regress: The Theory and History of Varieties of Change, argues that it is vicious. He believes that an infinite hierarchy of probabilities makes it impossible to know anything about the probability of the original proposition q: "unless some claims are going to be categorically validated and not just adjudged probabilistically, the radically probabilistic epistemology envisioned here is going to be beyond the prospect of implementation. . . . If you can indeed be certain of nothing, then how can you be sure of your probability assessments. If all you ever have is a nonterminatingly regressive claim of the format . . . the probability is .9 that (the probability is .9 that (the probability of q is .9)) then in the face of such a regress, you would know effectively nothing about the condition of q. After all, without a categorically established factual basis of some sort, there is no way of assessing probabilities. But if these requisites themselves are never categorical but only probabilistic, then we are propelled into a vitiating regress of presuppositions."
Reichenbach's use of 'posits' to defend his frequentistic theory of probability has been criticized on the grounds that it makes unfalsifiable predictions. The justice of this criticism has blinded many to Reichenbach's second use of a posit, one that can fruitfully be applied to current debates within epistemology. We show first that Reichenbach's alternative type of posit creates a difficulty for epistemic foundationalists, and then that its use is equivalent to a particular kind of Jeffrey conditionalization. We conclude that, under particular circumstances, Reichenbach's approach and that of the Bayesians amount to the same thing, thereby presenting us with a new instance in which chance and credence coincide.
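For reference, Jeffrey conditionalization updates on an uncertain datum e by

\[
P_{\text{new}}(h) \;=\; P(h\mid e)\,P_{\text{new}}(e) \;+\; P(h\mid \neg e)\,P_{\text{new}}(\neg e),
\]

which reduces to ordinary conditionalization when P_new(e) = 1; on the paper's reading, Reichenbach's second kind of posit in effect fixes P_new(e) at a value short of certainty.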
Richard Jeffrey’s radical probabilism (‘probability all the way down’) is augmented by the claim that probability cannot be turned into certainty, except by data that logically exclude all alternatives. Once we start being uncertain, no amount of updating will free us from the treadmill of uncertainty. This claim is cast first in objectivist and then in subjectivist terms.
In A Treatise of Human Nature, David Hume presents an argument according to which all knowledge reduces to probability, and all probability reduces to nothing. Many have criticized this argument, while others find nothing wrong with it. In this paper we explain that the argument is invalid as it stands, but for different reasons than have been hitherto acknowledged. Once the argument is repaired, it becomes clear that there is indeed something that reduces to nothing, but it is something other than what, according to many, Hume had in mind. Thus two views emerge of what exactly it is that reduces. We surmise that Hume failed to distinguish the two, because he lacked the formal means to differentiate between a rendering of his argument that is in accordance with the probability calculus, and one that is not.
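One standard reconstruction of the rendering that violates the calculus (ours, not a quotation from the paper): if each successive reflection on one's fallibility is taken to multiply the probability by a factor bounded away from 1, then

\[
\prod_{k=1}^{n} r_k \;\le\; r^{\,n} \;\xrightarrow[\;n\to\infty\;]{}\; 0 \qquad (r_k \le r < 1),
\]

and 'all probability reduces to nothing'; a rendering in accordance with the calculus instead treats each reflection as an uncertain update, and the resulting probabilities need not tend to zero.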
The original article has been corrected. In line 164, a comma and a space were erroneously added, so that 500,500 appeared as 500, 500; the authors would like readers to know that this should instead read 500,500.
A common objection to coherentism is that it cannot account for truth: it gives us no reason to prefer a true theory over a false one, if both theories are equally coherent. By extending Susan Haack's crossword metaphor, the authors argue that there could be circumstances under which this objection is untenable. Although these circumstances are remote, they are in full accordance with the most ambitious modern theories in physics. Coherence may perhaps be truth conducive.
Often, regret implies the wish not to have performed certain actions. In this article I claim that this wish can to some extent be fulfilled: it is possible, in a sense, to influence the character of actions that have already been performed. This possibility arises from combining a first person perspective with an outlook on actions as expressions of tendencies, where tendencies are identified on the basis of a number of actions. The idea is specified within the framework of Carnapian reduction sentences, but this technique is in no sense mandatory: it can be formulated in other vocabularies as well.
Is it coherent to speak of the probability of a probability, and the probability of a probability of a probability, and so on? We show that it is, in the sense that a regress of higher-order probabilities can lead to convergent sequences that determine all these probabilities. By constructing an implementable model which is based on coin-making machines, we demonstrate the consistency of our regress.
How certain is Heisenberg’s uncertainty principle? Heisenberg’s uncertainty principle is at the heart of the orthodox or Copenhagen interpretation of quantum mechanics. We first sketch the history that led up to the formulation of the principle. Then we recall that there are in fact two uncertainty principles, both dating from 1927, one by Werner Heisenberg and one by Earle Kennard. Finally, we explain that recent work in physics gives reason to believe that the principle of Heisenberg is invalid, while that of Kennard still stands.
In The Structure of Science, Ernest Nagel finds fault with Werner Heisenberg’s explication of the uncertainty principle. Nagel’s complaint is that this principle does not follow from the impossibility of measuring with precision both the position and the momentum of a particle, as Heisenberg intimates; rather, it is the other way around. Recent developments in theoretical physics have shown that Nagel’s argument is more substantial than he could have envisaged. In particular it has become clear that there are in fact two uncertainty principles; as a result, there are four pairs of quantities to examine, whereas Heisenberg considers only one. These findings throw new light on Nagel’s criticism. They enable us to see that his intuition was surprisingly apposite, but also make clear where his argument misses the mark.
This book is the first of two volumes devoted to the work of Theo Kuipers, a leading Dutch philosopher of science. Philosophers and scientists from all over the world, thirty-seven in all, comment on Kuipers' philosophy, and each of their commentaries is followed by a reply from Kuipers. The present volume focuses on Kuipers' views on confirmation, empirical progress, and truth approximation, as laid down in his From Instrumentalism to Constructive Realism. In this book, Kuipers offered a synthesis of Carnap's and Hempel's confirmation theory on the one hand, and Popper's theory of truth approximation on the other. The key element of this synthesis is a sophisticated methodology, which enables the evaluation of theories in terms of their problems and successes, and which also fits well with the claim that one theory is closer to the truth than another. Ilkka Niiniluoto, Patrick Maher, John Welch, Gerhard Schurz, Igor Douven, Bert Hamminga, David Miller, Johan van Benthem, Sjoerd Zwart, Thomas Mormann, Jesús Zamora Bonilla, Isabella Burger & Johannes Heidema, Joke Meheus, Hans Mooij, and Diderik Batens comment on these ideas of Kuipers, and many present their own account. The present book also contains a synopsis of From Instrumentalism to Constructive Realism. It can be read independently of the second volume of Essays in Debate with Theo Kuipers, which is devoted to Kuipers' Structures in Science.
This book is the second of two volumes devoted to the work of Theo Kuipers, a leading Dutch philosopher of science. Philosophers and scientists from all over the world, thirty-seven in all, comment on Kuipers’ philosophy, and each of their commentaries is followed by a reply from Kuipers. The present volume is devoted to Kuipers’ neo-classical philosophy of science, as laid down in his Structures in Science. Kuipers defends a dialectical interaction between science and philosophy in that he views philosophy of science as a meta-science which formulates cognitive structures that provide heuristic patterns for actual scientific research, including design research. In addition, Kuipers pays considerable attention to the computational approaches to philosophy of science as well as to the ethics of doing research. Thomas Nickles, David Atkinson, Jean-Paul van Bendegem, Maarten Franssen, Anne Ruth Mackor, Arno Wouters, Erik Weber & Helena de Preester, Eric Scerri, Adam Grobler & Andrzej Wisniewski, Alexander van den Bosch, Gerard Vreeswijk, Jaap Kamps, Paul Thagard, Emma Ruttkamp, Robert Causey, and Henk Zandvoort comment on these ideas of Kuipers, and many present their own account. The present book also contains a synopsis of Structures in Science. It can be read independently of the first volume of Essays in Debate with Theo Kuipers, which is devoted to Kuipers’ From Instrumentalism to Constructive Realism.