This book is a sustained defense of the compatibility of idealization in the sciences with scientific realism. As such, the book is essentially a detailed response to the well-known arguments raised by Nancy Cartwright to the effect that idealization and scientific realism are incompatible.
Searle’s Chinese Room Argument (CRA) has been the object of great interest in the philosophy of mind, artificial intelligence and cognitive science since its initial presentation in ‘Minds, Brains and Programs’ in 1980. It is by no means an overstatement to assert that it has been a main focus of attention for philosophers and computer scientists of many stripes. It is then especially interesting to note that relatively little has been said about the detailed logic of the argument, whatever significance Searle intended the CRA to have. The problem with the CRA is that it involves a very strong modal claim, the truth of which is both unproved and highly questionable. So it will be argued here that the CRA does not prove what it was intended to prove.
The main question addressed in this paper is whether some false propositions can constitute evidence for the truth of other propositions. It is argued that there are good reasons to suspect that at least some false propositions can constitute evidence for the truth of certain other contingent propositions. The paper also introduces a novel condition concerning propositions that constitute evidence, one that explains a ubiquitous evidential practice, and it contains a defense of a particular condition concerning the possession of evidence. The core position adopted here, then, is that false propositions that are approximately true reports of measurements can constitute evidence for the truth of other propositions. So, it will be argued that evidence is only quasi-factive in this very specific sense.
Recently a number of variously motivated epistemologists have argued that knowledge is closely tied to practical matters. On the one hand, radical pragmatic encroachment is the view that facts about whether an agent has knowledge depend on practical factors, and this is coupled to the view that there is an important connection between knowledge and action. On the other hand, one can argue only for the less radical thesis that there is an important connection between knowledge and practical reasoning. Defenders of both of these views endorse the view that knowledge is the norm of practical reasoning. This thesis has recently come under heavy fire, and a number of weaker proposals have been defended. In this paper counter-examples to the knowledge norm of reasoning will be presented, and it will be argued that this view, and a number of related but weaker views, cannot be sustained in the face of these counter-examples. The paper concludes with a novel proposal concerning the norm of practical reasoning that is immune to the counter-examples introduced here.
Following Nancy Cartwright and others, I suggest that most (if not all) theories incorporate, or depend on, one or more idealizing assumptions. I then argue that such theories ought to be regimented as counterfactuals, the antecedents of which are simplifying assumptions. If this account of the logical form of theories is granted, then a serious problem arises for Bayesians concerning the prior probabilities of theories that have counterfactual form. If no such probabilities can be assigned, then the posterior probabilities will be undefined, as the latter are defined in terms of the former. I argue here that the most plausible attempts to address the problem of probabilities of conditionals fail to help Bayesians, and, hence, that Bayesians are faced with a new problem. Insofar as these proposed solutions fail, I argue that Bayesians must give up Bayesianism or accept the counterintuitive view that no theories that incorporate any idealizations have ever really been confirmed to any extent whatsoever. Moreover, as it appears that the latter horn of this dilemma is highly implausible, we are left with the conclusion that Bayesianism should be rejected, at least as it stands.
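The dependence of the posterior on the prior can be made vivid with a minimal sketch of Bayes' theorem. The probability values below are arbitrary placeholders, not values from any actual theory: the point is only that when the prior is undefined, the posterior is undefined with it.

```python
def posterior(prior_T, p_E_given_T, p_E_given_not_T):
    """Return P(T|E) via Bayes' theorem, with the total probability
    of the evidence E in the denominator. If no prior P(T) exists,
    no posterior can be computed."""
    if prior_T is None:
        return None  # no prior, hence no posterior
    p_E = p_E_given_T * prior_T + p_E_given_not_T * (1 - prior_T)
    return p_E_given_T * prior_T / p_E

# A defined prior yields a defined posterior:
print(posterior(0.2, 0.9, 0.1))   # 0.18 / 0.26 ≈ 0.692
# An undefined prior leaves the posterior undefined:
print(posterior(None, 0.9, 0.1))  # None
```

This is the structural point behind the dilemma: confirmation, on the Bayesian picture, just is the movement from prior to posterior, so theories with no well-defined priors are theories that cannot be confirmed at all.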
Recently Timothy Williamson (2007) has argued that characterizations of the standard (i.e. intuition-based) philosophical practice of philosophical analysis are misguided because of the erroneous manner in which this practice has been understood. In doing so he implies that experimental critiques of the reliability of intuition are based on this misunderstanding of philosophical methodology and so have little or no bearing on actual philosophical practice or results. His main point is that the orthodox understanding of philosophical methodology is incorrect in that it treats philosophical thought experiments in such a way that they can be “filled in” in various ways that undermine their use as counter-examples, and that, when we properly understand that methodology in light of the possibility of such filling in, intuition plays no substantial role in philosophical practice. In this paper Williamson’s claim that philosophical thought experiment cases can legitimately be filled in this way will be challenged, and it will be shown that the experimental critique of intuition-based methods still raises a serious issue.
In this article the standard philosophical method involving intuition-driven conceptual analysis is challenged in a new way. This orthodox approach to philosophy takes analysanda to be specifications of the content of concepts in the form of sets of necessary and sufficient conditions. Here it is argued that there is no adequate account of what necessary and sufficient conditions are. So, the targets of applications of the standard philosophical method, so understood, are not sufficiently well understood for this method to be dependable.
Paradoxes have played an important role both in philosophy and in mathematics, and paradox resolution is an important topic in both fields. Paradox resolution is deeply important because if such resolution cannot be achieved, we are threatened with the charge of debilitating irrationality. This is supposed to be the case for the following reason. Paradoxes consist of jointly contradictory sets of statements that are individually plausible or believable. These facts about paradoxes then give rise to a deeply troubling epistemic problem. Specifically, if one believes all of the constitutive propositions that make up a paradox, then one is apparently committed to belief in every proposition. This is the result of the principle of classical logic known as ex contradictione (sequitur) quodlibet, that anything and everything follows from a contradiction, together with the plausible idea that belief is closed under logical or material implication (i.e. the epistemic closure principle). But it is manifestly and profoundly irrational to believe every proposition, and so the presence of even one contradiction in one’s doxa appears to result in what seems to be total irrationality. This is the problem of paradox-induced explosion. In this paper it will be argued that in many cases this problem can plausibly be avoided in a purely epistemic manner, without having either to resort to non-classical logics for belief (e.g. paraconsistent logics) or to deny the standard closure principle for beliefs. The manner in which this result can be achieved depends on drawing an important distinction between the propositional attitude of belief and the weaker attitude of acceptance, such that paradox-constituting propositions are accepted but not believed.
Paradox-induced explosion is then avoided by noting that while belief may well be closed under material implication or even under logical implication, these weaker commitments are not subject to closure principles of those sorts. So, this possibility provides us with a less radical way to deal with the existence of paradoxes, and it preserves the idea that intelligent agents can actually entertain paradoxes.
Following the standard practice in sociology, cultural anthropology and history, sociologists, historians of science and some philosophers of science define scientific communities as groups with shared beliefs, values and practices. In this paper it is argued that in real cases the beliefs of the members of such communities often vary significantly in important ways. This has rather dire implications for the convergence defense against the charge of the excessive subjectivity of subjective Bayesianism because that defense requires that communities of Bayesian inquirers share a significant set of modal beliefs. The important implication is then that given the actual variation in modal beliefs across individuals, either Bayesians cannot claim that actual theories have been objectively confirmed or they must accept that such theories have been confirmed relative only to epistemically insignificant communities.
Defenders of doxastic voluntarism accept that we can voluntarily commit ourselves to propositions, including belief-contravening propositions. Thus, defenders of doxastic voluntarism allow that we can choose to believe propositions that are negatively implicated by our evidence. In this paper it is argued that the conjunction of epistemic deontology (ED) and doxastic voluntarism (DV) as it applies to ordinary cases of belief-contravening propositional commitments is incompatible with evidentialism. ED and DV will be assumed, and this negative result will be used to suggest that voluntary belief-contravening commitments are not themselves beliefs and that these sorts of commitments are not governed by evidentialism. So, the apparent incompatibility of the package of views noted above can be resolved without ceding evidentialism with respect to beliefs.
In a recent article, Peter Gärdenfors (1992) has suggested that the AGM (Alchourrón, Gärdenfors, and Makinson) theory of belief revision can be given an epistemic basis by interpreting the revision postulates of that theory in terms of a version of the coherence theory of justification. To accomplish this goal Gärdenfors suggests that the AGM revision postulates concerning the conservative nature of belief revision can be interpreted in terms of a concept of epistemic entrenchment and that there are good empirical reasons to adopt this view as opposed to some form of foundationalist account of the justification of our beliefs. In this paper I argue that Gärdenfors’ attempt to underwrite the AGM theory of belief revision by appealing to a form of coherentism is seriously inadequate for several reasons.
In this paper Timothy Williamson’s argument that the knowledge norm of assertion is the best explanation of the unassertability of Moorean sentences is challenged and an alternative account of the norm of assertion is defended.
In a series of influential articles, George Bealer argues for the autonomy of philosophical knowledge on the basis that philosophically known truths must be necessary truths. The main point of his argument is that the truths investigated by the sciences are contingent truths to be discovered a posteriori by observation, while the truths of philosophy are necessary truths to be discovered a priori by intuition. The project of assimilating philosophy to the sciences is supposed to be rendered illegitimate by the more or less sharp distinction in these characteristic methods and its modal basis. In this article Bealer's particular way of drawing the distinction between philosophy and science is challenged in a novel manner, and thereby philosophical naturalism is further defended.
Stalnaker argued that conditional excluded middle should be included among the principles that govern counterfactuals on the basis that intuitions support that principle. This is because there are pairs of competing counterfactuals that appear to be equally acceptable. In doing so, he was forced to introduce semantic vagueness into his system of counterfactuals. In this paper it is argued that there is a simpler and purely epistemic explanation of these cases that avoids the need to introduce semantic vagueness into the semantics for counterfactuals.
In this paper it is argued that three of the most prominent theories of conditional acceptance face very serious problems: David Lewis' concept of imaging, the Ramsey test, and Jonathan Bennett's recent hybrid view each face vicious regresses, employ unanalyzed components, or depend upon an implausibly strong version of doxastic voluntarism.
In this paper we argue that dissociative identity disorder (DID) is best interpreted as a causal model of a (possible) post-traumatic psychological process, as a mechanical model of an abnormal psychological condition. From this perspective we examine and criticize the evidential status of DID, and we demonstrate that there is really no good reason to believe that anyone has ever suffered from DID so understood. This is so because the proponents of DID violate basic methodological principles of good causal modeling. “When every ounce of your concentration is fixed upon blasting a winged pig out of the sky, you do not question its species' ontological status.” James Morrow, City of Truth (1990).
It is a commonplace belief that many beliefs, e.g. religious convictions, are a purely private matter, and this is meant in some way to serve as a defense against certain forms of criticism. In this paper it is argued that this thesis is false and that belief is really often a public matter. This argument, the publicity of belief argument, depends on one of the most compelling and central theses of Peircean pragmatism: that bona fide belief cannot be separated from action. It is then also suggested that we should accept a form of W. K. Clifford's evidentialism. When these theses are jointly accepted in conjunction with the basic principle of ethics that it is prima facie wrong to act in such a way that may subject others to serious but unnecessary and avoidable harm, it follows that many beliefs are morally wrong.
In his 1993 article George Bealer offers three separate arguments that are directed against the internal coherence of empiricism, specifically against Quine’s version of empiricism. One of these arguments is the starting points argument (SPA) and it is supposed to show that Quinean empiricism is incoherent. We argue here that this argument is deeply flawed, and we demonstrate how a Quinean may successfully defend his views against Bealer’s SPA. Our defense of Quinean empiricism against the SPA depends on showing (1) that Bealer is, in an important sense, a foundationalist, and (2) that Quine is, in an important sense, a coherentist. Having established these two contentions we show that Bealer’s SPA begs the question against Quinean empiricists.
Some contemporary theologically inclined epistemologists, the reformed epistemologists, have attempted to show that belief in God is rational by appealing directly to a special kind of experience. To strengthen the appeal to this particular, and admittedly peculiar, type of experience, these epistemologists venture to draw a parallel between such experiences and normal perceptual experiences in order to show that, by parity of reasoning, if beliefs formed on the basis of the latter are taken to be justified and rational to hold, then beliefs formed on the basis of the former should also be regarded as justified and rational to hold. Such appeals to religious experience have been discussed and/or made by Robert Pargetter, Alvin Plantinga and William Alston, who claim that they provide sufficient warrant for religious beliefs, specifically for the belief that God exists. The main critical issue that will be raised here concerns the coherence of this notion of religious experience itself and whether such appeals to religious experience really provide justification for belief in the existence of God.
The development of possible worlds semantics (PWS) for modal claims has led to a more general application of that theory as a complete semantics for various formal and natural languages, and this view is widely held to be an adequate (philosophical) interpretation of the model theory for such languages. We argue here that this view generates a self-referential inconsistency that indicates either the falsity or the incompleteness of PWS.
This paper shows that the knowability paradox isn’t a paradox because the derivation of the paradox is faulty. This is explained by showing that the K operator employed in generating the paradox is used equivocally and when the equivocation is eliminated the derivation fails.
The ontology of decision theory has been subject to considerable debate in the past, and discussion of just how we ought to view decision problems has revealed more than one interesting problem, as well as suggested some novel modifications of classical decision theory. In this paper it will first be argued that Bayesian, or evidential, decision-theoretic characterizations of decision situations fail to adequately account for knowledge concerning the causal connections between acts, states, and outcomes in decision situations, and so they are incomplete. Second, it will be argued that when we attempt to incorporate the knowledge of such causal connections into Bayesian decision theory, a substantial technical problem arises for which there is no currently available solution that does not suffer from some damning objection or other. From a broader perspective, this then throws into question the use of decision theory as a model of human or machine planning.
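The gap between evidential expected utility and causal knowledge can be illustrated with a stylized Newcomb-style payoff table. All of the numbers below are illustrative assumptions, not values drawn from the paper: the point is that evidential decision theory weighs outcomes by P(state | act), so when states are evidentially correlated with acts, its verdicts can come apart from what the causal structure recommends.

```python
# Stylized Newcomb-style payoffs (hypothetical values for illustration).
payoffs = {
    ("one-box", "pred-one"): 1_000_000,
    ("one-box", "pred-two"): 0,
    ("two-box", "pred-one"): 1_001_000,
    ("two-box", "pred-two"): 1_000,
}

def evidential_eu(act, p_state_given_act):
    """Expected utility computed with act-dependent state probabilities,
    as in evidential (Bayesian) decision theory."""
    return sum(p * payoffs[(act, state)]
               for state, p in p_state_given_act[act].items())

# A highly reliable predictor makes the state evidentially depend on the act.
p_given = {
    "one-box": {"pred-one": 0.99, "pred-two": 0.01},
    "two-box": {"pred-one": 0.01, "pred-two": 0.99},
}

print(evidential_eu("one-box", p_given))  # evidentially favored act
print(evidential_eu("two-box", p_given))
```

A treatment that respects the causal facts would instead hold the state probabilities fixed across acts, on which two-boxing dominates (it adds 1,000 in every state). Capturing that causal knowledge within the Bayesian framework is precisely where the technical problem discussed in the paper arises.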
This paper shows that any view of future contingent claims that treats such claims as having indeterminate truth values or as simply being false implies probabilistic irrationality. This is because such views of the future imply violations of reflection, special reflection and conditionalization.
Imre Lakatos' views on the philosophy of mathematics are important and they have often been underappreciated. The most obvious lacuna in this respect is the lack of detailed discussion and analysis of his 1976a paper and its implications for the methodology of mathematics, particularly its implications with respect to argumentation and the matter of how truths are established in mathematics. The most important themes that run through his work on the philosophy of mathematics and which culminate in the 1976a paper are (1) the (quasi-)empirical character of mathematics and (2) the rejection of axiomatic deductivism as the basis of mathematical knowledge. In this paper Lakatos' later views on the quasi-empirical nature of mathematical theories and methodology are examined and specific attention is paid to what this view implies about the nature of mathematical argumentation and its relation to the empirical sciences.
Some recent work by philosophers of mathematics has been aimed at showing that our knowledge of the existence of at least some mathematical objects and/or sets can be epistemically grounded by appealing to perceptual experience. The sensory capacity that they refer to in doing so is the ability to perceive numbers, mathematical properties and/or sets. The chief defense of this view as it applies to the perception of sets is found in Penelope Maddy’s Realism in Mathematics, but a number of other philosophers have made similar, if simpler, appeals of this sort. For example, Jaegwon Kim (1981, 1982), John Bigelow (1988, 1990), and John Bigelow and Robert Pargetter (1990) have all defended such views. The main critical issue that will be raised here concerns the coherence of the notions of set perception and mathematical perception, and whether appeals to such perceptual faculties can really provide any justification for or explanation of belief in the existence of sets, mathematical properties and/or numbers.
In this paper I argue that the best explanation of expertise about taste is that such alleged experts are simply more eloquent in describing the taste experiences that they have than are ordinary tasters.
In this paper the strategy for the eliminative reduction of the alethic modalities suggested by John Venn is outlined and it is shown to anticipate certain related contemporary empiricistic and nominalistic projects. Venn attempted to reduce the alethic modalities to probabilities, and thus suggested a promising solution to the nagging issue of the inclusion of modal statements in empiricistic philosophical systems. However, despite the promise that this suggestion held for laying the ‘ghost of modality’ to rest, this general approach, tempered modal eliminativism, is shown to be inadequate for that task.
This paper has three interdependent aims. The first is to make Reichenbach’s views on induction and probabilities clearer, especially as they pertain to his pragmatic justification of induction. The second aim is to show how his view of pragmatic justification arises out of his commitment to extensional empiricism and moots the possibility of a non-pragmatic justification of induction. Finally, and most importantly, a formal decision-theoretic account of Reichenbach’s pragmatic justification is offered in terms both of the minimax principle and the dominance principle.
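The decision-theoretic framing can be sketched with an illustrative utility matrix. The utilities below are stand-ins chosen for illustration, not values from Reichenbach or the paper: the idea is that if nature is uniform, only inducting can succeed, and if it is not, no method succeeds, so inducting never does worse and sometimes does better.

```python
# Hypothetical utilities for the pragmatic justification of induction.
utility = {
    ("induct", "uniform"): 1,   # induction finds the limiting frequency
    ("induct", "chaotic"): 0,   # no method succeeds in a chaotic world
    ("abstain", "uniform"): 0,  # success was available but forgone
    ("abstain", "chaotic"): 0,
}
STATES = ("uniform", "chaotic")

def weakly_dominates(a, b):
    """Act a weakly dominates act b: never worse, sometimes better."""
    never_worse = all(utility[(a, s)] >= utility[(b, s)] for s in STATES)
    sometimes_better = any(utility[(a, s)] > utility[(b, s)] for s in STATES)
    return never_worse and sometimes_better

def worst_case(act):
    """The quantity the minimax principle compares across acts."""
    return min(utility[(act, s)] for s in STATES)

print(weakly_dominates("induct", "abstain"))  # True
print(worst_case("induct"), worst_case("abstain"))  # 0 0
```

Note that on these particular stand-in utilities the two acts tie on their worst cases, so it is the dominance comparison, not the minimax comparison alone, that singles out inducting; this is one way of seeing why both principles figure in the formal account.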
In this chapter we consider three philosophical perspectives (including those of Stalnaker and Lewis) on the question of whether and how the principle of conditional excluded middle should figure in the logic and semantics of counterfactuals. We articulate and defend a third view that is patterned after belief revision theories offered in other areas of logic and philosophy. Unlike Lewis’ view, the belief revision perspective does not reject conditional excluded middle, and unlike Stalnaker’s, it does not embrace supervaluationism. We adduce both theoretical and empirical considerations to argue that the belief revision perspective should be preferred to its alternatives. The empirical considerations are drawn from the results of four empirical studies (which we report below) of non-experts’ judgments about counterfactuals and conditional excluded middle.
In a recent revision (chapter 4 of Nowakowa and Nowak 2000) of an older article Leszek Nowak (1992) has attempted to rebut Niiniluoto’s 1990 critical suggestion that proponents of the Poznań idealizational approach to the sciences have committed a rather elementary logical error in the formal machinery that they advocate for use in the analysis of scientific methodology. In this paper I criticize Nowak’s responses to Niiniluoto’s suggestion, and, subsequently, work out some of the consequences of that criticism for understanding the role that idealization plays in scientific methodology.
The generalized correspondence principle is the assertion of something like the following methodological norm: successor theories ought to incorporate precursor theories as special cases. However, the actual core connotation of this principle seems to be that when we are constructing new theories in some domain of application we ought to retain as much of prior but refuted theories as is possible while eliminating inconsistency with the data. As a result, it is argued here that the correspondence principle has not been correctly formulated. Also, it is argued here that there is no compelling extant justification of this proposed methodological norm.
The main examples of pragmatic encroachment presented by Jason Stanley involve the idea that knowledge ascription occurs more readily in cases where stakes are low rather than high. This is the stakes hypothesis. In this paper an example is presented showing that in some cases knowledge ascription is more readily appropriate where stakes are high rather than low.
In this paper I argue that Tyler Burge's non-reductive view of testimonial knowledge cannot adequately discriminate between fallacious ad verecundiam appeals to expert testimony and legitimate appeals to authority.
In this paper it is argued that the conjunction of linguistic ersatzism, the ontologically deflationary view that possible worlds are maximal and consistent sets of sentences, and possible world semantics, the view that the meaning of a sentence is the set of possible worlds at which it is true, implies that no actual speaker can effectively use virtually any language to successfully communicate information. This result is based on complexity issues that relate to our finite computational ability to deal with large bodies of information, together with a strong, but well motivated, assumption about the cognitive accessibility of the meanings of sentences to which ersatzers seem to be implicitly committed. It follows that linguistic ersatzism, possible world semantics, or both must be rejected.
Experimental philosophy is one of the most controversial and potentially revolutionary areas of philosophical research today. X-phi, as it is known by many of its practitioners, questions many basic concepts regarding human intuitions, concepts which have guided centuries of modern philosophers. In their place, x-phi steers philosophical research back to scientific investigations in order to better understand human intuitions, using research techniques borrowed from current research in psychology and neuroscience. While scholars debate whether experimental philosophy signals a sea change or is merely a faddish detour, no existing book looks at the x-phi movement in reference to its methodology. In _The Experimental Turn and the Methods of Philosophy_, Michael J. Shaffer addresses this need, suggesting that the significance of experimental philosophy can best be assessed and understood in methodological terms. By comparing and contrasting traditional views of philosophical methodology with those of experimental philosophy, Shaffer traces the roots of the movement to Quinean naturalism and also demonstrates the deep, revolutionary significance of the experimental turn.
Theorists in various scientific disciplines offer radically different accounts of the origin of violent behavior in humans, but it is not clear how the study of violence is to be scientifically grounded. This problem is made more complicated because both what sorts of acts constitute violence and what needs to be appealed to in explaining violence differ among social scientists, biologists, anthropologists and neurophysiologists, and this generates serious problems with respect to even attempting to ascertain the differential bona fides of these various explanatory programs. As a consequence, there is little theoretical reason to suspect that efforts to prevent violence will have any appreciable effect. In this paper we investigate the general issue of whether any of the general theoretical approaches to violent behavior can reasonably be taken to be the best approach to the explanation of seriously violent behavior. Our more specific aim is to examine the controversial explanation of violent behavior offered by Lonnie Athens in order to ascertain whether it can be seriously considered to be the best explanation of violent behavior.