The one world, one science argument (so named by Rescher) is advanced by Carl Sagan and others to support the thesis that we will be able to learn to converse with intelligent extraterrestrials if and when we encounter them. The prima facie obstacle to extraterrestrial communication is that the aliens' culture and geography are bound to be so different from ours that we would find it extremely difficult, if not practically impossible, to find a common topic on which we can both converse. Sagan's rebuttal is that we will share mathematics and the laws of physics, these being the same for all intelligent beings regardless of local cultural and geographical variations. I show that this argument fails even if its contentious assumptions about science and the world are granted—that is to say, it fails on uncontentious grounds. Contents: 1 OWOS; 2 OWOS and Social Constructivism; 3 OWOS and Conceptual Relativism; 4 OWOS and the Selection Problem; 5 The Fundamental Laws Solution; 6 The Mathematics Solution; 7 The Radio Solution; 8 The Common Conditions Solution; 9 The Intractability of the Selection Problem; 10 The Superfluity of OWOS.
Are there truths, states of affairs or knowledge that cannot be put into words? This book distinguishes several varieties and grades of ineffability and tries to ascertain whether they are coherent notions and whether they actually obtain. It is concluded that the weaker grades of ineffability do obtain, and that even the strongest grades have not been shown to be incoherent.
The problem of scientific disregard is the problem of accounting for why some putative theories that appear to be well-supported by empirical evidence nevertheless play no role in the scientific enterprise. Laudan and Leplin suggest (and Hoefer and Rosenberg concur) that at least some of these putative theories fail to be genuine theoretical rivals because they lack some non-empirical property of theoreticity. This solution also supports their repudiation of the thesis of underdetermination. I argue that the attempt to provide criteria of theoreticity fails, that there is a Bayesian solution to the problem of scientific disregard that fares better, and that this successful solution supports a distinctively Bayesian version of the underdetermination thesis.
Social constructivists maintain that we invent the properties of the world rather than discover them. Is reality constructed by our own activity? Or, more provocatively, are scientific facts--is everything--constructed? Social Constructivism and the Philosophy of Science is a clear assessment of this critical and increasingly important debate. Andre Kukla presents a comprehensive discussion of the philosophical issues involved and analyzes the strengths and weaknesses of a range of constructivist arguments, illustrating the divide between the sociology and the philosophy of science through examples as varied as laboratory science, time, and criminality. He argues that current philosophical objections to constructivism are drastically inconclusive, while offering and developing new objections. Throughout, Kukla distinguishes between the social causes of scientific beliefs and the view that all ascertainable facts are constructed.
This book offers a superbly clear analysis of the standard arguments for and against scientific realism. In surveying claims on both sides of the debate, Kukla organizes them in ways that expose unnoticed connections. He identifies broad patterns of error, reconciles seemingly incompatible positions, and discovers unoccupied positions with the potential to influence further debate. Kukla's overall assessment is that neither the realists nor the antirealists may claim a decisive victory.
Scientific realists have argued that the truth(likeness) of our theories provides the only explanation for the success of science. I consider alternative explanations proposed by antirealists. I endorse Leplin's contention that neither van Fraassen's Darwinist explanation nor Laudan's methodological explanation provides the sort of explanatory alternative which is called for in this debate. Fine's suggestion--that the empirical adequacy of our theories already explains their success--is more promising for antirealists. Leplin claims that this putative explanation collapses into realism on one reading and into vacuity on another reading. But his analysis conflates three doctrines into two, and one of the three avoids both realism and vacuity.
The instrumentalist argument from the underdetermination of theories by data runs as follows: (1) every theory has empirically equivalent rivals; (2) the only warrant for believing one theory over another is its possession of a greater measure of empirical virtue; (3) therefore belief in any theory is arbitrary. In this paper, I examine the status of the first premise. Several arguments against the universal availability of empirically equivalent theoretical rivals are criticized, and four algorithms for producing empirically equivalent rivals are defended. I conclude that the case for the first premise of the argument from underdetermination is very strong. The disposition of the argument itself depends on the fate of the second premise.
McGinn claims that (1) there is nothing “inherently mysterious” about consciousness, even though (2) we will never be able to understand it. The first claim is no more than a rhetorical flourish. The second may be read either as a claim (a) that we are unable to construct an explanatory theory of consciousness, or (b) that any such theory must strike us as unintelligible, in the sense in which quantum mechanics is sometimes said to be unintelligible. On the first reading, McGinn's argument is based on a false premiss (the “homogeneity constraint”). On the second reading, it suffers from the shortcoming that the central notion of intelligibility is too obscure to permit any definite conclusion. I close with a brief discussion of the contemporary tendency to reject non-physicalist approaches to consciousness on a priori grounds.
Forster and Sober present a solution to the curve-fitting problem based on Akaike's Theorem. Their analysis shows that the curve with the best epistemic credentials need not always be the curve that most closely fits the data. However, their solution does not, without further argument, avoid the two difficulties that are traditionally associated with the curve-fitting problem: (1) that there are infinitely many equally good candidate-curves relative to any given set of data, and (2) that these best candidates include curves with indefinitely many bumps.
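The tradeoff at the heart of Akaike-style model selection can be sketched in a few lines of code. This is my own toy illustration, not Forster and Sober's example, and the residual sums of squares are hypothetical: a model that fits the data slightly less closely can still earn the better score once the penalty for extra adjustable parameters is charged.

```python
import math

def aic(rss, k, n):
    # Gaussian-likelihood form of Akaike's criterion:
    # n * ln(RSS/n) + 2k; lower scores are better
    return n * math.log(rss / n) + 2 * k

n = 20          # number of data points (hypothetical)
rss_line = 2.0  # residual sum of squares for a 2-parameter line (hypothetical)
rss_quad = 1.9  # slightly closer fit from a 3-parameter quadratic (hypothetical)

# The quadratic fits more closely, but its extra parameter costs more
# than the small improvement in fit is worth:
print(aic(rss_line, 2, n))  # ≈ -42.05
print(aic(rss_quad, 3, n))  # ≈ -41.08
```

On these numbers the two-parameter line wins despite the larger residuals, which is the sense in which the best-credentialed curve need not be the closest-fitting one.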
What should we do when we discover that our assessment of probabilities is incoherent? I explore the hypothesis that there is a logic of incoherence—a set of universally valid rules that specify how incoherent probability assessments are to be repaired. I examine a pair of candidate rules of incoherence logic that have been employed in philosophical reconstructions of scientific arguments. Despite their intuitive plausibility, both rules turn out to be invalid. There are presently no viable candidate rules for an incoherence logic on the table. Other ways of dealing with incoherence are surveyed, and found either to be unsatisfactory or to rely on a logic of incoherence in the end. The resolution of these antagonistic conclusions is left to future researchers.
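What an incoherent assignment looks like, and what a candidate repair rule would have to do, can be made concrete with a toy sketch. This is my own illustration, not one of the rules examined in the paper: a coherence check over a partition of possibilities, together with one naive candidate repair rule, proportional rescaling.

```python
def is_coherent(partition_probs, tol=1e-9):
    # An assignment over a partition is coherent only if each value
    # lies in [0, 1] and the values sum to 1
    return (all(0 <= p <= 1 for p in partition_probs)
            and abs(sum(partition_probs) - 1) < tol)

def rescale_repair(partition_probs):
    # One naive candidate repair rule: renormalize proportionally,
    # preserving the ratios among the original judgments
    total = sum(partition_probs)
    return [p / total for p in partition_probs]

beliefs = [0.6, 0.5, 0.2]   # incoherent: the probabilities sum to 1.3
print(is_coherent(beliefs))                  # False
print(is_coherent(rescale_repair(beliefs)))  # True
```

Whether a rule like `rescale_repair` is universally valid, rather than merely one repair among many, is exactly the question at issue.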
It has been claimed that a great deal of AI research is an attempt to discover the empirical laws describing a new type of entity in the world—the artificial computing system. I call this enterprise 'medium AI', since it is in some respects stronger than Searle's 'weak AI', and in other respects weaker than 'strong AI'. Bruce Buchanan, among others, conceives of medium AI as an empirical science entirely on a par with psychology or chemistry. I argue that medium AI is not an empirical science at all. Depending on how artificial computing systems are categorized, it is either an a priori science like mathematics, or a branch of engineering.
The antirealist argument from the underdetermination of theories by data relies on the premise that the empirical content of a theory is the only determinant of its belief-worthiness (premise NN). Several authors have claimed that the antirealist cannot endorse NN, on pain of internal inconsistency. I concede this point. Nevertheless, this refutation of the underdetermination argument fails because there are weaker substitutes for NN that will serve just as well as a premise to the argument. On the other hand, antirealists have not made a convincing case for NN (or its weaker substitutes) either. In particular, I criticize van Fraassen's recent claim that all ampliative rules in epistemology must be rejected on the grounds that they lead to incoherence. The status of the underdetermination argument remains unsettled.
Both sides in the debate about scientific realism have argued that their view provides a better account of actual scientific practice. For example, it has been claimed that the practice of theory conjunction presupposes realism, and that scientists' use of multiple and incompatible models presupposes some form of instrumentalism. Assuming that the practices of science are rational, these conclusions cannot both be right. I argue that neither of them is right, and that, in fact, all scientific practices are compatible with both realism and instrumentalism. I also repudiate van Fraassen's argument to the effect that the instrumentalist account of scientific practice is logically weaker, hence better, than the realist account. In the end, there are no scientific practice arguments on the table that support either side of the debate. It is also noted that the deficiencies of van Fraassen's argument are recapitulated in Putnam's miracle argument for realism. My pessimistic assessment of the state of the debate is reminiscent of Arthur Fine's. However, Fine's argument for the ‘natural ontological attitude’ once again repeats the problems of van Fraassen's and Putnam's arguments.
Fodor defines epistemic boundedness as a condition wherein there are epistemically significant constraints on the beliefs that a mind is capable of entertaining. He discusses a type of (epistemic) boundedness wherein a hypothesis cannot be entertained because it is inexpressible in terms of the mind's stock of concepts. In addition to this semantic boundedness, I describe a number of different sources of boundedness having to do with syntactic, abductive, and implementational limitations. I also discuss the similarities and differences between individual and social limitations on our epistemic possibilities.
It is widely recognized that computational theories of learning must posit the existence of a priori constraints on hypothesis selection. The present article surveys the theoretical options available for modelling the dynamic process whereby the constraints have their effect. According to the 'simplicity' theory (exemplified by Fodor's treatment), hypotheses are preference-ordered in terms of their syntactic or semantic properties. It is argued that the same explanatory power can be obtained with a weaker (hence better) theory, the 'minimalist' theory, which dispenses with the preference ordering. According to the 'finitistic' theory, the learner is capable of generating only finitely many hypotheses for evaluation. Chomsky maintains that the occurrence of errorless learning in language acquisition necessitates a finitistic explanation. Once again, there is a weaker theory that explains the same data. Finally, Goodman's argument to the effect that there cannot be a computational theory of learning is examined and rejected.
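The 'finitistic' picture can be caricatured in a few lines of code. This is a hypothetical illustration of my own, not drawn from the article: a learner whose a priori constraint just is its finite stock of candidate hypotheses, which it winnows against the evidence.

```python
# Toy 'finitistic' learner: only finitely many candidate hypotheses are
# available, here represented as simple predicates over integers.
# The hypothesis stock itself is hypothetical, chosen for illustration.
HYPOTHESES = {
    "even":     lambda x: x % 2 == 0,
    "positive": lambda x: x > 0,
    "square":   lambda x: int(x ** 0.5) ** 2 == x,
}

def learn(examples):
    # Return the names of the hypotheses consistent with every
    # labelled example; the a priori constraint is the finite stock.
    return [name for name, h in HYPOTHESES.items()
            if all(h(x) == label for x, label in examples)]

data = [(4, True), (9, False), (16, True)]
print(learn(data))  # → ['even']
```

With a finite stock, every course of evidence leaves the learner with a definite (possibly empty) set of survivors, which is what makes errorless convergence at least formally available on this picture.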
According to a certain type of instrumentalist, we may have good reasons for accepting scientific theories, but never for believing more than their empirical consequences. Horwich (1991) considers several attempts to capture a difference between acceptance and belief, and claims that none of them succeed. He concludes that instrumentalism has not been shown to be a coherent position. However, in the course of his discussion, Horwich himself deploys a conceptual apparatus which is sufficient for formulating the instrumentalist doctrine in a coherent manner. The worst accusation that can be laid against instrumentalists is that they have violated common linguistic usage.
Rationality demands at least that we eliminate incoherencies among our beliefs when we are apprised of them. This minimal requirement gives us no grounds for condemning a refusal to look for incoherencies, or indeed to deliberate altogether. Several stronger conditions on rationality are explored and rejected. There are presently no good arguments against logical sloth.
The taxonomy of scientific problems constructed by Laudan is not exhaustive of all types of scientific work. For one thing, it does not take into account projects which produce an increase of theoretical virtue in a theory that does not suffer from conceptual problems. It is argued that any work which alters the amount of theoretical virtue possessed by a theory constitutes a scientific advance. A new taxonomy is proposed which distinguishes scientific contributions on the basis of which theoretical virtue is altered, whether the alteration produces an increase or a decrease in virtue, and whether the alteration is due to a logical invention, a logical discovery, or an empirical discovery.