in J. Brzeziński, A. Klawiter, T.A.F. Kuipers, K. Łastowski, K. Paprzycka, P. Przybysz (eds.), The Courage of Doing Philosophy: Essays Dedicated to Leszek Nowak, pp. 59-115, Amsterdam/New York, NY: Rodopi, forthcoming 2007.
Sellars was one of the few systematic philosophers in the analytic tradition, but he never published a magnum opus. Though his profound and complex philosophical endeavours were all tied together into a many-dimensional worldview, the dimensions of this worldview were built bit by bit throughout his philosophical career. His papers, collections of essays, public lectures and lecture notes deal with almost every philosophical issue. One can easily see in them a mega-intellect—a genius—thinking deeply and carefully about hard philosophical problems, taking the reader by the hand, as it were, and unveiling to her not only the complexity of the problems under discussion but also the hard-won truth that in philosophy there is no black and white: new insights can be gained and fresh light can be cast on old problems only by utilising the best thoughts of, and striking a balance between, competing philosophical traditions and thinkers.
idea of a mechanical balance, described the volume of exchange of various aggregated commodities, weighted by their price, balanced against the quantity of money in the economy, weighted by the money's rate of circulation. Another family of models addressed issues about the gold standard and bimetallism by thinking of quantities of gold and silver as liquids in different connected reservoirs representing, alternatively, bullion and minted coin, and the way the liquids/metal/currency in one reservoir will flow into others if the level in one becomes higher than in another. Morgan sets out the ways in which Fisher developed these models in response to both theoretical and practical issues of the day. In the process we see how the activity of building models can address relations which are very imperfectly understood, revealing previously unappreciated causal interconnections. For example, Gresham's law is revealed as just one facet of a much more complex network of interconnected variables, and the models help to make clear the conditions under which this law does and does not apply. Also illustrated are ways in which such models can be illuminating about the underlying mechanisms even though the models in question involve extreme idealizations and are extremely limited in practical application because they require parameters which cannot be independently measured. In this respect these examples provide cases in which models would appear to facilitate theory development and articulation more than mediation between pre-existing theory and the world.
Aristotle (384-322 BC) drew a sharp distinction between understanding the fact and understanding the reason why (dioti; aitia). Though both types of understanding proceed via deductive syllogism, only the latter is characteristic of science because only the latter is tied to the knowledge of causes. In his Posterior Analytics, Aristotle contrasted the following two instances of deductive syllogism.
There has been no shortage of such conceptual analyses and no shortage of counterexamples to all of them. The counterexamples exploit, at least partly, situations in which we are presumed to have clear intuitions about what causes what, intuitions which are not respected by the suggested philosophical analysis. The counterexamples typically lead to a battery of sophisticated attempts to revise or amend the philosophical analysis so that it is saved from refutation. These attempts, typically, either deny the intuitions on which the counterexamples are based or accommodate the problematic cases within the theory by adding further clauses to the original philosophical analysis. The result of all this is that where the original philosophical theory rested on a simple, forceful and intuitively plausible idea (e.g., that causation consists in a relation of counterfactual dependence between discrete events), the modified philosophical theory becomes very convoluted, somewhat ad hoc and implausible.
In recent years philosophy of science has seen a resurgence of interest in metaphysical issues, especially those concerning laws, causation, and explanation. Although this book takes only the latter two words for its title, it is also about laws of nature. It is divided into three sections: the first is on causation, the second is on laws, and the third is on explanation: this is entirely appropriate because the debates about them are closely related. Ever since Hume argued that causation is nothing more than regularities, laws have been more respectable than causes in philosophy. Perhaps this is also because science is replete with specially named laws which seem to play a central role in theories and explanations. Yet, as many philosophers have recently pointed out, contrary to Russell's famous pronouncement that causation is a relic of a bygone age (quoted by Psillos, p. 3), the contemporary special sciences are very much concerned with the identification and investigation of all manner of causal structures. This raises the question of whether the apparent causal powers attributed to kinds in the special sciences are anything over and above a way of talking about the result of the operations of physical laws governing their microconstituents. Hence the logical empiricist's project of showing how the laws of the special sciences reduce to those of physics. On their view, explanation, and in particular causal explanation, is nothing more than argument using the laws of nature as premises. However, this covering-law model of explanation has been subjected to intense criticism, and there have been attempts to construct alternatives that rely on the idea that to explain an event is to cite its real cause, where this cause need not be subsumed under any law.
Since the demise of logical empiricism, or at least the waning of its influence, there has been a proliferation of theories about laws, causation and explanation, many of which differ radically from one another.
Niiniluoto (2003) has offered an incisive and comprehensive review of the recent debates about abduction. There is little on which I disagree with him. So, in this commentary, I shall try to cast some doubt on the attempts to render Inference to the Best Explanation (IBE) within a Bayesian framework.
Philosophy of science emerged as a distinctive part of philosophy in the twentieth century. It set its own agenda, the systematic study of the metaphysical and epistemological foundations of science, and acquired its own professional structure, departments and journals. Its defining moment was the meeting (and the clash) of two courses of events: the breakdown of the Kantian philosophical tradition and the crisis in the sciences and mathematics in the beginning of the century. The emergence of the new Frege-Russell logic, the arithmetisation of geometry and the collapse of classical mechanics called into question the neat Kantian scheme of synthetic a priori principles. But the thought that some a priori (framework) principles should be in place in order for science to be possible still had a strong grip on the thinkers of the European continent. A heated intellectual debate started concerning the status of these a priori principles. The view that dominated the scene after the dust had settled was that the required framework principles were conventions. The seed of this thought was found in Poincaré's writings, but in the hands of the Logical Positivists, it was fertilised with Frege's conception of analyticity and Hilbert's conception of implicit definitions. The consolidation of modern physics lent credence to the view that a priori principles can be revised; hence, a new conception of relativised a priori emerged. The linguistic turn in philosophy re-oriented the subject-matter of philosophical thinking about science to the language of science. Formal-logical methods and conceptual analysis were taken to be the privileged philosophical tools. Not only, it was thought, do they clarify and perhaps solve (or dissolve) traditional philosophical problems; they also make philosophy rigorous and set it apart from empirical science.
In the 1930s, philosophy of science became the logic of science; it became synonymous with anti-psychologism, anti-historicism and anti-naturalism.
Philosophy of science emerged as a distinctive part of philosophy in the twentieth century. Its defining moment was the meeting (and the clash) of two courses of events: the breakdown of the Kantian philosophical tradition and the crisis in the sciences and mathematics in the beginning of the century. But what we now call philosophy of science has a rich intellectual history that goes back to the ancient Greeks. It is intimately connected with the efforts made by many thinkers to come to terms with the distinctive kind of knowledge (episteme; scientia) that science offers. Though science proper was distinguished from natural philosophy only in the nineteenth century, the philosophy of natural philosophy has had much the same agenda as current philosophy of science.
Two things seem to make science different from other human activities: the existence of a special method and the claim that this method produces objective knowledge of the world. Yet, as Barry Gower's impressive book shows, after centuries of philosophical reflection on scientific method, there is considerable disagreement as to what exactly this method is. What is more interesting is that all attempts to characterise scientific method, from Galileo and Descartes up until the present, suffer from an internal tension: whatever the method of science may be in its details, it should satisfy two general desiderata which, at least prima facie, pull in contrary directions. On the one hand, it should be ampliative: it should be able to move from the finite data and observations available at any given time to hypotheses and theories which go far beyond these data, either by generalising them over unexamined (or even unexaminable) domains or by introducing unobserved and unobservable causes which bring the phenomena about. This 'content-increasing' aspect of scientific method is indispensable, if science is seen as an activity which purports to extend our knowledge beyond what is immediately observed by means of the senses. On the other hand, the method of science should be epistemically probative: it should be able to convey epistemic warrant to its conclusions (hypotheses and theories). Otherwise, its claim to extending our knowledge of the world beyond what is actually observed is dubious. The tension arises because ampliative methods do not wear their epistemically probative character on their sleeves. Since the conclusion of an ampliative method can be false, even though all of its premises are true, the following question arises: what makes it the case that the method conveys whatever epistemic warrant the premises enjoy to the intended conclusion rather than to its negation?
Mumford presents the friends of laws with a Central Dilemma, either horn of which is supposed to be utterly unpalatable. The thrust of the dilemma is this: laws are either external or internal to their instances. If they are external, they cannot govern (or determine) their instances. If they are internal, they cannot govern (or determine) their instances. Ergo, laws cannot govern (or determine) their instances. The role of this dilemma is central to Mumford's argument against laws: they are supposed to have no credible role to play. The dilemma rests on the premise that laws, if they exist, must do something: they must play a governing role. Of course, it is one thing to say that laws play a governing role and it is quite another to say that laws must play some role. Mumford (§9.4) agonises a lot about this, but his considered view is that laws must play an x-role in virtue of which they make a difference in the determination of the world's history. As Mumford is fully aware, the supposed 'governing role' of laws might be just a metaphor. Still, he thinks his Central Dilemma is powerful against any x-role that laws are supposed to play. We shall see later that this is not so. For the time being, let us play along. The Central Dilemma is faulty, anyway.
The Semantic Thesis: Scientific theories should be taken at face value. They are truth-conditioned descriptions of their intended domain, both observable and unobservable. Hence, they are capable of being true or false. The theoretical terms featuring in theories have putative factual reference.
Underdetermination is a relation between evidence and theory. More accurately, it is a relation between the propositions that express the (relevant) evidence and the propositions that constitute the theory. Evidence is said to underdetermine theory. This may mean two things. First, the evidence cannot prove the truth of the theory. Second, the evidence cannot render the theory probable. Let's call the first deductive underdetermination, and the second inductive (or ampliative) underdetermination. Both kinds of claim are supposed to have a certain epistemic implication, viz., that belief in theory is never warranted by the evidence. This is the underdetermination thesis.
When we philosophers think about causation we are primarily interested in what causation is—what exactly is the relation between cause and effect? Or, more or less equivalently, how and in virtue of what is the cause connected to the effect? But we are also interested in an epistemic issue, viz., the possibility of causal knowledge: how, if at all, can causal knowledge be obtained? The two issues are, of course, conceptually distinct—but to many thinkers, there is a connection between them. A metaphysical account of causation would be useless if it did not make, at least in principle, causal knowledge possible. Conversely, many philosophers, mostly of an empiricist persuasion, have taken the possibility of causal knowledge to act as a constraint on the metaphysics of causation: no feature that cannot in principle become the object of knowledge can be attributed to causation.
This paper formulates what I think is the basic problem of any attempt to characterise the abstract structure of scientific method, viz., that it has to satisfy two conflicting desiderata: it should be ampliative (content-increasing) and it should confer epistemic warrant on its outcomes. Then, after two extreme solutions to the problem of method, viz., Enumerative Induction and the Method of Hypothesis, are examined, the paper argues that abduction, suitably understood as Inference to the Best Explanation, offers the best description of scientific method and solves the foregoing problem in the best way: it strikes the best balance between ampliation and epistemic warrant.
Rudolf Carnap delivered the hitherto unpublished lecture 'Theoretical Concepts in Science' at the meeting of the American Philosophical Association, Pacific Division, at Santa Barbara, California, on 29 December 1959. It was part of a symposium on 'Carnap's views on Theoretical Concepts in Science'. In the bibliography that appears at the end of the volume 'The Philosophy of Rudolf Carnap', edited by Paul Arthur Schilpp, a revised version of this address appears to be among Carnap's forthcoming papers. But although Carnap started to revise it, he never finished the revision,1 and never published the unrevised transcript. Perhaps this is because variants of the approach to theoretical concepts presented for the first time in the Santa Barbara lecture have appeared in other papers of his (cf. the editorial footnotes in Carnap's lecture). Still, I think, the Santa Barbara address is a little philosophical gem that needs to see the light of day. The document that follows is the unrevised transcript of Carnap's lecture.2 Its style, then, is that of an oral presentation. I decided to leave it as it is, making only very minor stylistic changes—which, except those related to punctuation, are indicated by curly brackets.3 I think that reading this lecture is a rewarding experience, punctuated as the lecture is with odd remarks and autobiographical points. One can almost envisage…
In this survey article I try to appraise the present state of the scientific realism debate with an eye to important but hitherto unexplored suggestions and open issues that need further work. In section 2, I shall mostly focus on the relation between scientific realism and truth. In section 3, I shall discuss the grounds for the realists’ epistemic optimism.
The present essay aims to show how our thinking about explanation has evolved and where it stands now. Its first part presents how some major thinkers, from Aristotle, through to Descartes, Leibniz, Newton, Hume and Kant, to Mill, conceived of explanation. The second part offers a systematic examination of the most significant and controversial contemporary models of explanation. The first part starts with Aristotle's conception—the thought that explanation consists in finding out why something happened and that answering why-questions requires finding causes—which set the agenda for almost all subsequent thinking about explanation. It discusses the links between laws of nature, causation and explanation in the thought of the early modern philosophers and culminates with John Stuart Mill's first well-worked-out model of scientific explanation, which was based on the idea that there is no necessity in nature and that, ultimately, explanation amounts to unification into a comprehensive deductive system, whose axioms capture the fundamental laws of nature. The second part starts with the Logical Empiricists' attempt to legitimise the concept of causation by subsuming it under the concept of a deductive-nomological argument. It moves on to discuss the reappearance of genuinely causal models of explanation as well as the reappearance and development of the Millian idea that explanation amounts to unification. It ends with the examination of teleological approaches to explanation.
Alan Musgrave has been one of the most important philosophers of science in the last quarter of the 20th century. He has exemplified an exceptional combination of clear-headed and profound philosophical thinking. Two commitments seem to be the pillars of his thought: an uncompromising commitment to scientific realism and an equally uncompromising commitment to deductivism. The essays reprinted in this volume (which span a period of 25 years, from 1974 to 1999) testify to these two commitments. (There are two omissions from this collection: "Realism, Truth and Objectivity" in Realism and Anti-realism in the Philosophy of Science (1996, Kluwer) and "How to Do without Inductive Logic" (Science & Education vol. 8, 1999). I will make some references to these papers in what follows.) In the present review, instead of giving an orderly summary of the 16 papers of Essays, I discuss Musgrave's two major commitments and raise some worries about their combination.