How do rational minds make contact with the world? The empiricist tradition sees a gap between mind and world, and takes sensory experience, fallible as it is, to provide our only bridge across that gap. In its crudest form, for example, the traditional idea is that our minds consult an inner realm of sensory experience, which provides us with evidence about the nature of external reality. Notoriously, however, it turns out to be far from clear that there is any viable conception of experience which allows it to do the job. The original problem is to show that thought is rationally constrained by external reality. If sensory experience is to provide the solution--in particular, if it is to provide the answer to sceptical challenges--it must therefore meet two criteria. First, it must itself be `receptive'--i.e., appropriately constrained by external reality. Second, it must be the kind of thing that can enter into a logical or rational relationship with belief--it must already be `conceptual,' in other words. In arguing against the idea that anything could serve both roles, Wilfrid Sellars termed this conception of experience "the Myth of the Given".
This volume brings together fourteen major essays by one of contemporary philosophy's most challenging thinkers. Huw Price links themes from Quine, Carnap, Wittgenstein and Rorty to craft a powerful critique of contemporary naturalistic metaphysics. He offers a new positive program for philosophy, cast from a pragmatist mould.
Why is the future so different from the past? Why does the past affect the future and not the other way round? The universe began with the Big Bang - will it end with a `Big Crunch'? Now in paperback, this book presents an innovative and controversial view of time and contemporary physics. Price urges physicists, philosophers, and anyone who has ever pondered the paradoxes of time to look at the world from a fresh perspective, and throws fascinating new light on some of the great mysteries of the universe.
Pragmatists have traditionally been enemies of representationalism but friends of naturalism, when naturalism is understood to pertain to human subjects, in the sense of Hume and Nietzsche. In this volume Huw Price presents his distinctive version of this traditional combination, as delivered in his René Descartes Lectures at Tilburg University in 2008. Price contrasts his view with other contemporary forms of philosophical naturalism, comparing it with other pragmatist and neo-pragmatist views such as those of Robert Brandom and Simon Blackburn. Linking their different 'expressivist' programmes, Price argues for a radical global expressivism that combines key elements from both. With Paul Horwich and Michael Williams, Brandom and Blackburn respond to Price in new essays. Price replies in the closing essay, emphasising links between his views and those of Wilfrid Sellars. The volume will be of great interest to advanced students of philosophy of language and metaphysics.
In this paper we defend the view that the ordinary notions of cause and effect have a direct and essential connection with our ability to intervene in the world as agents. This is a well-known but rather unpopular philosophical approach to causation, often called the manipulability theory. In the interests of brevity and accuracy, we prefer to call it the agency theory. Thus the central thesis of an agency account of causation is something like this: an event A is a cause of a distinct event B just in case bringing about the occurrence of A would be an effective means by which a free agent could bring about the occurrence of B. In our view the unpopularity of the agency approach to causation may be traced to two factors. The first is a failure to appreciate certain distinctive advantages that this approach has over its various rivals. We have drawn attention to some of these advantages elsewhere, and we summarize them below. However, the second and more important factor is the influence of a number of stock objections, objections that seem to have persuaded many philosophers that agency accounts face insuperable obstacles. In this paper we want to show that these objections have been vastly overrated. There are four main objections.
The difference between cause and effect seems obvious and crucial in ordinary life, yet missing from modern physics. Almost a century ago, Bertrand Russell called the law of causality 'a relic of a bygone age'. In this important collection 13 leading scholars revisit Russell's revolutionary conclusion, discussing one of the most significant and puzzling issues in contemporary thought.
I distinguish three views, a defence of any one of which would go some way towards vindicating the view that there is something objective about the passage of time: the view that the present moment is objectively distinguished; the view that time has an objective direction – that it is an objective matter which of two non-simultaneous events is the earlier and which the later; the view that there is something objectively dynamic, flux-like, or "flow-like" about time. I argue that each of these views is not so much false as doubtfully coherent. In each case, it turns out to be hard to make sense of what the view could be, at least if it is to be non-trivial, and of use to a friend of objective passage. I conclude with some remarks about avenues that seem worth exploring in the philosophy of time, when we are done with trying to make sense of passage.
In a recent paper, Richard Rorty begins by telling us why pragmatists such as himself are inclined to identify truth with justification: ‘Pragmatists think that if something makes no difference to practice, it should make no difference to philosophy. This conviction makes them suspicious of the distinction between justification and truth, for that distinction makes no difference to my decisions about what to do.’ (1995, p. 19) Rorty goes on to discuss the claim, defended by Crispin Wright, that truth is a normative constraint on assertion. He argues that this claim runs foul of this principle of no difference without a practical difference: ‘The need to justify our beliefs to ourselves and our fellow agents subjects us to norms, and obedience to these norms produces a behavioural pattern that we must detect in others before confidently attributing beliefs to them. But there seems to be no occasion to look for obedience to an additional norm – the commandment to seek the truth. For – to return to the pragmatist doubt with which I began – obedience to that commandment will produce no behaviour not produced by the need to offer justification.’ (1995, p. 26) Again, then, Rorty appeals to the claim that a commitment to a norm of truth rather than a norm of justification makes no behavioural difference. This is an empirical claim, testable in principle by comparing the behaviour of a community of realists (in Rorty’s sense) to that of a community of pragmatists. In my view, the experiment would show that the claim is unjustified, indeed false. I think that there is an important and widespread behavioural pattern that depends on the fact that speakers do take themselves to be subject to such an additional norm.
Concepts employed in folk descriptions of the world often turn out to be more perspectival than they seem at first sight, involving previously unrecognised sensitivity to the viewpoint or 'situation' of the user of the concept in question. Often, it is progress in science that reveals such perspectivity, and the deciding factor is that we realise that other creatures would apply the same concepts with different extension, in virtue of differences between their circumstances and ours. In this paper I argue that causal concepts are perspectival in this way, and describe the 'situation' on which they depend in terms of an abstract characterisation of the viewpoint of a deliberating agent. I argue that this approach makes better sense than its rivals do of the apparent asymmetry and temporal orientation of the causal relation.
In his influential book 'Making Things Happen' and in other places, Jim Woodward has noted some affinities between his own account of causation and that of Menzies and Price, but argued that the latter view is implausibly ‘subjective’. In this piece I discuss Woodward’s criticisms. I argue that the Menzies and Price view is not as different from Woodward’s own account as he believes, and that in so far as it is different, it has some advantages whose importance Woodward misses; but also that the Menzies and Price view lacks some elements whose importance Woodward rightly stresses. When properly characterized, however, the ‘subjectivity’ survives unscathed.
Probabilistic accounts of causality have long had trouble with ‘spurious’ evidential correlations. Such correlations are also central to the case for causal decision theory—the argument that evidential decision theory is inadequate to cope with certain sorts of decision problem. However, there are now several strong defences of the evidential theory. Here I present what I regard as the best defence, and apply it to the probabilistic approach to causality. I argue that provided a probabilistic theory appeals to the notions of agency and effective strategy, it can avoid the problem of spurious causes. I show that such an appeal has other advantages; and argue that it is not illegitimate, even for a causal realist.
In “A Subjectivist’s Guide to Objective Chance,” David Lewis says that he is “led to wonder whether anyone but a subjectivist is in a position to understand objective chance.” The present essay aims to motivate this same Lewisean attitude, and a similar degree of modest subjectivism, with respect to objective causation. The essay begins with Newcomb problems, which turn on an apparent tension between two principles of choice: roughly, a principle sensitive to the causal features of the relevant situation, and a principle sensitive only to evidential factors. Two-boxers give priority to causal beliefs, and one-boxers to evidential beliefs. The essay notes that a similar issue can arise when the modality in question is chance, rather than causation. In this case, the conflict is between decision rules based on credences guided solely by chances, and rules based on credences guided by other sorts of probabilistic evidence. Far from excluding cases of the latter kind, Lewis’s Principal Principle explicitly allows for them, in the form of the caveat that credences should follow beliefs about chances only in the absence of “inadmissible evidence.” The essay then exhibits a tension in Lewis’s views on these two matters, by presenting a class of decision problems—some of them themselves Newcomb problems—in which Lewis’s view of the relevance of inadmissible evidence seems in tension with his causal decision theory. It offers a diagnosis for this dilemma and proposes a remedy, based on an extension of a proposal due to Ned Hall and others from the case of chance to that of causation. The remedy suggests a new view of the relation between causal decision theory and evidential decision theory, namely, that they stand to each other much as chance stands to credence, being objective and subjective faces of the same practical coin.
This has much the same metaphysical benefits as Lewis’s own view of chance and also throws interesting new light on Newcomb problems, providing an irenic resolution of the apparent disagreement between causal and evidential decision rules.
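By way of illustration (a toy sketch, not taken from the essay itself; the predictor-accuracy figure of 0.99 and the payoff values are the conventional illustrative assumptions), the standard Newcomb payoffs show how evidential and causal expected utilities come apart:

```python
# Newcomb's problem: an opaque box contains $1,000,000 iff a highly
# reliable predictor foresaw one-boxing; a transparent box always
# contains $1,000.  Accuracy value 0.99 is an illustrative assumption.
ACC = 0.99
M, K = 1_000_000, 1_000

def evidential_eu(action):
    # EDT: condition on the evidence the action provides about the prediction.
    p_million = ACC if action == "one-box" else 1 - ACC
    return M * p_million + (K if action == "two-box" else 0)

def causal_eu(action, p_million_already_there):
    # CDT: the action cannot causally influence the already-fixed contents,
    # so the probability of the million is held constant across actions.
    return M * p_million_already_there + (K if action == "two-box" else 0)

# EDT recommends one-boxing...
assert evidential_eu("one-box") > evidential_eu("two-box")
# ...while CDT recommends two-boxing whatever the fixed contents are.
for p in (0.0, 0.5, 1.0):
    assert causal_eu("two-box", p) > causal_eu("one-box", p)
```

The essay's suggestion that the two theories are "objective and subjective faces of the same practical coin" concerns precisely the status of the probabilities fed into these two calculations.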
One of the most striking features of causation is that causes typically precede their effects – the causal arrow is strongly aligned with the temporal arrow. Why should this be so? We offer an opinionated guide to this problem, and to the solutions currently on offer. We conclude that the most promising strategy is to begin with the de facto asymmetry of human deliberation, characterised in epistemic terms, and to build out from there. More than any rival, this subjectivist approach promises to demystify the asymmetry, temporal orientation, and deliberative relevance of causal judgements.
Many areas of philosophy employ a distinction between factual and non-factual (descriptive/non-descriptive, cognitive/non-cognitive, etc) uses of language. This book examines the various ways in which this distinction is normally drawn, argues that all are unsatisfactory, and suggests that the search for a sharp distinction is misconceived. The book develops an alternative approach, based on a novel theory of the function and origins of the concept of truth. The central hypothesis is that the main role of the normative notion of truth is to encourage speakers to argue, with long-run behavioural advantages. This offers a fresh perspective on many debates about realism in contemporary philosophy.
Making a Difference presents fifteen original essays on causation and counterfactuals by an international team of experts. Collectively, they represent the state of the art on these topics. The essays in this volume are inspired by the life and work of Peter Menzies, who made a difference in the lives of students, colleagues, and friends. Topics covered include: the semantics of counterfactuals, agency theories of causation, the context-sensitivity of causal claims, structural equation models, mechanisms, mental causation, the causal exclusion argument, free will, and the consequence argument.
Is non-cognitivism compatible with minimalism about truth? A contemporary argument claims not, and therefore that moral realists, for example, should take heart from the popularity of semantic minimalism. The same is said to apply to non-cognitivism about other topics—conditionals, for example—for the argument depends only on the fact that ordinary usage applies the notions of truth and falsity to utterances of the kind in question. Given this much, minimalism about truth is said to leave no room for the view that the utterances concerned are non-cognitive in nature.
There is a long-standing disagreement in the philosophy of probability and Bayesian decision theory about whether an agent can hold a meaningful credence about an upcoming action, while she deliberates about what to do. Can she believe that it is, say, 70% probable that she will do A, while she chooses whether to do A? No, say some philosophers, for Deliberation Crowds Out Prediction (DCOP), but others disagree. In this paper, we propose a valid core for DCOP, and identify terminological causes for some of the apparent disputes.
Can an agent deliberating about an action A hold a meaningful credence that she will do A? 'No', say some authors, for 'Deliberation Crowds Out Prediction' (DCOP). Others disagree, but we argue here that such disagreements are often terminological. We explain why DCOP holds in a Ramseyian operationalist model of credence, but show that it is trivial to extend this model so that DCOP fails. We then discuss a model due to Joyce, and show that Joyce's rejection of DCOP rests on terminological choices about terms such as 'intention', 'prediction', and 'belief'. Once these choices are in view, they reveal underlying agreement between Joyce and the DCOP-favouring tradition that descends from Ramsey. Joyce's Evidential Autonomy Thesis (EAT) is effectively DCOP, in different terminological clothing. Both principles rest on the so-called 'transparency' of first-person present-tensed reflection on one's own mental states.
Since the late nineteenth century, physics has been puzzled by the time-asymmetry of thermodynamic phenomena in the light of the apparent T-symmetry of the underlying laws of mechanics. However, a compelling solution to this puzzle has proved elusive. In part, I argue, this can be attributed to a failure to distinguish two conceptions of the problem. According to one, the main focus of our attention is a time-asymmetric lawlike generalisation. According to the other, it is a particular fact about the early universe. This paper aims (i) to distinguish these two different conceptions of the time-asymmetric explanandum in thermodynamics; (ii) to argue in favour of the latter; and (iii) to show that whichever we choose, our rational expectations about the thermodynamic behaviour of the future must depend on what we know about the past: contrary to the common view, statistical arguments alone do not give us good reason to expect that entropy will always continue to increase.
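As a toy illustration of the statistical reasoning at issue (a sketch, not from the paper itself; the Ehrenfest urn model and all parameter values are assumptions made for the example), a time-symmetric stochastic dynamics drives a non-equilibrium state toward equilibrium. Precisely because the statistics are time-symmetric, the same argument pointed toward the past would "retrodict" higher entropy then too, which is why knowledge of the past has to do independent work:

```python
import random

def imbalance(n_left, N):
    # Distance from the 50/50 equilibrium split; lower = higher entropy.
    return abs(n_left - N / 2)

def step(n_left, N, rng):
    # Ehrenfest urn: pick one of N balls uniformly and move it to the
    # other urn.  The transition statistics are the same in both
    # temporal directions.
    return n_left - 1 if rng.random() < n_left / N else n_left + 1

rng = random.Random(0)
N, T, RUNS = 100, 200, 500
start = 90  # a non-equilibrium ("low entropy") macrostate

# Average imbalance after T steps, over many runs.  By time-symmetry,
# the same simulation equally models retrodiction toward the past.
avg_final = 0.0
for _ in range(RUNS):
    n = start
    for _ in range(T):
        n = step(n, N, rng)
    avg_final += imbalance(n, N)
avg_final /= RUNS

# The statistics predict relaxation toward equilibrium...
assert avg_final < imbalance(start, N)
```

Run forwards, this is the familiar relaxation; the point the paper presses is that nothing in the statistics alone privileges the forward direction, so our confidence that entropy was lower in the past must come from elsewhere.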
William James said that sometimes detailed philosophical argument is irrelevant. Once a current of thought is really under way, trying to oppose it with argument is like planting a stick in a river to try to alter its course: “round your obstacle flows the water and ‘gets there just the same’”. He thought pragmatism was such a river. There is a contemporary river that sometimes calls itself pragmatism, although other titles are probably better. At any rate it is the denial of differences, the celebration of the seamless web of language, the soothing away of distinctions, whether of primary versus secondary, fact versus value, description versus expression, or of any other significant kind. What is left is a smooth, undifferentiated view of language, sometimes a nuanced kind of anthropomorphism or “internal” realism, sometimes the view that no view is possible: minimalism, deflationism, quietism. Wittgenstein is often admired as a high priest of the movement. Planting a stick in this water is probably futile, but having done it before I shall do it again, and—who knows?—enough sticks may make a dam, and the waters of error may subside. (Blackburn, 1998a, 157).
Proponents of causal decision theories argue that classical Bayesian decision theory (BDT) gives the wrong advice in certain types of cases, of which the clearest and commonest are the medical Newcomb problems. I defend BDT, invoking a familiar principle of statistical inference to show that in such cases a free agent cannot take the contemplated action to be probabilistically relevant to its causes (so that BDT gives the right answer). I argue that my defence does better than those of Ellery Eells and Richard Jeffrey; and that it applies, where necessary, to other types of Newcomb problem.
A number of writers have been attracted to the idea that some of the peculiarities of quantum theory might be manifestations of 'backward' or 'retro' causality, underlying the quantum description. This idea has been explored in the literature in two main ways: firstly in a variety of explicit models of quantum systems, and secondly at a conceptual level. This note introduces a third approach, intended to complement the other two. It describes a simple toy model, which, under a natural interpretation, shows how retrocausality can emerge from simple global constraints. The model is also useful in permitting a clear distinction between the kind of retrocausality likely to be of interest in QM, and a different kind of reverse causality, with which it is liable to be confused. The model is proposed in the hope that future elaborations might throw light on the potential of retrocausality to account for quantum phenomena.
The best case for thinking that quantum mechanics is nonlocal rests on Bell's Theorem, and later results of the same kind. However, the correlations characteristic of Einstein–Podolsky–Rosen (EPR)–Bell (EPRB) experiments also arise in familiar cases elsewhere in quantum mechanics (QM), where the two measurements involved are timelike rather than spacelike separated; and in which the correlations are usually assumed to have a local causal explanation, requiring no action-at-a-distance (AAD). It is interesting to ask how this is possible, in the light of Bell's Theorem. We investigate this question, and present two options. Either (i) the new cases are nonlocal too, in which case AAD is more widespread in QM than has previously been appreciated (and does not depend on entanglement, as usually construed); or (ii) the means of avoiding AAD in the new cases extends in a natural way to EPRB, removing AAD in these cases too. There is a third option, viz., that the new cases are strongly disanalogous to EPRB. But this option requires an argument, so far missing, that the physical world breaks the symmetries which otherwise support the analogy. In the absence of such an argument, the orthodox combination of views—action-at-a-distance in EPRB, but local causality in its timelike analogue—is less well established than it is usually assumed to be.
I discuss the relationship between the two forms of expressivism defended by Robert Brandom, on one hand, and philosophers in the Humean tradition, such as Simon Blackburn and Allan Gibbard, on the other. I identify three apparent points of difference between the two programs, but argue that all three are superficial. Each project benefits from the insights of the other, and the combination is in a natural sense a global expressivism.
It has often been suggested that retrocausality offers a solution to some of the puzzles of quantum mechanics: e.g., that it allows a Lorentz-invariant explanation of Bell correlations, and other manifestations of quantum nonlocality, without action-at-a-distance. Some writers have argued that time-symmetry counts in favour of such a view, in the sense that retrocausality would be a natural consequence of a truly time-symmetric theory of the quantum world. Critics object that there is complete time-symmetry in classical physics, and yet no apparent retrocausality. Why should the quantum world be any different? This note throws some new light on these matters. I call attention to a respect in which quantum mechanics is different, under some assumptions about quantum ontology. Under these assumptions, the combination of time-symmetry without retrocausality is unavailable in quantum mechanics, for reasons intimately connected with the differences between classical and quantum physics (especially the role of discreteness in the latter). Not all interpretations of quantum mechanics share these assumptions, however, and in those that do not, time-symmetry does not entail retrocausality.
[Abstract and PDF at the Pittsburgh PhilSci Archive] A slightly shorter version of this paper is to appear in a volume edited by Jonathan Barrett, Adrian Kent, David Wallace and Simon Saunders, containing papers presented at the Everett@50 conference in Oxford in July 2007, and the Many Worlds@50 meeting at the Perimeter Institute in September 2007. The paper is based on my talk at the latter meeting (audio, video and slides of which are available online).
holds for all central declarative sentences. According to deflationists, the key to an understanding of truth lies in an appreciation of the grammatical advantages of a predicate satisfying DS. As Paul Horwich puts it, “our truth predicate is merely a logical device enabling simple formulations of certain sorts of generalization.” (1996, p. 878; see also Horwich 1990).
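For reference (a reconstruction, since the schema itself is introduced before this passage), the disquotational schema (DS) at issue is standardly written as follows, with one instance per declarative sentence:

```latex
% The disquotational schema (DS), instantiated by each declarative
% sentence p of the language:
\[
\ulcorner p \urcorner \text{ is true if and only if } p
\]
% e.g. `Snow is white' is true if and only if snow is white.
```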
Speech act theory is one of the more lasting products of the linguistic movement in philosophy of the mid-twentieth century. Within philosophy itself the movement's products did not in general prove so durable. Particularly striking in this respect is the perceived fate of what was one of the most characteristic applications of the linguistic turn in philosophy, namely the view that many traditional philosophical problems are such as to yield to an understanding of the distinctive function of a particular part of language. Most typically, the crucial insight was held to be that despite appearances, the function of the part of language in question is not assertoric, or descriptive, and that the traditional problems arose at least in part from a failure to appreciate this point. Thus problems in moral philosophy were thought to yield to an appreciation that moral discourse is expressive rather than descriptive, problems in the philosophy of mind to an understanding of the distinctive rôle of psychological ascriptions, and so on. The philosophical journals of the 1950s are rich with views like these. (No general term for this approach seems to have become widely accepted at the time. I shall call it "non-factualism", for what it denies, most characteristically, is the fact-stating rôle of language of a certain kind.)
Scientific naturalism is a metaphysical doctrine, a view about what there is, or what we ought to believe that there is. It maintains that natural science should be our guide in matters metaphysical: the ontology we should accept is the ontology that turns out to be required by science. Quine is often regarded as the doyen of scientific naturalists, though the supporting cast includes such giants as David Lewis and J. J. C. Smart.
Although it is obvious that much of language is representational, it is occasionally denied. I have attended conference papers attacking the representational view of language given by speakers who have in their pockets pieces of paper with writing on them that tell them where the conference dinner is and when the taxis leave for the airport. (Jackson, 1997)