Why is the future so different from the past? Why does the past affect the future and not the other way around? What does quantum mechanics really tell us about the world? In this important and accessible book, Huw Price throws fascinating new light on some of the great mysteries of modern physics, and connects them in a wholly original way. Price begins with the mystery of the arrow of time. Why, for example, does disorder always increase, as required by the second law of thermodynamics? Price shows that, for over a century, most physicists have thought about these problems the wrong way. Misled by the human perspective from within time, which distorts and exaggerates the differences between past and future, they have fallen victim to what Price calls the "double standard fallacy": proposed explanations of the difference between the past and the future turn out to rely on a difference which has been slipped in at the beginning, when the physicists themselves treat the past and future in different ways. To avoid this fallacy, Price argues, we need to overcome our natural tendency to think about the past and the future differently. We need to imagine a point outside time -- an Archimedean "view from nowhen" -- from which to observe time in an unbiased way. Offering a lively criticism of many major modern physicists, including Richard Feynman and Stephen Hawking, Price shows that this fallacy remains common in physics today -- for example, when contemporary cosmologists theorize about the eventual fate of the universe. The "big bang" theory normally assumes that the beginning and end of the universe will be very different. But if we are to avoid the double standard fallacy, we need to consider time symmetrically, and take seriously the possibility that the arrow of time may reverse when the universe recollapses into a "big crunch." Price then turns to the greatest mystery of modern physics, the meaning of quantum theory.
He argues that in missing the Archimedean viewpoint, modern physics has missed a radical and attractive solution to many of the apparent paradoxes of quantum physics. Many consequences of quantum theory appear counterintuitive, such as Schrödinger's Cat, whose condition seems undetermined until observed, and Bell's Theorem, which suggests a spooky "nonlocality," where events happening simultaneously in different places seem to affect each other directly. Price shows that these paradoxes can be avoided by allowing that at the quantum level the future does, indeed, affect the past. This demystifies nonlocality, and supports Einstein's unpopular intuition that quantum theory describes an objective world, existing independently of human observers: the Cat is alive or dead, even when nobody looks. So interpreted, Price argues, quantum mechanics is simply the kind of theory we ought to have expected in microphysics -- from the symmetric standpoint. Time's Arrow and Archimedes' Point presents an innovative and controversial view of time and contemporary physics. In this exciting book, Price urges physicists, philosophers, and anyone who has ever pondered the mysteries of time to look at the world from the fresh perspective of Archimedes' Point and gain a deeper understanding of ourselves, the universe around us, and our own place in time.
This volume brings together fourteen major essays by one of contemporary philosophy's most challenging thinkers. Huw Price links themes from Quine, Carnap, Wittgenstein and Rorty, to craft a powerful critique of contemporary naturalistic metaphysics. He offers a new positive program for philosophy, cast from a pragmatist mould.
Pragmatists have traditionally been enemies of representationalism but friends of naturalism, when naturalism is understood to pertain to human subjects, in the sense of Hume and Nietzsche. In this volume Huw Price presents his distinctive version of this traditional combination, as delivered in his René Descartes Lectures at Tilburg University in 2008. Price contrasts his view with other contemporary forms of philosophical naturalism, comparing it with other pragmatist and neo-pragmatist views such as those of Robert Brandom and Simon Blackburn. Linking their different 'expressivist' programmes, Price argues for a radical global expressivism that combines key elements from both. With Paul Horwich and Michael Williams, Brandom and Blackburn respond to Price in new essays. Price replies in the closing essay, emphasising links between his views and those of Wilfrid Sellars. The volume will be of great interest to advanced students of philosophy of language and metaphysics.
The difference between cause and effect seems obvious and crucial in ordinary life, yet missing from modern physics. Almost a century ago, Bertrand Russell called the law of causality 'a relic of a bygone age'. In this important collection 13 leading scholars revisit Russell's revolutionary conclusion, discussing one of the most significant and puzzling issues in contemporary thought.
In a recent paper, Richard Rorty begins by telling us why pragmatists such as himself are inclined to identify truth with justification: ‘Pragmatists think that if something makes no difference to practice, it should make no difference to philosophy. This conviction makes them suspicious of the distinction between justification and truth, for that distinction makes no difference to my decisions about what to do.’ (1995, p. 19) Rorty goes on to discuss the claim, defended by Crispin Wright, that truth is a normative constraint on assertion. He argues that this claim runs foul of this principle of no difference without a practical difference: ‘The need to justify our beliefs to ourselves and our fellow agents subjects us to norms, and obedience to these norms produces a behavioural pattern that we must detect in others before confidently attributing beliefs to them. But there seems to be no occasion to look for obedience to an additional norm – the commandment to seek the truth. For – to return to the pragmatist doubt with which I began – obedience to that commandment will produce no behaviour not produced by the need to offer justification.’ (1995, p. 26) Again, then, Rorty appeals to the claim that a commitment to a norm of truth rather than a norm of justification makes no behavioural difference. This is an empirical claim, testable in principle by comparing the behaviour of a community of realists (in Rorty’s sense) to that of a community of pragmatists. In my view, the experiment would show that the claim is unjustified, indeed false. I think that there is an important and widespread behavioural pattern that depends on the fact that speakers do take themselves to be subject to such an additional norm. Moreover, it is a..
In “A Subjectivist’s Guide to Objective Chance,” David Lewis says that he is “led to wonder whether anyone but a subjectivist is in a position to understand objective chance.” The present essay aims to motivate this same Lewisean attitude, and a similar degree of modest subjectivism, with respect to objective causation. The essay begins with Newcomb problems, which turn on an apparent tension between two principles of choice: roughly, a principle sensitive to the causal features of the relevant situation, and a principle sensitive only to evidential factors. Two-boxers give priority to causal beliefs, and one-boxers to evidential beliefs. The essay notes that a similar issue can arise when the modality in question is chance, rather than causation. In this case, the conflict is between decision rules based on credences guided solely by chances, and rules based on credences guided by other sorts of probabilistic evidence. Far from excluding cases of the latter kind, Lewis’s Principal Principle explicitly allows for them, in the form of the caveat that credences should follow beliefs about chances only in the absence of “inadmissible evidence.” The essay then exhibits a tension in Lewis’s views on these two matters, by presenting a class of decision problems—some of them themselves Newcomb problems—in which Lewis’s view of the relevance of inadmissible evidence seems in tension with his causal decision theory. It offers a diagnosis for this dilemma and proposes a remedy, based on an extension of a proposal due to Ned Hall and others from the case of chance to that of causation. The remedy suggests a new view of the relation between causal decision theory and evidential decision theory, namely, that they stand to each other much as chance stands to credence, being objective and subjective faces of the same practical coin.
This has much the same metaphysical benefits as Lewis’s own view of chance and also throws interesting new light on Newcomb problems, providing an irenic resolution of the apparent disagreement between causal and evidential decision rules.
One of the most striking features of causation is that causes typically precede their effects – the causal arrow is strongly aligned with the temporal arrow. Why should this be so? We offer an opinionated guide to this problem, and to the solutions currently on offer. We conclude that the most promising strategy is to begin with the de facto asymmetry of human deliberation, characterised in epistemic terms, and to build out from there. More than any rival, this subjectivist approach promises to demystify the asymmetry, temporal orientation, and deliberative relevance of causal judgements.
The best case for thinking that quantum mechanics is nonlocal rests on Bell's Theorem, and later results of the same kind. However, the correlations characteristic of Einstein–Podolsky–Rosen (EPR)–Bell (EPRB) experiments also arise in familiar cases elsewhere in quantum mechanics (QM), where the two measurements involved are timelike rather than spacelike separated; and in which the correlations are usually assumed to have a local causal explanation, requiring no action-at-a-distance (AAD). It is interesting to ask how this is possible, in the light of Bell's Theorem. We investigate this question, and present two options. Either (i) the new cases are nonlocal too, in which case AAD is more widespread in QM than has previously been appreciated (and does not depend on entanglement, as usually construed); or (ii) the means of avoiding AAD in the new cases extends in a natural way to EPRB, removing AAD in these cases too. There is a third option, viz., that the new cases are strongly disanalogous to EPRB. But this option requires an argument, so far missing, that the physical world breaks the symmetries which otherwise support the analogy. In the absence of such an argument, the orthodox combination of views—action-at-a-distance in EPRB, but local causality in its timelike analogue—is less well established than it is usually assumed to be. Outline: 1 Introduction; 1.1 Background; 1.2 Outline of the argument; 2 The Experiments; 2.1 Standard EPRB; 2.2 Sideways EPRB; 2.3 Comparing the experiments; 2.4 The need for beables; 3 The Symmetry Considerations; 3.1 The action symmetry; 3.2 Time-symmetry in SEPRB; 4 The Basic Trilemma; 4.1 An intuitive defence of Option III?; 5 Avoiding the Trilemma?; 6 The Classical Objection; 7 Defending Option III; 7.1 The free will argument; 7.2 Independence and consistency; 8 Entanglement and Epistemic Perspective.
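The Bell-type correlations at issue can be illustrated with the standard CHSH calculation. The sketch below uses the textbook singlet-state correlation function E(a, b) = -cos(a - b) and the conventional measurement angles that maximise the violation; these numbers are generic illustrations, not specific to this paper. It shows the quantum value exceeding the bound of 2 obeyed by any local hidden-variable model.

```python
from math import cos, pi, sqrt

def E(a, b):
    # Singlet-state correlation for spin measurements along angles a and b
    return -cos(a - b)

# Standard CHSH settings that maximise the quantum violation
a1, a2 = 0.0, pi / 2        # Alice's two measurement settings
b1, b2 = pi / 4, 3 * pi / 4  # Bob's two measurement settings

# CHSH combination: any local hidden-variable model keeps |S| <= 2
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)

print(abs(S))      # ≈ 2.828, i.e. 2*sqrt(2) (Tsirelson's bound)
print(abs(S) > 2)  # the quantum prediction violates the classical bound
```

The same arithmetic applies whether the two measurements are spacelike or timelike separated, which is why the analogy pressed in the abstract can get off the ground.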
In this paper we defend the view that the ordinary notions of cause and effect have a direct and essential connection with our ability to intervene in the world as agents. This is a well known but rather unpopular philosophical approach to causation, often called the manipulability theory. In the interests of brevity and accuracy, we prefer to call it the agency theory. Thus the central thesis of an agency account of causation is something like this: an event A is a cause of a distinct event B just in case bringing about the occurrence of A would be an effective means by which a free agent could bring about the occurrence of B. In our view the unpopularity of the agency approach to causation may be traced to two factors. The first is a failure to appreciate certain distinctive advantages that this approach has over its various rivals. We have drawn attention to some of these advantages elsewhere, and we summarize below. However, the second and more important factor is the influence of a number of stock objections, objections that seem to have persuaded many philosophers that agency accounts face insuperable obstacles. In this paper we want to show that these objections have been vastly overrated. There are four main objections.
I distinguish three views, a defence of any one of which would go some way towards vindicating the view that there is something objective about the passage of time: the view that the present moment is objectively distinguished; the view that time has an objective direction – that it is an objective matter which of two non-simultaneous events is the earlier and which the later; the view that there is something objectively dynamic, flux-like, or "flow-like" about time. I argue that each of these views is not so much false as doubtfully coherent. In each case, it turns out to be hard to make sense of what the view could be, at least if it is to be non-trivial, and of use to a friend of objective passage. I conclude with some remarks about avenues that seem worth exploring in the philosophy of time, when we are done with trying to make sense of passage.
A number of writers have been attracted to the idea that some of the peculiarities of quantum theory might be manifestations of 'backward' or 'retro' causality, underlying the quantum description. This idea has been explored in the literature in two main ways: firstly in a variety of explicit models of quantum systems, and secondly at a conceptual level. This note introduces a third approach, intended to complement the other two. It describes a simple toy model, which, under a natural interpretation, shows how retrocausality can emerge from simple global constraints. The model is also useful in permitting a clear distinction between the kind of retrocausality likely to be of interest in QM, and a different kind of reverse causality, with which it is liable to be confused. The model is proposed in the hope that future elaborations might throw light on the potential of retrocausality to account for quantum phenomena.
[Abstract and PDF at the Pittsburgh PhilSci Archive] A slightly shorter version of this paper is to appear in a volume edited by Jonathan Barrett, Adrian Kent, David Wallace and Simon Saunders, containing papers presented at the Everett@50 conference in Oxford in July 2007, and the Many Worlds@50 meeting at the Perimeter Institute in September 2007. The paper is based on my talk at the latter meeting (audio, video and slides of which are available online).
Since the late nineteenth century, physics has been puzzled by the time-asymmetry of thermodynamic phenomena in the light of the apparent T-symmetry of the underlying laws of mechanics. However, a compelling solution to this puzzle has proved elusive. In part, I argue, this can be attributed to a failure to distinguish two conceptions of the problem. According to one, the main focus of our attention is a time-asymmetric lawlike generalisation. According to the other, it is a particular fact about the early universe. This paper aims (i) to distinguish these two different conceptions of the time-asymmetric explanandum in thermodynamics; (ii) to argue in favour of the latter; and (iii) to show that whichever we choose, our rational expectations about the thermodynamic behaviour of the future must depend on what we know about the past: contrary to the common view, statistical arguments alone do not give us good reason to expect that entropy will always continue to increase.
In a famous paper in Noûs in 1979, John Perry points out that action depends on indexical beliefs. In addition to “third-person” information about her environment, an agent needs “first-person” information about where, when and who she is. This conclusion is widely interpreted as a reason for thinking that tensed claims cannot be translated without loss into untensed language; but not as a reason for realism about tensed facts. In another famous paper in the same volume of Noûs, Nancy Cartwright argues that action requires that agents represent their world in causal terms, rather than merely probabilistic terms: for, Cartwright argues, there’s a distinction between effective and ineffective strategies, that otherwise goes missing. This is widely taken as a reason for thinking that causal claims cannot be translated without loss into merely probabilistic claims; and also – in contrast to Perry’s case – widely regarded as a reason for realism about causation. In this paper I ask whether the latter conclusion is compulsory, or whether, as in Perry’s case, the need for causal beliefs might merely reflect some “situated” aspect of a decision-maker’s perspective.
Probabilistic accounts of causality have long had trouble with ‘spurious’ evidential correlations. Such correlations are also central to the case for causal decision theory—the argument that evidential decision theory is inadequate to cope with certain sorts of decision problem. However, there are now several strong defences of the evidential theory. Here I present what I regard as the best defence, and apply it to the probabilistic approach to causality. I argue that provided a probabilistic theory appeals to the notions of agency and effective strategy, it can avoid the problem of spurious causes. I show that such an appeal has other advantages; and argue that it is not illegitimate, even for a causal realist.
Concepts employed in folk descriptions of the world often turn out to be more perspectival than they seem at first sight, involving previously unrecognised sensitivity to the viewpoint or 'situation' of the user of the concept in question. Often, it is progress in science that reveals such perspectivity, and the deciding factor is that we realise that other creatures would apply the same concepts with different extension, in virtue of differences between their circumstances and ours. In this paper I argue that causal concepts are perspectival in this way, and describe the 'situation' on which they depend in terms of an abstract characterisation of the viewpoint of a deliberating agent. I argue that this approach makes better sense than rivals of the apparent asymmetry and temporal orientation of the causal relation.
It has often been suggested that retrocausality offers a solution to some of the puzzles of quantum mechanics: e.g., that it allows a Lorentz-invariant explanation of Bell correlations, and other manifestations of quantum nonlocality, without action-at-a-distance. Some writers have argued that time-symmetry counts in favour of such a view, in the sense that retrocausality would be a natural consequence of a truly time-symmetric theory of the quantum world. Critics object that there is complete time-symmetry in classical physics, and yet no apparent retrocausality. Why should the quantum world be any different? This note throws some new light on these matters. I call attention to a respect in which quantum mechanics is different, under some assumptions about quantum ontology. Under these assumptions, the combination of time-symmetry without retrocausality is unavailable in quantum mechanics, for reasons intimately connected with the differences between classical and quantum physics (especially the role of discreteness in the latter). Not all interpretations of quantum mechanics share these assumptions, however, and in those that do not, time-symmetry does not entail retrocausality.
Proponents of causal decision theories argue that classical Bayesian decision theory (BDT) gives the wrong advice in certain types of cases, of which the clearest and commonest are the medical Newcomb problems. I defend BDT, invoking a familiar principle of statistical inference to show that in such cases a free agent cannot take the contemplated action to be probabilistically relevant to its causes (so that BDT gives the right answer). I argue that my defence does better than those of Ellery Eells and Richard Jeffrey; and that it applies, where necessary, to other types of Newcomb problem.
How do rational minds make contact with the world? The empiricist tradition sees a gap between mind and world, and takes sensory experience, fallible as it is, to provide our only bridge across that gap. In its crudest form, for example, the traditional idea is that our minds consult an inner realm of sensory experience, which provides us with evidence about the nature of external reality. Notoriously, however, it turns out to be far from clear that there is any viable conception of experience which allows it to do the job. The original problem is to show that thought is rationally constrained by external reality. If sensory experience is to provide the solution--in particular, if it is to provide the answer to sceptical challenges--it must therefore meet two criteria. First, it must itself be `receptive'--i.e., appropriately constrained by external reality. Second, it must be the kind of thing that can enter into a logical or rational relationship with belief--it must already be `conceptual,' in other words. In arguing against the idea that anything could serve both roles, Wilfrid Sellars termed this conception of experience "the Myth of the Given".
William James said that sometimes detailed philosophical argument is irrelevant. Once a current of thought is really under way, trying to oppose it with argument is like planting a stick in a river to try to alter its course: “round your obstacle flows the water and ‘gets there just the same’”. He thought pragmatism was such a river. There is a contemporary river that sometimes calls itself pragmatism, although other titles are probably better. At any rate it is the denial of differences, the celebration of the seamless web of language, the soothing away of distinctions, whether of primary versus secondary, fact versus value, description versus expression, or of any other significant kind. What is left is a smooth, undifferentiated view of language, sometimes a nuanced kind of anthropomorphism or “internal” realism, sometimes the view that no view is possible: minimalism, deflationism, quietism. Wittgenstein is often admired as a high priest of the movement. Planting a stick in this water is probably futile, but having done it before I shall do it again, and—who knows?—enough sticks may make a dam, and the waters of error may subside. (Blackburn, 1998a, 157).
For more than a century, physics has known of a puzzling conflict between the T-asymmetry of thermodynamic phenomena and the T-symmetry of the underlying microphysics on which these phenomena depend. This paper provides a guide to the current status of this puzzle, distinguishing the central issue from various issues with which it may be confused. It is shown that there are two competing conceptions of what is needed to resolve the puzzle of the thermodynamic asymmetry, which differ with respect to the number of distinct T-asymmetries they take to be manifest in the physical world. On the preferable one-asymmetry conception, the remaining puzzle concerns the ordered distribution of matter in the early universe. The puzzle of the thermodynamic arrow thus becomes a puzzle for cosmology.
Is non-cognitivism compatible with minimalism about truth? A contemporary argument claims not, and therefore that moral realists, for example, should take heart from the popularity of semantic minimalism. The same is said to apply to cognitivism about other topics—conditionals, for example—for the argument depends only on the fact that ordinary usage applies the notions of truth and falsity to utterances of the kind in question. Given this much, minimalism about truth is said to leave no room for the view that the utterances concerned are non-cognitive in nature.
The best-known argument for Evidential Decision Theory (EDT) is the ‘Why ain’cha rich?’ challenge to rival Causal Decision Theory (CDT). The basis for this challenge is that in Newcomb-like situations, acts that conform to EDT may be known in advance to yield a better return than acts that conform to CDT. Frank Arntzenius has recently proposed an ingenious counterargument, based on an example in which, he claims, it is predictable in advance that acts that conform to EDT will do less well than acts that conform to CDT. We raise two objections to Arntzenius’s example. We argue, first, that the example is subtly incoherent, in a way that undermines its effectiveness against EDT; and, second, that the example relies on calculating the average return over an inappropriate population of acts.
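The ‘Why ain’cha rich?’ point rests on simple arithmetic. With the conventional stipulated payoffs for a Newcomb problem ($1,000,000 in the opaque box if one-boxing is predicted, a visible $1,000, and predictor accuracy 0.99; these numbers are the standard textbook illustration, not drawn from the paper), the average returns of the two populations of agents can be computed directly:

```python
ACCURACY = 0.99      # stipulated predictor reliability (illustrative)
MILLION = 1_000_000  # opaque box: filled iff one-boxing was predicted
THOUSAND = 1_000     # transparent box: always available

# Average return of agents who one-box (the EDT-conforming act here):
# with probability ACCURACY they were correctly predicted, so the box is full.
one_box = ACCURACY * MILLION + (1 - ACCURACY) * 0

# Average return of agents who two-box (the CDT-conforming act):
# with probability ACCURACY they were correctly predicted, so the box is empty.
two_box = ACCURACY * THOUSAND + (1 - ACCURACY) * (MILLION + THOUSAND)

print(one_box)  # ≈ 990,000
print(two_box)  # ≈ 11,000 -- hence "Why ain'cha rich?"
```

From the CDT standpoint, of course, two-boxing dominates for any fixed prediction (it adds $1,000 whatever the opaque box contains); the dispute is over which calculation should guide choice, and over which population of acts the averages should be taken.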
The disquotational schema (DS), "'p' is true if and only if p", holds for all central declarative sentences. According to deflationists, the key to an understanding of truth lies in an appreciation of the grammatical advantages of a predicate satisfying DS. As Paul Horwich puts it, “our truth predicate is merely a logical device enabling simple formulations of certain sorts of generalization.” (1996, p. 878; see also Horwich 1990).
It is often objected that the Everett interpretation of QM cannot make sense of quantum probabilities, in one or both of two ways: either it can’t make sense of probability at all, or it can’t explain why probability should be governed by the Born rule. David Deutsch has attempted to meet these objections. He argues not only that rational decision under uncertainty makes sense in the Everett interpretation, but also that under reasonable assumptions, the credences of a rational agent in an Everett world should be constrained by the Born rule. David Wallace has developed and defended Deutsch’s proposal, and greatly clarified its conceptual basis. In particular, he has stressed its reliance on the distinguishing symmetry of the Everett view, viz., that all possible outcomes of a quantum measurement are treated as equally real. The argument thus tries to make a virtue of what has usually been seen as the main obstacle to making sense of probability in the Everett world. In this note I outline some objections to the Deutsch-Wallace argument, and to related proposals by Hilary Greaves about the epistemology of Everettian QM. (In the latter case, my arguments include an appeal to an Everettian analogue of the Sleeping Beauty problem.) The common thread to these objections is that the symmetry in question remains a very significant obstacle to making sense of probability in the Everett interpretation.
Physics takes for granted that interacting physical systems with no common history are independent, before their interaction. This principle is time-asymmetric, for no such restriction applies to systems with no common future, after an interaction. The time-asymmetry is normally attributed to boundary conditions. I argue that there are two distinct independence principles of this kind at work in contemporary physics, one of which cannot be attributed to boundary conditions, and therefore conflicts with the assumed T (or CPT) symmetry of microphysics. I note that this may have interesting ramifications in quantum mechanics.
Scientific naturalism is a metaphysical doctrine, a view about what there is, or what we ought to believe that there is. It maintains that natural science should be our guide in matters metaphysical: the ontology we should accept is the ontology that turns out to be required by science. Quine is often regarded as the doyen of scientific naturalists, though the supporting cast includes such giants as David Lewis and J. J. C. Smart.