Metaphysicians should pay attention to quantum mechanics. Why? Not because it provides definitive answers to many metaphysical questions: the theory itself is remarkably silent on the nature of the physical world, and the various interpretations of the theory on offer present conflicting ontological pictures. Rather, quantum mechanics is essential to the metaphysician because it reshapes standard metaphysical debates and opens up unforeseen new metaphysical possibilities. Even if quantum mechanics provides few clear answers, there are good reasons to think that any adequate understanding of the quantum world will result in a radical reshaping of our classical world-view in some way or other. Whatever the world is like at the atomic scale, it is almost certainly not the swarm of particles pushed around by forces that is often presupposed. This book guides readers through the theory of quantum mechanics and its implications for metaphysics in a clear and accessible way. The theory and its various interpretations are presented with a minimum of technicality. The consequences of these interpretations for metaphysical debates concerning realism, indeterminacy, causation, determinism, holism, and individuality are explored in detail, stressing the novel form that the debates take given the empirical facts in the quantum domain. While quantum mechanics may not deliver unconditional pronouncements on these issues, the range of possibilities consistent with our knowledge of the empirical world is relatively small, and each possibility is metaphysically revisionary in some way. This book will appeal to researchers, students, and anybody else interested in how science informs our world-view.
Deception has long been an important topic in philosophy. However, the traditional analysis of the concept, which requires that a deceiver intentionally cause her victim to have a false belief, rules out the possibility of much deception in the animal kingdom. Cognitively unsophisticated species, such as fireflies and butterflies, have simply evolved to mislead potential predators and/or prey. To capture such cases of “functional deception,” several researchers (Machiavellian intelligence II, Cambridge University Press, Cambridge, pp 112–143, 1997; Searcy and Nowicki, The evolution of animal communication, Princeton University Press, Princeton, 2005; Skyrms, Signals, Oxford University Press, Oxford, 2010) have endorsed the broader view that deception only requires that a deceiver benefit from sending a misleading signal. Moreover, in order to facilitate game-theoretic study of deception in the context of Lewisian sender-receiver games, Brian Skyrms has proposed an influential formal analysis of this view. Such formal analyses have the potential to enhance our philosophical understanding of deception in humans as well as animals. However, as we argue in this paper, Skyrms's analysis and two recently proposed alternative analyses are seriously flawed and can lead us to draw unwarranted conclusions about deception.
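To fix ideas, here is a minimal sketch, in Python, of one way a benefit-based test for deception might be operationalized in a sender-receiver game. The states, probabilities, and payoffs below are illustrative assumptions; Skyrms's actual analysis is formulated information-theoretically, and its adequacy is precisely what the paper disputes.

# Toy sender-receiver game: test whether a signal is deceptive in the
# broad, benefit-based sense. All numbers are illustrative assumptions.

prior = {"predator": 0.5, "no_predator": 0.5}                  # receiver's prior over states
posterior_given_alarm = {"predator": 0.9, "no_predator": 0.1}  # receiver's posterior on hearing 'alarm'

def misleading(actual_state, prior, posterior):
    # The signal misleads if it moves the receiver's probability of the
    # actual state in the wrong direction (downward).
    return posterior[actual_state] < prior[actual_state]

def deceptive(actual_state, prior, posterior, payoff_if_sent, payoff_if_silent):
    # Benefit-based test: the signal is misleading AND the sender does
    # better by sending it than by staying silent.
    return (misleading(actual_state, prior, posterior)
            and payoff_if_sent > payoff_if_silent)

# Firefly-style case: no predator is present, the 'alarm' raises the
# receiver's probability of a predator, and the sender profits.
print(deceptive("no_predator", prior, posterior_given_alarm, 1.0, 0.0))  # True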
This paper investigates the tenability of wavefunction realism, according to which the quantum mechanical wavefunction is not just a convenient predictive tool, but is a real entity figuring in physical explanations of our measurement results. An apparent difficulty with this position is that the wavefunction exists in a many-dimensional configuration space, whereas the world appears to us to be three-dimensional. I consider the arguments that have been given for and against the tenability of wavefunction realism, and note that both the proponents and the opponents assume that quantum mechanical configuration space is many-dimensional in exactly the same sense in which classical space is three-dimensional. I argue that this assumption is mistaken, and that configuration space can be taken as three-dimensional in a relevant sense. I conclude that wavefunction realism is far less problematic than it has been taken to be. Sections: Introduction; Non-separability; The instantaneous solution; The dynamical solution; Invariance; What is configuration space, anyway?; Conclusion.
Putnam and Laudan separately argue that the falsity of past scientific theories gives us reason to doubt the truth of current theories. Their arguments have been highly influential, and have generated a significant literature over the past couple of decades. Most of this literature attempts to defend scientific realism by attacking the historical evidence on which the premises of the relevant argument are based. However, I argue that both Putnam's and Laudan's arguments are fallacious, and hence attacking their premises is unnecessary. The paper concludes with a discussion of the further historical evidence that would be required if the pessimistic induction is to present a serious threat to scientific realism.
The Sleeping Beauty paradox in epistemology and the many-worlds interpretation of quantum mechanics both raise problems concerning subjective probability assignments. Furthermore, there are striking parallels between the two cases; in both cases personal experience has a branching structure, and in both cases the agent loses herself among the branches. However, the treatment of probability is very different in the two cases, for no good reason that I can see. Suppose, then, that we adopt the same treatment of probability in each case. Then the dominant ‘thirder’ solution to the Sleeping Beauty paradox becomes incompatible with the tenability of the many-worlds interpretation.
Everettian accounts of quantum mechanics entail that people branch; every possible result of a measurement actually occurs, and I have one successor for each result. Is there room for probability in such an account? The prima facie answer is no; there are no ontic chances here, and no ignorance about what will happen. But since any adequate quantum mechanical theory must make probabilistic predictions, much recent philosophical labor has gone into trying to construct an account of probability for branching selves. One popular strategy involves arguing that branching selves introduce a new kind of subjective uncertainty. I argue here that the variants of this strategy in the literature all fail, either because the uncertainty is spurious, or because it is in the wrong place to yield probabilistic predictions. I conclude that uncertainty cannot be the ground for probability in Everettian quantum mechanics.
Measures of epistemic utility are used by formal epistemologists to make determinations of epistemic betterness among cognitive states. The Brier rule is the most popular choice among formal epistemologists for such a measure. In this paper, however, we show that the Brier rule is sometimes seriously wrong about whether one cognitive state is epistemically better than another. In particular, there are cases where an agent gets evidence that definitively eliminates a false hypothesis, but where the Brier rule says that things have become epistemically worse. Along the way to this ‘elimination experiment’ counter-example to the Brier rule as a measure of epistemic utility, we identify several useful monotonicity principles for epistemic betterness. We also reply to several potential objections to this counter-example.
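The shape of such a counter-example can be conveyed with a toy calculation in Python. The particular numbers are illustrative assumptions, not necessarily the paper's own case: evidence eliminates a false hypothesis and the agent conditionalizes, yet the Brier score increases.

# Toy 'elimination experiment' (illustrative numbers). H1 is true.
# Evidence then definitively eliminates the false hypothesis H3, and
# the agent conditionalizes -- yet Brier inaccuracy goes up.

def brier(credences, truth_values):
    # Brier inaccuracy: summed squared distance from the truth values.
    return sum((c - t) ** 2 for c, t in zip(credences, truth_values))

truth = [1, 0, 0]                    # H1 is the true hypothesis
before = [0.1, 0.8, 0.1]             # credences in H1, H2, H3
after = [0.1 / 0.9, 0.8 / 0.9, 0.0]  # conditionalize on the falsity of H3

print(brier(before, truth))  # ~1.46
print(brier(after, truth))   # ~1.58: 'epistemically worse', says the Brier rule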
In quantum mechanics it is usually assumed that mutually exclusive states of affairs must be represented by orthogonal vectors. Recent attempts to solve the measurement problem, most notably the GRW theory, require the relaxation of this assumption. It is shown that a consequence of relaxing this assumption is that arithmetic does not apply to ordinary macroscopic objects. It is argued that such a radical move is unwarranted given the current state of understanding of the foundations of quantum mechanics.
The main difficulty facing no-collapse theories of quantum mechanics in the Everettian tradition concerns the role of probability within a theory in which every possible outcome of a measurement actually occurs. The problem is two-fold: First, what do probability claims mean within such a theory? Second, what ensures that the probabilities attached to measurement outcomes match those of standard quantum mechanics? Deutsch has recently proposed a decision-theoretic solution to the second problem, according to which agents are rationally required to weight the outcomes of measurements according to the standard quantum-mechanical probability measure. I show that this argument admits counterexamples, and hence fails to establish the standard probability weighting as a rational requirement.
All parties to the Sleeping Beauty debate agree that it shows that some cherished principle of rationality has to go. Thirders think that it is Conditionalization and Reflection that must be given up or modified; halfers think that it is the Principal Principle. I offer an analysis of the Sleeping Beauty puzzle that allows us to retain all three principles. In brief, I argue that Sleeping Beauty’s credence in the uncentered proposition that the coin came up heads should be 1/2, but her credence in the centered proposition that the coin came up heads and it is Monday should be 1/3. I trace the source of the earlier mistakes to an unquestioned assumption in the debate, namely that an uncentered proposition is just a special kind of centered proposition. I argue that the falsity of this assumption is the real lesson of the Sleeping Beauty case.
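In compact form, the proposal is that Beauty's credences on waking satisfy (writing H and T for heads and tails):

\[
Cr(H) = \tfrac{1}{2}, \qquad
Cr(H \wedge \mathrm{Monday}) = Cr(T \wedge \mathrm{Monday}) = Cr(T \wedge \mathrm{Tuesday}) = \tfrac{1}{3}.
\]

On the usual assumption that the uncentered proposition H is just the centered proposition H-and-Monday under another name, these assignments would be inconsistent; the contention here is that it is this assumption, rather than Conditionalization, Reflection, or the Principal Principle, that should be given up.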
It has long been recognized that a local hidden variable theory of quantum mechanics can in principle be constructed, provided one is willing to countenance pre-measurement correlations between the properties of measured systems and measuring devices. However, this ‘conspiratorial’ approach is typically dismissed out of hand. In this article I examine the justification for dismissing conspiracy theories of quantum mechanics. I consider the existing arguments against such theories, and find them to be less than conclusive. I suggest a more powerful argument against the leading strategy for constructing a conspiracy theory. Finally, I outline two alternative strategies for constructing conspiracy theories, both of which are immune to these arguments, but require one to either modify or reject the common cause principle. Sections: Introduction; The incompleteness of quantum mechanics; Hidden variables; Hidden mechanism conspiracy theories; Existing arguments against hidden mechanisms; A new argument against hidden mechanisms; Backwards-causal conspiracy theories; Acausal conspiracy theories; Conclusion.
The world looks three-dimensional unless one looks closely, when it looks 3N-dimensional. But which appearance is veridical, and which the illusion? Albert contends that the three-dimensionality of the everyday world is illusory, and that the 3N-dimensional wavefunction one discerns in quantum phenomena is the reality behind the illusion. What I try to do here is to argue for the converse of Albert's position: the world really is three-dimensional, and the 3N-dimensional appearance of quantum phenomena is the theoretical analog of an illusion; we represent quantum reality to ourselves as 3N-dimensional in order to more readily visualize the correlations between wave packets.
I examine recent arguments based on functionalism that claim to show that Bohm's theory fails to solve the measurement problem, or if it does so, it is only because it reduces to a form of the many-worlds theory. While these arguments reveal some interesting features of Bohm's theory, I contend that they do not undermine the distinctive Bohmian solution to the measurement problem.
There is a recurring line of argument in the literature to the effect that Bohm's theory fails to solve the measurement problem. I show that this argument fails in all its variants. Hence Bohm's theory, whatever its drawbacks, at least succeeds in solving the measurement problem. I briefly discuss a similar argument that has been raised against the GRW theory.
Recently, Richard Healey and Simon Friederich have each advocated a pragmatist interpretation of quantum mechanics as a way to dissolve its foundational problems. The idea is that if we concentrate on the way quantum claims are used, the foundational problems of quantum mechanics cannot be formulated, and so do not require solution. Their central contention is that the content of quantum claims differs from the content of non-quantum claims, in that the former is prescriptive whereas the latter is descriptive. Healey also argues that claims about non-decoherent systems are largely devoid of content. I consider various objections to these claims, noting in particular the ways in which the application of pragmatism to quantum mechanics differs from previous examples of pragmatist therapy. I conclude that a pragmatist dissolution of the foundational difficulties of quantum mechanics is promising, but requires fairly radical changes to our understanding of the content of propositions and the extent of physical explanation.
A few years ago, I argued that according to spontaneous collapse theories of quantum mechanics, arithmetic applies to macroscopic objects only as an approximation. Several authors have written articles defending spontaneous collapse theories against this charge, including Bassi and Ghirardi, Clifton and Monton, and now Frigg. The arguments of these authors are all different and all ingenious, but in the end I think that none of them succeeds, for reasons I elaborate here. I suggest a fourth line of response, based on an analogy with epistemic paradoxes, which I think is the best way to defend spontaneous collapse theories, and which leaves my main thesis intact.
Spontaneous collapse theories of quantum mechanics require an interpretation if their claim to solve the measurement problem is to be vindicated. The most straightforward interpretation rule, the fuzzy link, generates a violation of common sense known as the counting anomaly. Recently, a consensus has developed that the mass density link provides an appropriate interpretation of spontaneous collapse theories that avoids the counting anomaly. In this paper, I argue that the mass density link violates common sense in just as striking a way as the fuzzy link, and hence should not be regarded as a problem-free alternative to the fuzzy link. Hence advocates of spontaneous collapse theories must accept some violation of common sense, although this is not necessarily fatal to their project.
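For readers unfamiliar with the counting anomaly, its structure can be conveyed by a toy calculation in Python. The tolerance and amplitudes below are illustrative assumptions; the fuzzy link, as standardly stated, counts 'the marble is in the box' as true just in case the squared amplitude for the marble being in the box is at least 1 - epsilon.

# Toy illustration of the counting anomaly (illustrative numbers).
# Fuzzy link: 'x is in the box' is true iff the squared amplitude for
# x being in the box is at least 1 - epsilon.

epsilon = 0.01   # fuzzy-link tolerance (assumed value)
p_in = 0.995     # each marble's squared amplitude for 'in the box'
n = 1000         # number of marbles

each_is_in = p_in >= 1 - epsilon              # True: each marble counts as 'in'
all_in_amplitude = p_in ** n                  # squared amplitude for 'all n in'
all_are_in = all_in_amplitude >= 1 - epsilon  # False: 'all n are in' comes out false

print(each_is_in, all_are_in)  # True False: each is in, yet not all are in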
I argued that anyone who adopts the Everettian approach to the foundations of quantum mechanics must also accept the (unpopular) ‘halfer’ solution to the Sleeping Beauty puzzle. Papineau and Durà-Vilà have responded with an argument that it is perfectly cogent both to be an Everettian and to accept the (popular) ‘thirder’ solution to Sleeping Beauty. Here I attempt to rebut their argument, and to clarify my original position.
It is widely acknowledged that the link between quantum language and ordinary language must be "fuzzier" than the traditional eigenstate-eigenvalue link. In the context of spontaneous-collapse theories, Albert and Loewer argue that the form of this fuzzy link is a matter of convention, and can be freely chosen to minimize anomalies for those theories. I defend the position that the form of the link is empirical, and could be such as to render collapse theories idle. This means that defenders of spontaneous-collapse theories must gamble that the actual form of the link renders such theories tenable.
A major problem facing no-collapse interpretations of quantum mechanics in the tradition of Everett is how to understand the probabilistic axiom of quantum mechanics (the Born rule) in the context of a deterministic theory in which every outcome of a measurement occurs. Deutsch claims to derive a decision-theoretic analogue of the Born rule from the non-probabilistic part of quantum mechanics and some non-probabilistic axioms of classical decision theory, and hence concludes that no probabilistic axiom is needed. I argue that Deutsch’s derivation begs the question.
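For reference, the probabilistic axiom at issue is the Born rule in its standard form: for a system in state

\[
|\psi\rangle = \sum_i a_i |i\rangle,
\]

a measurement in the \(|i\rangle\) basis yields outcome \(i\) with probability \(\Pr(i) = |a_i|^2\). Deutsch's claim is that a rational agent must weight the outcomes by exactly these \(|a_i|^2\) values, given only the non-probabilistic parts of quantum mechanics and decision theory.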
In 1994, Maudlin proposed an objection to retrocausal approaches to quantum mechanics in general, and to the transactional interpretation in particular, involving an absorber that changes location depending on the trajectory of the particle. Maudlin considered this objection fatal. However, the TI did not die; rather, a number of responses were developed, some attempting to accommodate Maudlin's example within the existing TI, and others modifying the TI. I argue that none of these responses is fully adequate. The reason, I submit, is that there are two aspects to Maudlin's objection; the more readily soluble aspect has received all the attention, but the more problematic aspect has gone unnoticed. I consider the prospects for developing a successful retrocausal quantum theory in light of this second aspect of the objection.
The main problem with the many-worlds theory is that it is not clear how the notion of probability should be understood in a theory in which every possible outcome of a measurement actually occurs. In this paper, I argue for the following theses concerning the many-worlds theory: (1) if probability can be applied at all to measurement outcomes, it must function as a measure of an agent’s self-location uncertainty; (2) such probabilities typically violate Reflection; (3) many-worlds branching does not have sufficient structure to admit self-location probabilities; (4) decision-theoretic arguments do not solve this problem.
Spontaneous collapse theories provide a promising solution to the measurement problem. But they also introduce a number of problems of their own concerning dimensionality, vagueness, and locality. In response to these problems, advocates of collapse theories have proposed various accounts of the primitive ontology of collapse theories—postulated underlying entities governed by the collapse theory and underwriting our observations. The most prominent of these are a mass density distribution over three-dimensional space, and a set of discrete “flash” events at space-time points. My argument here is that these primitive ontologies are redundant, in the sense that the structures exhibited by the primitive ontologies that allow them to solve the problems facing spontaneous collapse theories are also present in the wave function. But redundancy is not nonexistence; indeed, the fact that the relevant structures are already there in the wave function shows that the mass density ontology and the flash ontology exist whether they are explicitly postulated or not. By the same token, there is no need to decide between a wave function ontology, a mass density ontology and a flash ontology.
The Doomsday Argument and the Simulation Argument share certain structural features, and hence are often discussed together. Both are cases where reflecting on one’s location among a set of possibilities yields a counter-intuitive conclusion—in the first case that the end of humankind is closer than you initially thought, and in the second case that it is more likely than you initially thought that you are living in a computer simulation. Indeed, the two arguments do have some structural similarities. But there are also significant disanalogies between the two arguments, and I argue that these disanalogies mean that the Simulation Argument succeeds and the Doomsday Argument fails.
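The Doomsday shift mentioned here is standardly generated as follows (this is the textbook version of the argument, not necessarily the paper's own formulation). Treat your birth rank \(r\) as if it were a random sample from the total human population \(N\), so that \(P(r \mid N) = 1/N\) for \(r \le N\). Bayes's theorem then gives

\[
P(N \mid r) \propto P(r \mid N)\,P(N) = \frac{P(N)}{N} \quad (r \le N),
\]

so learning your rank shifts credence toward smaller values of \(N\): the end of humankind is closer than you initially thought.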
The call to supplement the quantum wave function with local beables is almost as old as quantum mechanics. But what exactly is the problem with the wave function as the representation of a quantum system? I canvass three potential problems with the wave function: the well-known problems of incompleteness and dimensionality, and the lesser known problem of non-locality introduced recently by Myrvold. Building on Myrvold's insight, I show that the standard ways of introducing local beables into quantum mechanics are unsuccessful. I consider whether we really need local beables, and assess the prospects for a new theory of local beables.
A local hidden variable theory of quantum mechanics is formulated by adapting Gell-Mann and Hartle’s many-histories formulation. The resulting theory solves the measurement problem by exploiting the independence loophole in Bell’s theorem; it violates the independence of hidden variable values and measuring device settings. Although the theory is problematic in some respects, it provides a concrete example via which the tenability of this approach can be better evaluated.
In cases of animal mimicry, the receiver of the signal learns the truth that he is either dealing with the real thing or with a mimic. Thus, despite being a prototypical example of animal deception, mimicry does not seem to qualify as deception on the traditional definition, since the receiver is not actually misled. We offer a new account of propositional content in sender-receiver games that explains how the receiver is misled by mimicry. We show that previous accounts of deception, and of propositional content, give incorrect results about whether certain signals are deceptive.
Protective measurement might be taken to put the last nail in the coffin of ensemble interpretations of the quantum state. My goal here is to show that even though ensemble interpretations face formidable obstacles, protective measurements don't lead to any additional difficulties. Rather, they provide us with a nice illustration of a conclusion for which we had considerable indirect evidence already, namely that quantum mechanics leads to a blurring of the distinction between the intrinsic properties of a system and the statistical properties of the ensemble of which it is a member. This conclusion goes for all realist interpretations of the quantum state, both the mainstream ones that take the wave function to be a real field and the more conjectural ones that take the wave function to describe our knowledge of an ensemble.
Interpretations of Quantum Mechanics. Quantum mechanics is a physical theory developed in the 1920s to account for the behavior of matter on the atomic scale. It has subsequently been developed into arguably the most empirically successful theory in the history of physics. However, it is hard to understand quantum mechanics as a description of the …
Accuracy-based arguments for conditionalization and probabilism appear to have a significant advantage over their Dutch Book rivals. They rely only on the plausible epistemic norm that one should try to decrease the inaccuracy of one's beliefs. Furthermore, it seems that conditionalization and probabilism follow from a wide range of measures of inaccuracy. However, we argue that among the measures in the literature, there are some from which one can prove conditionalization, others from which one can prove probabilism, and none from which one can prove both. Hence at present, the accuracy-based approach cannot underwrite both conditionalization and probabilism.
Accuracy-based arguments for conditionalization and probabilism appear to have a significant advantage over their Dutch Book rivals. They rely only on the plausible epistemic norm that one should try to decrease the inaccuracy of one’s beliefs. Furthermore, conditionalization and probabilism apparently follow from a wide range of measures of inaccuracy. However, we argue that there is an under-appreciated diachronic constraint on measures of inaccuracy which limits the measures from which one can prove conditionalization, and none of the remaining measures allow one to prove probabilism. That is, among the measures in the literature, there are some from which one can prove conditionalization, others from which one can prove probabilism, but none from which one can prove both. Hence at present, the accuracy-based approach cannot underwrite both conditionalization and probabilism.
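To fix terminology, here are two of the standard measures of inaccuracy from this literature, stated in Python for a single credence c in a proposition with truth value t. The definitions are the usual ones; which class of measures supports which argument is exactly what the paper disputes.

import math

# Inaccuracy of credence c in a proposition with truth value t (1 or 0).

def brier_inaccuracy(c, t):
    # Quadratic (Brier) inaccuracy: squared distance from the truth value.
    return (c - t) ** 2

def log_inaccuracy(c, t):
    # Logarithmic inaccuracy: -log of the credence given to the actual truth value.
    return -math.log(c if t == 1 else 1 - c)

print(brier_inaccuracy(0.8, 1))  # 0.04
print(log_inaccuracy(0.8, 1))    # ~0.22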
There is an important sense in which an agent’s credences are universal: while they reflect an agent’s own judgments, those judgments apply equally to everyone’s bets. This point, while uncontentious, has been overlooked; people automatically assume that credences concern an agent’s own bets, perhaps just because of the name “subjective” that is typically applied to this account of belief. This oversight has had unfortunate consequences for recent epistemology, in particular concerning the Sleeping Beauty case and its myriad variants.
Just as Bell proposed that we excise the word “measurement” from physics, so I propose that we should excise the word “experience”: “experience” and its cognates should not appear in the formulation of any physical theory, including quantum mechanics and its various interpretations. The reasons are more or less the same as Bell gives for “measurement”: “experience” is a vague term, and experiencing systems are made out of atoms obeying quantum mechanics. Bell’s exhortation concerning “measurement” has largely been taken on board in the foundations of quantum mechanics. But appeals to “experience” remain—in part, I will argue, because of a bad argument that can be traced back to von Neumann, and in part because of mistaken impressions about the fundamentality of experience.
Pragmatism about quantum mechanics provides an attractive approach to the question of what quantum mechanics says. However, the conclusions reached by pragmatists concerning the content of quantum mechanics cannot be squared with the way that physicists use quantum mechanics to describe physical systems. In particular, attention to actual use results in ascribing content to claims about physical systems over a much wider range of contexts than countenanced by recent pragmatists. The resulting account of the content of quantum mechanics is much closer to quantum logic, and threatens the pragmatist conclusion that quantum mechanics requires no supplementation.
Quantum mechanics arguably provides the best evidence we have for strong emergence. Entangled pairs of particles apparently have properties that fail to supervene on the properties of the particles taken individually. But at the same time, quantum mechanics is a terrible place to look for evidence of strong emergence: the interpretation of the theory is so contested that drawing any metaphysical conclusions from it is risky at best. I run through the standard argument for strong emergence based on entanglement, and show how it rests on shaky assumptions concerning the ontology of the quantum world. In particular, I consider two objections: that the argument involves Bell's theorem, whose premises are often rejected, and that the argument rests on a contested account of parts and wholes. I respond to both objections, showing that, with some important caveats, the argument for emergence based on quantum mechanics remains intact.
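The textbook example behind this argument is the spin singlet state of two particles:

\[
|\psi\rangle = \frac{1}{\sqrt{2}}\bigl(|{\uparrow}\rangle_A |{\downarrow}\rangle_B - |{\downarrow}\rangle_A |{\uparrow}\rangle_B\bigr).
\]

Each particle taken individually is in the maximally mixed state, which fixes no spin direction at all, yet the pair exhibits perfect anticorrelation in every direction. On the standard reading, this correlational property of the whole is not determined by the one-particle states taken individually, which is just the supervenience failure the abstract describes.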
In 1994, Maudlin proposed an objection to the transactional interpretation, involving an absorber that changes location depending on the trajectory of the particle. Maudlin considered this objection fatal. However, the TI did not die; rather, a number of responses were developed, some attempting to accommodate Maudlin's example within the existing TI, and others modifying the TI. I argue that none of these responses is fully adequate. The reason, I submit, is that there are two aspects to Maudlin's objection; the more readily soluble aspect has received all the attention, but the more problematic aspect has gone unnoticed. I consider the prospects for developing a successful version of the TI in light of this second aspect of the objection.
The Transactional Interpretation of quantum mechanics is a promising way of fulfilling Einstein’s vision of a completed quantum mechanics. However, it has received forceful criticism from Maudlin. Indeed, I shall argue that the force of Maudlin’s criticisms has been underestimated, and that none of the extant responses are adequate. An adequate response, I contend, requires reconceptualizing the kinds of explanations the Transactional Interpretation gives. I sketch such a reinterpretation and argue that it does not fall prey to Maudlin’s objections. However, there remain significant obstacles in the way of formulating a fully adequate version of the Transactional Interpretation.
Research ethics committees, while in many ways an excellent innovation, do have some drawbacks. This paper examines three of these. The first problem is that such committees' approval of specific projects in their own institutions acquires intrinsic value. The second problem relates to the possible devolution of responsibility from the investigator to the committee: the committee approves, the investigator feels relieved of some responsibility, and things can be done to patients which neither the committee nor the investigator might countenance if they had sole responsibility. The third problem arises directly from the bureaucratic nature of the committee itself. One consequence of the resulting rigid guidelines is the insistence, by most committees, on the written consent of patients. Demanding this can, in some circumstances, mean giving the patient very disturbing information. The paper suggests that, in trials comparing two accepted therapies in patients with a fatal disease, committees dispense with the requirement of written consent. There is a commentary on this paper by Dr D J Weatherall of the Nuffield Department of Clinical Medicine, University of Oxford.