A modest proposal concerning laws, counterfactuals, and explanations -- Why be Humean? -- Suggestions from physics for deep metaphysics -- On the passing of time -- Causation, counterfactuals, and the third factor -- The whole ball of wax -- Epilogue: a remark on the method of metaphysics.
In this ingenious and powerfully argued book Tim Maudlin sets out a novel account of logic and semantics which allows him to deal with certain notorious paradoxes which have bedevilled philosophical theories of truth. All philosophers interested in logic and language will find this a stimulating read.
This paper sketches a taxonomy of forms of substantivalism and relationism concerning space and time, and of the traditional arguments for these positions. Several natural sorts of relationism are able to account for Newton's bucket experiment. Conversely, appropriately constructed substantivalism can survive Leibniz's critique, a fact which has been obscured by the conflation of two of Leibniz's arguments. The form of relationism appropriate to the Special Theory of Relativity is also able to evade the problems raised by Field. I survey the effect of the General Theory of Relativity and of plenism on these considerations.
The standard mathematical account of the sub-metrical geometry of a space employs topology, whose foundational concept is the open set. This proves to be an unhappy choice for discrete spaces, and offers no insight into the physical origin of geometrical structure. I outline an alternative, the Theory of Linear Structures, whose foundational concept is the line. Application to Relativistic space-time reveals that the whole geometry of space-time derives from temporal structure. In this sense, instead of spatializing time, Relativity temporalizes space.
This concise book introduces nonphysicists to the core philosophical issues surrounding the nature and structure of space and time, and is also an ideal resource for physicists interested in the conceptual foundations of space-time theory. Tim Maudlin's broad historical overview examines Aristotelian and Newtonian accounts of space and time, and traces how Galileo's conceptions of relativity and space-time led to Einstein's special and general theories of relativity. Maudlin explains special relativity using a geometrical approach, emphasizing intrinsic space-time structure rather than coordinate systems or reference frames. He gives readers enough detail about special relativity to solve concrete physical problems while presenting general relativity in a more qualitative way, with an informative discussion of the geometrization of gravity, the bending of light, and black holes. Additional topics include the Twins Paradox, the physical aspects of the Lorentz-FitzGerald contraction, the constancy of the speed of light, time travel, and the direction of time. The book requires only basic algebra and no formal knowledge of physics.
The aim of this essay is to distinguish and analyze several difficulties confronting attempts to reconcile the fundamental quantum mechanical dynamics with Born's rule. It is shown that many of the proposed accounts of measurement fail to solve at least one of these problems. In particular, only collapse theories and hidden variables theories have a chance of succeeding, and, of the latter, the modal interpretations fail. Any real solution demands new physics.
It has long been a commonplace that there is a problem understanding the role of time when one tries to quantize the General Theory of Relativity (GTR). In his "Thoroughly Modern McTaggart" (Philosophers' Imprint Vol 2, No. 3), John Earman presents several arguments to the conclusion that there is a problem understanding change and the passage of time in the unadorned GTR, quite apart from quantization. His Young McTaggart argues that according to the GTR, no physical magnitude ever changes. A close consideration of Young McTaggart's arguments shows that they turn on either a bad choice of formalism or an unwarranted interpretation of the implications of the formalism. This suggests that the problems that arise in quantization may be founded in similar shortcomings.
I argue that Norton & Earman's hole argument, despite its historical association with General Relativity, turns upon very general features of any linguistic system that can represent substances by names. After exploring various means by which mathematical objects can be interpreted as representing physical possibilities, I suggest that a form of essentialism can solve the hole dilemma without abandoning either determinism or substantivalism. Finally, I identify the basic tenets of such an essentialism in Newton's writings and consider how they can be updated to apply to the case provided by General Relativity.
Any empirical physical theory must have implications for observable events at the scale of everyday life, even though that scale plays no special role in the basic ontology of the theory itself. The fundamental physical scales are microscopic for the “local beables” of the theory and universal scale for the non-local beables. This situation creates strong demands for any precise quantum theory. This paper examines those constraints, and illustrates some ways in which they can be met.
Abraham Stone recently has published an argument purporting to show that David Bohm's interpretation of quantum mechanics fails to solve the measurement problem. Stone's analysis is not correct, as he has failed to take account of the conditions under which the theorems he cites are proven. An explicit presentation of a Bohmian measurement illustrates the flaw in his reasoning.
Richard Healey argues that the Aharonov-Bohm effect demands the recognition of either nonlocal or nonseparable physics in much the way that violations of Bell's inequality do. A careful examination of the effect and the arguments, though, shows that Healey's interpretation of the Aharonov-Bohm effect depends critically on his interpretation of gauge theories, and that the analogy with violations of Bell's inequalities fails.
There are various senses in which a physical theory may be said to "unify" different forces, with the unification being deeper or more shallow in different cases. This paper discusses some of these distinctions.
We criticize speculations to the effect that quantum mechanics is fundamentally about information. We do this by pointing out how unfounded such speculations in fact are. Our analysis focuses on the dubious claims of this kind recently made by Anton Zeilinger.
In "Temporal Passage and the 'No Alternate Possibilities Argument'", Jonathan Tallant takes up an objection to temporal passage based on the observation that if time passes at the rate of one second per second, there is no other possible rate at which it could pass. The argument rests on the premise that if time passes at some rate then it could have passed at some other rate. Since no alternative rate seems to be coherent, one concludes that time cannot pass at all. The obvious weak point of the No Alternate Possibilities Argument is this premise itself.
This paper demonstrates that John Wheeler and Richard Feynman's strategy for avoiding causal paradoxes threatened by backward causation and time-travel can be defeated by designing self-interacting mechanisms with a non-simple topological structure. Time-travel therefore requires constraints on the allowable data on space-like hypersurfaces. The nature and significance of these constraints is discussed.
Violations of Bell's Inequality can only be reliably produced if some information about the apparatus setting on one wing is available on the other, requiring superluminal information transmission. In this paper I inquire into the minimum amount of information needed to generate quantum statistics for correlated photons. Reflection on informational constraints clarifies the significance of Fine's Prism models, and allows the construction of several models more powerful than Fine's. These models are more efficient than Fine claims to be possible and work for the full range of possible analyzer settings. The analysis also demonstrates that the division of theories into those that violate parameter independence and those that violate outcome independence sheds no light on the question of superluminal information transmission.