It is proposed that we use the term “approximation” for an inexact description of a target system and “idealization” for another system whose properties also provide an inexact description of the target system. Since systems generated by a limiting process can often have quite unexpected, even inconsistent properties, familiar limit systems used in statistical physics can fail to provide idealizations and are merely approximations. A dominance argument suggests that the limiting idealizations of statistical physics should be demoted to approximations.
Contrary to formal theories of induction, I argue that there are no universal inductive inference schemas. The inductive inferences of science are grounded in matters of fact that hold only in particular domains, so that all inductive inference is local. Some are so localized as to defy familiar characterization. Since inductive inference schemas are underwritten by facts, we can assess and control the inductive risk taken in an induction by investigating the warrant for its underwriting facts. In learning more facts, we extend our inductive reach by supplying more localized inductive inference schemes. Since a material theory no longer separates the factual and schematic parts of an induction, it proves not to be vulnerable to Hume's problem of the justification of induction.
Newton’s equations of motion tell us that a mass at rest at the apex of a dome with the shape specified here can spontaneously move. It has been suggested that this indeterminism should be discounted since it draws on an incomplete rendering of Newtonian physics, or it is “unphysical,” or it employs illicit idealizations. I analyze and reject each of these reasons.
The epistemic state of complete ignorance is not a probability distribution. In it, we assign the same, unique, ignorance degree of belief to any contingent outcome and each of its contingent, disjunctive parts. That this is the appropriate way to represent complete ignorance is established by two instruments, each individually strong enough to identify this state. They are the principle of indifference (PI) and the notion that ignorance is invariant under certain redescriptions of the outcome space, here developed into the ‘principle of invariance of ignorance’ (PII). Both instruments are so innocuous as almost to be platitudes. Yet the literature in probabilistic epistemology has misdiagnosed them as paradoxical or defective since they generate inconsistencies when conjoined with the assumption that an epistemic state must be a probability distribution. To underscore the need to drop this assumption, I express PII in its most defensible form as relating symmetric descriptions and show that paradoxes still arise if we assume the ignorance state to be a probability distribution.
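The inconsistency that arises when ignorance is forced into a probability distribution can be seen in a toy computation. The outcome labels and the two partitions below are my own illustrative choices, not the paper's:

```python
from fractions import Fraction

def indifference(outcomes):
    """Assign equal probability to each outcome (principle of indifference)."""
    return {o: Fraction(1, len(outcomes)) for o in outcomes}

# Coarse description: a ball is red or it is not.
coarse = indifference(["red", "not-red"])

# Redescription of the same situation: the ball is red, blue, or green.
fine = indifference(["red", "blue", "green"])

# The same proposition, "the ball is red", receives two different values,
# so indifference and probabilistic representation cannot both survive
# the redescription.
print(coarse["red"])  # 1/2
print(fine["red"])    # 1/3
```

The point of the sketch is only that applying indifference to two symmetric redescriptions of one outcome space yields incompatible probability assignments, which is the paradox the paper traces to the probability assumption rather than to PI or PII.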
An infinite lottery machine is used as a foil for testing the reach of inductive inference, since inferences concerning it require novel extensions of probability. Its use is defensible if there is some sense in which the lottery is physically possible, even if exotic physics is needed. I argue that exotic physics is needed and describe several proposals that fail and at least one that succeeds well enough.
I deny that the world is fundamentally causal, deriving the skepticism on non-Humean grounds from our enduring failures to find a contingent, universal principle of causality that holds true of our science. I explain the prevalence and fertility of causal notions in science by arguing that a causal character for many sciences can be recovered, when they are restricted to appropriately hospitable domains. There they conform to a loose collection of causal notions that form a folk science of causation. This recovery of causation exploits the same generative power of reduction relations that allows us to recover gravity as a force from Einstein's general relativity and heat as a conserved fluid, the caloric, from modern thermal physics, when each theory is restricted to appropriate domains. Causes are real in science to the same degree as caloric and gravitational forces.
Constructivists, such as Harvey Brown, urge that the geometries of Newtonian and special relativistic spacetimes result from the properties of matter. Whatever this may mean, it commits constructivists to the claim that these spacetime geometries can be inferred from the properties of matter without recourse to spatiotemporal presumptions or with few of them. I argue that the construction project only succeeds if constructivists antecedently presume the essential commitments of a realist conception of spacetime. These commitments can be avoided only by adopting an extreme form of operationalism.
Thought experiments are ordinary argumentation disguised in a vivid pictorial or narrative form. This account of their nature will allow me to show that empiricism has nothing to fear from thought experiments. They perform no epistemic magic. In so far as they tell us about the world, thought experiments draw upon what we already know of it, either explicitly or tacitly; they then transform that knowledge by disguised argumentation. They can do nothing more epistemically than can argumentation. I defend my account of thought experiments in Section 3 by urging that the epistemic reach of thought experiments turns out to coincide with that of argumentation and that this coincidence is best explained by the simple view that thought experiments just are arguments. Thought experiments can err, a fact displayed by the thought experiment and anti-thought-experiment pairs of Section 2. Nonetheless thought experiments can be used reliably and, I urge in Section 4, this is only possible if they are governed by some very generalized logic. I will suggest on evolutionary considerations that their logics are most likely the familiar logics of induction and deduction, recovering the view that thought experiment is argumentation. Finally in Section 5 I defend this argument-based epistemology of thought experiments against competing accounts. I suggest that these other accounts can offer a viable epistemology only insofar as they already incorporate the notion that thought experimentation is governed by a logic, possibly of very generalized form.
In this first part of a two-part paper, we describe efforts in the early decades of this century to restrict the extent of violations of the Second Law of thermodynamics that were brought to light by the rise of the kinetic theory and the identification of fluctuation phenomena. We show how these efforts mutated into Szilard’s proposal that Maxwell’s Demon is exorcised by proper attention to the entropy costs associated with the Demon’s memory and information acquisition. In the second part we will argue that the information theoretic exorcisms of the Demon provide largely illusory benefits. Depending on the case, they either return a presupposition that can be had without information theoretic consideration or they postulate a broader connection between information and entropy than can be sustained.
While there is no universal logic of induction, the probability calculus succeeds as a logic of induction in many contexts through its use of several notions concerning inductive inference. They include Addition, through which low probabilities represent disbelief as opposed to ignorance; and the Bayes property, which commits the calculus to a ‘refute and rescale’ dynamics for incorporating new evidence. These notions are independent and it is urged that they be employed selectively according to the needs of the problem at hand. It is shown that neither is adapted to inductive inference concerning some indeterministic systems.
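The ‘refute and rescale’ dynamics of Bayesian conditionalization can be sketched in a few lines; the die example below is my own, chosen only for concreteness:

```python
from fractions import Fraction

def conditionalize(prior, evidence):
    """Bayesian 'refute and rescale': outcomes refuted by the evidence are
    zeroed out, and the surviving outcomes are rescaled to sum to one."""
    surviving = {w: p for w, p in prior.items() if w in evidence}
    total = sum(surviving.values())
    return {w: p / total for w, p in surviving.items()}

# A die roll with a uniform prior over outcomes 1..6.
prior = {w: Fraction(1, 6) for w in range(1, 7)}

# Learning "the roll is even" refutes 1, 3, 5 and rescales 2, 4, 6.
posterior = conditionalize(prior, {2, 4, 6})
print(posterior[2])  # 1/3
```

Note how the dynamics can only eliminate and renormalize: evidence never redistributes belief among the surviving outcomes, which is one way the calculus builds in substantive assumptions about inductive inference.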
Must a Maxwell demon fail to reverse the second law of thermodynamics? Standard attempts to show that it must fail make use of notions of information and computation. None of these attempts has succeeded. Worse, they have distracted both supporters and opponents of these attempts from a much simpler demonstration of the necessary failure of a Maxwell's demon that employs no notions of information or computation. It requires only Liouville's theorem and its quantum analog.
In a formal theory of induction, inductive inferences are licensed by universal schemas. In a material theory of induction, inductive inferences are licensed by facts. With this change in the conception of the nature of induction, I argue that the celebrated “problem of induction” can no longer be set up and is thereby dissolved. Attempts to recreate the problem in the material theory of induction fail. They require relations of inductive support to conform to an unsustainable, hierarchical empiricism.
Bayesian probabilistic explication of inductive inference conflates neutrality of supporting evidence for some hypothesis H (“not supporting H”) with disfavoring evidence (“supporting not-H”). This expressive inadequacy leads to spurious results that are artifacts of a poor choice of inductive logic. I illustrate how such artifacts have arisen in simple inductive inferences in cosmology. In the inductive disjunctive fallacy, neutral support for many possibilities is spuriously converted into strong support for their disjunction. The Bayesian “doomsday argument” is shown to rely entirely on a similar artifact, for the result disappears in a reanalysis that employs fragments of inductive logic able to represent evidential neutrality. Finally, the mere supposition of a multiverse is not yet enough to warrant the introduction of probabilities without some factual analog of a randomizer over the multiverses.
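The arithmetic behind the inductive disjunctive fallacy is elementary; the following sketch, with an arbitrarily chosen n, shows how additivity converts evidential neutrality into near-certain support for a disjunction:

```python
from fractions import Fraction

# Suppose the evidence is neutral over n mutually exclusive possibilities.
# A Bayesian must still spread probability over them; additivity then
# converts that neutrality into near-certainty for their disjunction.
n = 1000
neutral = Fraction(1, n)          # "neutral" support for each possibility
all_but_one = (n - 1) * neutral   # disjunction of all but the first
print(all_but_one)  # 999/1000: strong "support" from mere neutrality
```

An inductive logic that can represent neutrality directly, rather than forcing it into a probability, assigns the disjunction the same neutral support as each disjunct, and the spurious near-certainty disappears.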
Landauer's Principle asserts that there is an unavoidable cost in thermodynamic entropy creation when data is erased. It is usually derived from incorrect assumptions, most notably, that erasure must compress the phase space of a memory device or that thermodynamic entropy arises from the probabilistic uncertainty of random data. Recent work seeks to prove Landauer’s Principle without using these assumptions. I show that the processes assumed in the proof, and in the thermodynamics of computation more generally, can be combined to produce devices that both violate the second law and erase data without entropy cost, indicating an inconsistency in the theoretical system. Worse, the standard repertoire of processes selectively neglects thermal fluctuations. Concrete proposals for how we might measure dissipationlessly and expand single molecule gases reversibly are shown to be fatally disrupted by fluctuations. Reversible, isothermal processes on molecular scales are shown to be disrupted by fluctuations that can only be overcome by introducing entropy creating, dissipative processes.
Landauer’s principle is the loosely formulated notion that the erasure of n bits of information must always incur a cost of k ln n in thermodynamic entropy. It can be formulated as a precise result in statistical mechanics, but for a restricted class of erasure processes that use a thermodynamically irreversible phase space expansion, which is the real origin of the law’s entropy cost and whose necessity has not been demonstrated. General arguments that purport to establish the unconditional validity of the law fail. They turn out to depend on the illicit formation of a canonical ensemble from memory devices holding random data. To exorcise Maxwell’s demon one must show that all candidate devices—the ordinary and the extraordinary—must fail to reverse the second law of thermodynamics. The theorizing surrounding Landauer’s principle is too fragile and too tied to a few specific examples to support such general exorcism. Charles Bennett’s recent extension of Landauer’s principle to the merging of computational paths fails for the same reasons as trouble the original principle.
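For orientation, the restricted statistical-mechanical form of the principle can be sketched as follows; this is a standard textbook-style calculation, not a quotation of the paper's own derivation. For a memory device holding one of $n$ equally probable states, erasure by an irreversible phase space expansion followed by compression to the reset state reduces the accessible phase volume by a factor of $n$, so the entropy passed to the environment is at least

```latex
\Delta S_{\mathrm{env}} \;\ge\; k \ln n ,
```

which reduces to the familiar $k \ln 2$ per bit when $n = 2$. The restriction matters: on the paper's analysis it is the irreversible expansion step, not erasure as such, that generates this cost, and the necessity of that step has not been demonstrated.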
Standard descriptions of thermodynamically reversible processes attribute contradictory properties to them: they are in equilibrium yet still change their state. Or they are composed of non-equilibrium states that are so close to equilibrium that the difference does not matter. One cannot have states that both change and do not change at the same time. In place of this internally contradictory characterization, the term “thermodynamically reversible process” is here construed as a label for a set of real processes of change involving only non-equilibrium states. The properties usually attributed to a thermodynamically reversible process are recovered as the limiting properties of this set. No single process, that is, no system undergoing change, equilibrium or otherwise, carries those limiting properties. The paper concludes with an historical survey of characterizations of thermodynamically reversible processes and a critical analysis of their shortcomings.
In a material theory of induction, inductive inferences are warranted by facts that prevail locally. This approach, it is urged, is preferable to formal theories of induction in which the good inductive inferences are delineated as those conforming to some universal schema. An inductive inference problem concerning indeterministic, non-probabilistic systems in physics is posed and it is argued that Bayesians cannot responsibly analyze it, thereby demonstrating that the probability calculus is not the universal logic of induction.
Einstein offered the principle of general covariance as the fundamental physical principle of his general theory of relativity, and as responsible for extending the principle of relativity to accelerated motion. This view was disputed almost immediately with the counter-claim that the principle was no relativity principle and was physically vacuous. The disagreement persists today. This article reviews the development of Einstein's thought on general covariance, its relation to the foundations of general relativity and the evolution of the continuing debate over his viewpoint.
That past patterns may continue in many different ways has long been identified as a problem for accounts of induction. The novelty of Goodman’s “new riddle of induction” lies in a meta-argument that purports to show that no account of induction can discriminate between incompatible continuations. That meta-argument depends on the perfect symmetry of the definitions of grue/bleen and green/blue, so that any evidence that favors the ordinary continuation must equally favor the grue-ified continuation. I argue that this very dependence on the perfect symmetry defeats the novelty of the new riddle. The symmetry can be obtained in contrived circumstances, such as when we grue-ify our total science. However, in all such cases, we cannot preclude the possibility that the original and grue-ified descriptions are merely notationally variant descriptions of the same physical facts; or if there are facts that separate them, these facts are ineffable, so that no account of induction should be expected to pick between them. In ordinary circumstances, there are facts that distinguish the regular and grue-ified descriptions. Since accounts of induction can and do call upon these facts, Goodman’s meta-argument cannot provide principled grounds for the failure of all accounts of induction. It assures us only of the failure of accounts of induction, such as unaugmented enumerative induction, that cannot exploit these symmetry breaking facts.
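The perfect symmetry of the definitions can be displayed directly; the critical time T and the predicates below are the usual illustrative reconstruction, not Goodman's own formulation:

```python
T = 2100  # illustrative critical time for the grue/bleen definitions

def grue(color, t):
    """x is grue iff x is green and examined before T, or blue otherwise."""
    return color == "green" if t < T else color == "blue"

def bleen(color, t):
    """x is bleen iff x is blue and examined before T, or green otherwise."""
    return color == "blue" if t < T else color == "green"

def green_from(is_grue, is_bleen, t):
    """The mirrored definition: x is green iff x is grue and examined
    before T, or bleen otherwise. So green/blue are definable from
    grue/bleen exactly as grue/bleen are definable from green/blue."""
    return is_grue if t < T else is_bleen

# Check the interdefinability on both sides of T.
for color in ("green", "blue"):
    for t in (2000, 2200):
        g, b = grue(color, t), bleen(color, t)
        assert green_from(g, b, t) == (color == "green")
```

The check confirms that neither vocabulary is formally privileged over the other, which is exactly why the meta-argument needs, and is defeated by, facts outside the definitions to break the symmetry.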
Thought experiments in science are merely picturesque argumentation. I support this view in various ways, including the claim that it follows from the fact that thought experiments can err but can still be used reliably. The view is defended against alternatives proposed by my cosymposiasts.
1. Approximations of arbitrarily large but finite systems are often mistaken for infinite idealizations in statistical and thermal physics. The problem is illustrated by thermodynamically reversible processes. They are approximations of processes requiring arbitrarily long, but finite, times to complete, not processes requiring an actual infinity of time. 2. The present debate over whether phase transitions comprise a failure of reduction is confounded by a confusion of two senses of “level”: the molecular versus the thermodynamic level and the few component versus the many component level.
The standard theory of computation excludes computations whose completion requires an infinite number of steps. Malament-Hogarth spacetimes admit observers whose pasts contain entire future-directed, timelike half-curves of infinite proper length. We investigate the physical properties of these spacetimes and ask whether they and other spacetimes allow the observer to know the outcome of a computation with infinitely many steps.
Monte Carlo simulations arrive at their results by introducing randomness, sometimes derived from a physical randomizing device. Nonetheless, we argue, they open no new epistemic channels beyond that already employed by traditional simulations: the inference by ordinary argumentation of conclusions from assumptions built into the simulations. We show that Monte Carlo simulations cannot produce knowledge other than by inference, and that they resemble other computer simulations in the manner in which they derive their conclusions. Simple examples of Monte Carlo simulations are analysed to identify the underlying inferences.
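A minimal example of the point is a Monte Carlo estimate of pi; the code is my own sketch, not drawn from the paper. The randomness only samples consequences of the assumptions built into the simulation (uniformly distributed points, the geometry of the circle), so the conclusion is reached by ordinary inference from those assumptions:

```python
import random

def estimate_pi(n_samples, seed=0):
    """Estimate pi by sampling points in the unit square and counting
    the fraction that land inside the quarter circle x^2 + y^2 <= 1."""
    rng = random.Random(seed)
    inside = sum(
        1 for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    # Area of quarter circle / area of square = pi/4.
    return 4.0 * inside / n_samples

print(estimate_pi(100_000))  # close to 3.14159
```

Every ingredient of the answer (the uniform distribution, the inclusion test, the factor of 4) is an assumption or a deductive step; the sampling merely traces out their joint consequences, which is the sense in which no new epistemic channel is opened.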
Einstein located the foundations of general relativity in simple and vivid physical principles: the principle of equivalence, an extended principle of relativity and Mach's principle. While these ideas played an important heuristic role in Einstein's thinking, they provide a dubious logical foundation for his final theory. Einstein was also guided to his final theory, I argue, by a second tier of more prosaic heuristics. I trace one strand among them. The principle of equivalence guided Einstein well until it led him to a theory that contradicted the conservation of momentum. Einstein converted the requirement of conservation of energy and momentum into a procedure that he used repeatedly for finding gravitational field equations. That procedure survives in present day developments of general relativity.
In the sixth section of his light quantum paper of 1905, Einstein presented the miraculous argument, as I shall call it. Pointing out an analogy with ideal gases and dilute solutions, he showed that the macroscopic, thermodynamic properties of high frequency heat radiation carry a distinctive signature of finitely many, spatially localized, independent components and so inferred that it consists of quanta. I describe how Einstein’s other statistical papers of 1905 had already developed and exploited the idea that the ideal gas law is another macroscopic signature of finitely many, spatially localized, independent components and that these papers in turn drew on his first two, “worthless” papers of 1901 and 1902 on intermolecular forces. However, while the ideal gas law was a secure signature of independence, it was harder to use as an indicator that there are finitely many components and that they are spatially localized. Further, since his analysis of the ideal gas law depended on the assumption that the number of components was fixed, its use was precluded for heat radiation, whose component quanta vary in number in most processes. So Einstein needed and found another, more powerful signature of discreteness applicable to heat radiation and which indicated all these properties. It used one of the few processes, volume fluctuation, in which heat radiation does not alter the number of quanta.
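The miraculous argument can be compressed into two formulas; this is the standard reconstruction, stated here only for orientation. For $n$ independent, spatially localized components in a volume $V_0$, the probability that a spontaneous fluctuation confines them all to a subvolume $V$ is

```latex
W = \left(\frac{V}{V_0}\right)^{n},
\qquad\text{so by Boltzmann's principle}\qquad
S - S_0 = k \ln W = n\,k \ln\!\frac{V}{V_0}.
```

Einstein showed from the measured thermodynamics of high frequency (Wien regime) heat radiation of energy $E$ that its entropy has exactly this volume dependence with exponent $E/h\nu$ in place of $n$: the macroscopic signature of $E/h\nu$ finitely many, localized, independent quanta.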
Einstein proclaimed that we could discover true laws of nature by seeking those with the simplest mathematical formulation. He came to this viewpoint later in his life. In his early years and work he was quite hostile to this idea. Einstein did not develop his later Platonism from a priori reasoning or aesthetic considerations. He learned the canon of mathematical simplicity from his own experiences in the discovery of new theories, most importantly, his discovery of general relativity. Through his neglect of the canon, he realised that he delayed the completion of general relativity by three years and nearly lost priority in discovery of its gravitational field equations.
According to the underdetermination thesis, all evidence necessarily underdetermines any scientific theory. Thus it is often argued that our agreement on the content of mature scientific theories must be due to social and other factors. Drawing on a long-standing tradition of criticism, I shall argue that the underdetermination thesis is little more than speculation based on an impoverished account of induction. A more careful look at accounts of induction does not support an assured underdetermination or the holism usually associated with it. I also urge that the display of observationally equivalent theories is a self-defeating strategy for supporting the underdetermination thesis. The very fact that observational equivalence can be demonstrated by arguments brief enough to be included in a journal article means that we cannot preclude the possibility that the theories are merely variant formulations of the same theory.
How can we reconcile two claims that are now both widely accepted: Kretschmann's claim that a requirement of general covariance is physically vacuous and the standard view that the general covariance of general relativity expresses the physically important diffeomorphism gauge freedom of general relativity? I urge that both claims can be held without contradiction if we attend to the context in which each is made.
Proponents of Bayesian confirmation theory believe that they have the solution to a significant, recalcitrant problem in philosophy of science. It is the identification of the logic that governs evidence and its inductive bearing in science. That is the logic that lets us say that our catalog of planetary observations strongly confirms Copernicus’ heliocentric hypothesis; or that the fossil record is good evidence for the theory of evolution; or that the 3°K cosmic background radiation supports big bang cosmology. The definitive solution to this problem would be a significant achievement. The problem is of central importance to philosophy of science, for, in the end, what distinguishes science from myth making is that we have good evidence for the content of science, or at least of mature sciences, whereas myths are evidentially ungrounded fictions. The core ideas shared by all versions of Bayesian confirmation theory are, at a good first approximation, that a scientist’s beliefs are or should conform to a probability measure; and that the incorporation of new evidence is through conditionalization using Bayes’ theorem. While the burden of this chapter will be to inventory why critics believe this theory may not be the solution after all, it is worthwhile first to summarize here the most appealing virtues of this simple account. There are three. First, the theory reduces the often nebulous notion of a logic of ...
Please imagine a long fuse hanging down from the ceiling. It is a carefully woven tube of fabric that holds a core of gunpowder. We note that it is beautifully made, with brightly colored threads intertwined with the coarser bare cotton. It is a masterpiece of the modern weaver's art.
The duality of truth and falsity in a Boolean algebra of propositions is used to generate a duality of belief and disbelief. To each additive probability measure that represents belief there corresponds a dual additive measure that represents disbelief. The dual measure has its own peculiar calculus, in which, for example, measures are added when propositions are combined under conjunction. A Venn diagram of the measure has the contradiction as its total space. While additive measures are not self-dual, the epistemic state of complete ignorance is represented by the unique, monotonic, non-additive measure that is self-dual in its contingent propositions. Convex sets of additive measures fail to represent complete ignorance since they are not self-dual.
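The dual calculus can be checked mechanically on a small outcome space; the four-point space below is my own illustration. Taking the dual measure of a proposition to be the probability of its complement, measures add under conjunction whenever the conjuncts jointly exhaust the space:

```python
from fractions import Fraction

space = {1, 2, 3, 4}
P = {w: Fraction(1, 4) for w in space}  # any additive measure would do

def prob(A):
    """Additive belief measure of a proposition (a set of outcomes)."""
    return sum(P[w] for w in A)

def dual(A):
    """Dual 'disbelief' measure: the probability of the complement."""
    return prob(space - A)

A = {1, 2, 3}
B = {2, 3, 4}

# A and B jointly exhaust the space, so in the dual calculus their
# measures add under conjunction (set intersection):
assert A | B == space
assert dual(A & B) == dual(A) + dual(B)
print(dual(A & B))  # 1/2
```

The mirror image of ordinary additivity is visible here: where probabilities add over exclusive disjuncts, dual measures add over exhaustive conjuncts, and the contradiction (the empty set) plays the role of the total space.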
In this second part of our two-part paper we review and analyse attempts since 1950 to use information theoretic notions to exorcise Maxwell’s Demon. We argue through a simple dilemma that these attempted exorcisms are ineffective, whether they follow Szilard in seeking a compensating entropy cost in information acquisition or Landauer in seeking that cost in memory erasure. In so far as the Demon is a thermodynamic system already governed by the Second Law, no further supposition about information and entropy is needed to save the Second Law. In so far as the Demon fails to be such a system, no supposition about the entropy cost of information acquisition and processing can save the Second Law from the Demon.
The thesis that observation necessarily fails to determine theory is false in the sense that observation can provide overwhelming evidence for a particular theory or even a hypothesis within the theory. The saga of quantum discontinuity illustrates the power of evidence to determine theory and shows how that power can be underestimated by inadequate caricatures of the evidential case. That quantum discontinuity can save the phenomena of black body radiation is the widely known result, but it leaves open the possibilities of other accounts. That these phenomena, with the aid of minimal assumptions, entail quantum discontinuity is the crucial but now largely forgotten result. It was first demonstrated by Ehrenfest and Poincaré in 1911 and 1912.
My purpose in this chapter is to survey some of the principal approaches to inductive inference in the philosophy of science literature. My first concern will be the general principles that underlie the many accounts of induction in this literature. When these accounts are considered in isolation, as is more commonly the case, it is easy to overlook that virtually all accounts depend on one of very few basic principles and that the proliferation of accounts can be understood as efforts to ameliorate the weaknesses of those few principles. In the earlier sections, I will lay out three inductive principles and the families of accounts of induction they engender. In later sections I will review standard problems in the philosophical literature that have supported some pessimism about induction and suggest that their import has been greatly overrated. In the final sections I will return to the proliferation of accounts of induction that frustrates efforts at a final codification. I will suggest that this proliferation appears troublesome only as long as we expect inductive inference to be subsumed under a single formal theory. If we adopt a material theory of induction in which individual inductions are licensed by particular facts that prevail only in local domains, then the proliferation is expected and not problematic.
The replicability of experiment is routinely offered as the gold standard of evidence. I argue that it is not supported by a universal principle of replicability in inductive logic. A failure of replication may not impugn a credible experimental result; and a successful replication can fail to vindicate an incredible experimental result. Rather, employing a material approach to inductive inference, the evidential import of successful replication of an experiment is determined by the prevailing background facts. Commonly, these background facts do support successful replication as a good evidential guide and this has fostered the illusion of a deeper, exceptionless principle.
The objection that Einstein's principle of general covariance is not a relativity principle and has no physical content is reviewed. The principal escapes offered for Einstein's viewpoint are evaluated.
I am grateful to Peter Achinstein, Don Howard, and the other participants at the conference, 'The Role of Experiments in Scientific Change', Virginia Polytechnic Institute and State University, 30 March to 1 April, 1990, for helpful discussion, and especially to Ron Laymon for his discussion comments presented at the conference on an earlier version of this paper.
Einstein learned from the magnet and conductor thought experiments how to use field transformation laws to extend the covariance to Maxwell’s electrodynamics. Had he persisted in his use of this device, he would have found that the theory cleaves into two Galilean covariant parts, each with different field transformation laws. The tension between the two parts reflects a failure not mentioned by Einstein: that the relativity of motion manifested by observables in the magnet and conductor thought experiment does not extend to all observables in electrodynamics. An examination of Ritz’s work shows that Einstein’s early view could not have coincided with Ritz’s on an emission theory of light, but only with that of a conveniently reconstructed Ritz. One Ritz-like emission theory, attributed by Pauli to Ritz, proves to be a natural extension of the Galilean covariant part of Maxwell’s theory that happens also to accommodate the magnet and conductor thought experiment. Einstein's famous chasing a light beam thought experiment fails as an objection to an ether-based, electrodynamical theory of light. However it would allow Einstein to formulate his general objections to all emission theories of light in a very sharp form. Einstein found two well known experimental results of 18th and 19th century optics compelling (Fizeau’s experiment, stellar aberration), while the accomplished Michelson-Morley experiment played no memorable role. I suggest they owe their importance to their providing a direct experimental grounding for Lorentz’ local time, the precursor of Einstein’s relativity of simultaneity, and do it essentially independently of electrodynamical theory. I attribute Einstein’s success to his determination to implement a principle of relativity in electrodynamics, but I urge that we not invest this stubbornness with any mystical prescience.
Mathias Frisch has argued that the requirement that electromagnetic dispersion processes are causal adds empirical content not found in electrodynamic theory. I urge that this attempt to reconstitute a local principle of causality in physics fails. An independent principle is not needed to recover the results of dispersion theory. The use of ‘causality conditions’ proves to be the mere adding of causal labels to an already presumed fact. If instead one seeks a broader, independently formulated grounding for the conditions, that grounding either fails or dissolves into vagueness and ambiguity, as has traditionally been the fate of candidate principles of causality. Contents: Introduction; Scattering in Classical Electrodynamics; Sufficiency of the Physics; Failure of the Principle of Causality Proposed (4.1 A sometimes principle; 4.2 The conditions of applicability are obscure; 4.3 Effects can come before their causes; 4.4 Vagueness of the relata and of the notion of causal process); Conclusion.
The advent of the special theory of relativity in 1905 brought many problems for the physics community. One, it seemed, would not be a great source of trouble. It was the problem of reconciling Newtonian gravitation theory with the new theory of space and time. Indeed it seemed that Newtonian theory could be rendered compatible with special relativity by any number of small modifications, each of which would be unlikely to lead to any significant deviations from the empirically testable consequences of Newtonian theory. Einstein’s response to this problem is now legend. He decided almost immediately to abandon the search for a Lorentz covariant gravitation theory, for he had failed to construct such a theory that was compatible with the equality of inertial and gravitational mass. Positing what he later called the principle of equivalence, he decided that gravitation theory held the key to repairing what he perceived as the defect of the special theory of relativity: its relativity principle.
Curie’s principle asserts that every symmetry of a cause manifests as a symmetry of the effect. It can be formulated as a tautology that is vacuous until it is instantiated. However instantiation requires us to know the correct way to map causal terminology onto the terms of a science. Causal metaphysics has failed to provide a unique, correct way to carry out the mapping. Thus successful or unsuccessful instantiation merely reflects our freedom of choice in the mapping.
It is common to dismiss the passage of time as illusory since its passage has not been captured within modern physical theories. I argue that this is a mistake. Other than the awkward fact that it does not appear in our physics, there is no indication that the passage of time is an illusion.