It is proposed that we use the term “approximation” for an inexact description of a target system and “idealization” for another system whose properties also provide an inexact description of the target system. Since systems generated by a limiting process can often have quite unexpected, even inconsistent properties, familiar limit systems used in statistical physics can fail to provide idealizations; they are merely approximations. A dominance argument suggests that the limiting idealizations of statistical physics should be demoted to approximations.
Contrary to formal theories of induction, I argue that there are no universal inductive inference schemas. The inductive inferences of science are grounded in matters of fact that hold only in particular domains, so that all inductive inference is local. Some are so localized as to defy familiar characterization. Since inductive inference schemas are underwritten by facts, we can assess and control the inductive risk taken in an induction by investigating the warrant for its underwriting facts. In learning more facts, we extend our inductive reach by supplying more localized inductive inference schemas. Since a material theory no longer separates the factual and schematic parts of an induction, it proves not to be vulnerable to Hume's problem of the justification of induction.
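A compact way to see the contrast, using Norton's own bismuth example (the LaTeX rendering is mine):

```latex
% The inference
\[
\frac{\text{Some samples of bismuth melt at } 271^{\circ}\mathrm{C}.}
     {\text{Therefore, all samples of bismuth melt at } 271^{\circ}\mathrm{C}.}
\]
% In a formal theory, this instantiates the universal schema
% "some A's are B; therefore, all A's are B," which licenses bad
% inductions as readily as good ones (e.g., over samples of wax,
% whose melting points vary). In the material theory, the inference
% is licensed instead by a local fact: samples of a chemical element
% are generally uniform in their physical properties.
```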
Newton’s equations of motion tell us that a mass at rest at the apex of a dome with the shape specified here can spontaneously move. It has been suggested that this indeterminism should be discounted since it draws on an incomplete rendering of Newtonian physics, or it is “unphysical,” or it employs illicit idealizations. I analyze and reject each of these reasons.
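For reference, a sketch of the dome and its indeterministic solutions, with units chosen so the constants drop out (this construction is standard in discussions of the dome):

```latex
% Dome surface: height below the apex as a function of radial arc length r,
%   h = (2/3g) r^{3/2},
% so Newton's second law for a unit mass sliding on the surface reads
\[
\frac{d^{2}r}{dt^{2}} = r^{1/2}.
\]
% With the mass at rest at the apex, r(0) = 0 and dr/dt(0) = 0, this admits
% the trivial solution r(t) = 0 for all t, but also, for any T >= 0,
\[
r(t) =
\begin{cases}
0, & t \le T,\\[2pt]
(t-T)^{4}/144, & t \ge T,
\end{cases}
\]
% as direct substitution confirms:
%   d^2/dt^2 [(t-T)^4/144] = (t-T)^2/12 = r^{1/2}.
% The mass may remain at rest forever or spontaneously move at any time T.
```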
I deny that the world is fundamentally causal, deriving the skepticism on non-Humean grounds from our enduring failures to find a contingent, universal principle of causality that holds true of our science. I explain the prevalence and fertility of causal notions in science by arguing that a causal character for many sciences can be recovered, when they are restricted to appropriately hospitable domains. There they conform to a loose collection of causal notions that form a folk science of causation. This recovery of causation exploits the same generative power of reduction relations that allows us to recover gravity as a force from Einstein's general relativity and heat as a conserved fluid, the caloric, from modern thermal physics, when each theory is restricted to appropriate domains. Causes are real in science to the same degree as caloric and gravitational forces.
The epistemic state of complete ignorance is not a probability distribution. In it, we assign the same, unique, ignorance degree of belief to any contingent outcome and each of its contingent, disjunctive parts. That this is the appropriate way to represent complete ignorance is established by two instruments, each individually strong enough to identify this state. They are the principle of indifference (PI) and the notion that ignorance is invariant under certain redescriptions of the outcome space, here developed into the ‘principle of invariance of ignorance’ (PII). Both instruments are so innocuous as almost to be platitudes. Yet the literature in probabilistic epistemology has misdiagnosed them as paradoxical or defective since they generate inconsistencies when conjoined with the assumption that an epistemic state must be a probability distribution. To underscore the need to drop this assumption, I express PII in its most defensible form as relating symmetric descriptions and show that paradoxes still arise if we assume the ignorance state to be a probability distribution.
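A worked instance of the inconsistency at issue (my rendering of the standard three-outcome example):

```latex
% Complete ignorance over three mutually exclusive, exhaustive
% outcomes a, b, c. The principle of indifference gives
\[
P(a) = P(b) = P(c) = \tfrac{1}{3},
\qquad\text{so}\qquad
P(b \lor c) = \tfrac{2}{3}
\]
% by additivity. But an ignorance state assigns the same degree of
% belief to every contingent outcome and to each contingent disjunction
% of outcomes, demanding
\[
P(a) = P(b \lor c),
\]
% which contradicts additivity. The fault lies not with indifference or
% invariance, but with the assumption that the ignorance state is a
% probability distribution.
```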
In a formal theory of induction, inductive inferences are licensed by universal schemas. In a material theory of induction, inductive inferences are licensed by facts. With this change in the conception of the nature of induction, I argue that the celebrated “problem of induction” can no longer be set up and is thereby dissolved. Attempts to recreate the problem in the material theory of induction fail. They require relations of inductive support to conform to an unsustainable, hierarchical empiricism.
Constructivists, such as Harvey Brown, urge that the geometries of Newtonian and special relativistic spacetimes result from the properties of matter. Whatever this may mean, it commits constructivists to the claim that these spacetime geometries can be inferred from the properties of matter without recourse to spatiotemporal presumptions or with few of them. I argue that the construction project only succeeds if constructivists antecedently presume the essential commitments of a realist conception of spacetime. These commitments can be avoided only by adopting an extreme form of operationalism.
Einstein offered the principle of general covariance as the fundamental physical principle of his general theory of relativity, and as responsible for extending the principle of relativity to accelerated motion. This view was disputed almost immediately with the counter-claim that the principle was no relativity principle and was physically vacuous. The disagreement persists today. This article reviews the development of Einstein's thought on general covariance, its relation to the foundations of general relativity and the evolution of the continuing debate over his viewpoint.
Thought experiments are ordinary argumentation disguised in a vivid pictorial or narrative form. This account of their nature will allow me to show that empiricism has nothing to fear from thought experiments. They perform no epistemic magic. In so far as they tell us about the world, thought experiments draw upon what we already know of it, either explicitly or tacitly; they then transform that knowledge by disguised argumentation. They can do nothing more epistemically than can argumentation. I defend my account of thought experiments in Section 3 by urging that the epistemic reach of thought experiments turns out to coincide with that of argumentation and that this coincidence is best explained by the simple view that thought experiments just are arguments. Thought experiments can err, a fact displayed by the thought experiment and anti-thought experiment pairs of Section 2. Nonetheless thought experiments can be used reliably and, I urge in Section 4, this is only possible if they are governed by some very generalized logic. I will suggest on evolutionary considerations that their logics are most likely the familiar logics of induction and deduction, recovering the view that thought experiment is argumentation. Finally, in Section 5, I defend this argument-based epistemology of thought experiments against competing accounts. I suggest that these other accounts can offer a viable epistemology only insofar as they already incorporate the notion that thought experimentation is governed by a logic, possibly of very generalized form.
Bayesian probabilistic explication of inductive inference conflates the neutrality of evidence for some hypothesis H (“not supporting H”) with disfavoring evidence (“supporting not-H”). This expressive inadequacy leads to spurious results that are artifacts of a poor choice of inductive logic. I illustrate how such artifacts have arisen in simple inductive inferences in cosmology. In the inductive disjunctive fallacy, neutral support for many possibilities is spuriously converted into strong support for their disjunction. The Bayesian “doomsday argument” is shown to rely entirely on a similar artifact, for the result disappears in a reanalysis that employs fragments of inductive logic able to represent evidential neutrality. Finally, the mere supposition of a multiverse is not yet enough to warrant the introduction of probabilities without some factual analog of a randomizer over the multiverses.
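A schematic instance of the inductive disjunctive fallacy (my rendering):

```latex
% Suppose the evidence is neutral over n mutually exclusive possibilities
% h_1, ..., h_n. Forcing that neutrality into a probability measure gives
\[
P(h_i) = \tfrac{1}{n} \qquad (i = 1, \dots, n),
\]
% and additivity then manufactures near-certainty for the disjunction of
% all but one of them:
\[
P(h_1 \lor \dots \lor h_{n-1}) = \frac{n-1}{n} \;\longrightarrow\; 1
\quad\text{as } n \to \infty.
\]
% The strong support for the disjunction is an artifact of the additive
% representation, not of the (neutral) evidence.
```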
According to the underdetermination thesis, all evidence necessarily underdetermines any scientific theory. Thus it is often argued that our agreement on the content of mature scientific theories must be due to social and other factors. Drawing on a long-standing tradition of criticism, I shall argue that the underdetermination thesis is little more than speculation based on an impoverished account of induction. A more careful look at accounts of induction does not support an assured underdetermination or the holism usually associated with it. I also urge that the display of observationally equivalent theories is a self-defeating strategy for supporting the underdetermination thesis. The very fact that observational equivalence can be demonstrated by arguments brief enough to be included in a journal article means that we cannot preclude the possibility that the theories are merely variant formulations of the same theory.
While there is no universal logic of induction, the probability calculus succeeds as a logic of induction in many contexts through its use of several notions concerning inductive inference. They include Addition, through which low probabilities represent disbelief as opposed to ignorance, and the Bayes property, which commits the calculus to a ‘refute and rescale’ dynamics for incorporating new evidence. These notions are independent, and it is urged that they be employed selectively according to the needs of the problem at hand. It is shown that neither is adapted to inductive inference concerning some indeterministic systems.
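The ‘refute and rescale’ dynamics in symbols (standard Bayesian conditionalization; the gloss is mine):

```latex
% On learning evidence E, hypotheses inconsistent with E are refuted
% (their probability is set to zero) and the survivors are rescaled by a
% common factor so that the total is again one:
\[
P(H \mid E) \;=\; \frac{P(H \land E)}{P(E)}.
\]
% For a surviving hypothesis H that entails E, P(H \land E) = P(H), so
\[
P(H \mid E) = \frac{P(H)}{P(E)}:
\]
% the prior is merely rescaled by the constant 1/P(E). Beyond this
% refutation and uniform rescaling, the dynamics never redistributes
% support among the hypotheses.
```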
Thought experiments in science are merely picturesque argumentation. I support this view in various ways, including the claim that it follows from the fact that thought experiments can err but can still be used reliably. The view is defended against alternatives proposed by my co-symposiasts.
Landauer's Principle asserts that there is an unavoidable cost in thermodynamic entropy creation when data is erased. It is usually derived from incorrect assumptions, most notably, that erasure must compress the phase space of a memory device or that thermodynamic entropy arises from the probabilistic uncertainty of random data. Recent work seeks to prove Landauer's Principle without using these assumptions. I show that the processes assumed in the proof, and in the thermodynamics of computation more generally, can be combined to produce devices that both violate the second law and erase data without entropy cost, indicating an inconsistency in the theoretical system. Worse, the standard repertoire of processes selectively neglects thermal fluctuations. Concrete proposals for how we might measure dissipationlessly and expand single molecule gases reversibly are shown to be fatally disrupted by fluctuations. Reversible, isothermal processes on molecular scales are shown to be disrupted by fluctuations that can only be overcome by introducing entropy creating, dissipative processes.
Landauer’s principle is the loosely formulated notion that the erasure of n bits of information must always incur a cost of nk ln 2 in thermodynamic entropy. It can be formulated as a precise result in statistical mechanics, but for a restricted class of erasure processes that use a thermodynamically irreversible phase space expansion, which is the real origin of the law’s entropy cost and whose necessity has not been demonstrated. General arguments that purport to establish the unconditional validity of the law fail. They turn out to depend on the illicit formation of a canonical ensemble from memory devices holding random data. To exorcise Maxwell’s demon one must show that all candidate devices—the ordinary and the extraordinary—must fail to reverse the second law of thermodynamics. The theorizing surrounding Landauer’s principle is too fragile and too tied to a few specific examples to support such general exorcism. Charles Bennett’s recent extension of Landauer’s principle to the merging of computational paths fails for the same reasons as trouble the original principle.
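For reference, the back-of-envelope form of the principle and its usual compression premise (standard reasoning, not specific to this paper):

```latex
% Erasing a one-bit memory maps two logical states to one, compressing
% the accessible phase space by a factor of two. If that compression is
% passed to the environment as heat at temperature T, the entropy cost is
\[
\Delta S \;\ge\; k \ln 2 \;\text{ per bit},
\qquad
\Delta S \;\ge\; n k \ln 2 \;\text{ for } n \text{ bits},
\]
% with a corresponding minimum dissipation of kT ln 2 per erased bit.
% The abstract's point: in the precise statistical mechanical derivation,
% the cost actually enters through a thermodynamically irreversible phase
% space expansion, whose necessity has not been demonstrated.
```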
I give an informal outline of the hole argument which shows that spacetime substantivalism leads to an undesirable indeterminism in a broad class of spacetime theories. This form of the argument depends on the selection of differentiable manifolds within a spacetime theory as representing spacetime. I consider the conditions under which the argument can be extended to address versions of spacetime substantivalism which select these differentiable manifolds plus some further structure to represent spacetime. Finally, I respond to the criticisms of Tim Maudlin and Jeremy Butterfield.
Monte Carlo simulations arrive at their results by introducing randomness, sometimes derived from a physical randomizing device. Nonetheless, we argue, they open no new epistemic channels beyond that already employed by traditional simulations: the inference by ordinary argumentation of conclusions from assumptions built into the simulations. We show that Monte Carlo simulations cannot produce knowledge other than by inference, and that they resemble other computer simulations in the manner in which they derive their conclusions. Simple examples of Monte Carlo simulations are analysed to identify the underlying inferences.
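A minimal illustration of the point with a toy Monte Carlo estimate of π (my example, not one from the paper): the conclusion is fixed by the assumptions built in (uniform sampling over the unit square, the circle-membership test, the area ratio); the randomness merely drives the sampling.

```python
import random

def estimate_pi(n_samples: int, seed: int = 0) -> float:
    """Estimate pi by sampling points uniformly in the unit square
    and counting the fraction that land inside the quarter circle."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:  # inside the quarter circle of radius 1
            hits += 1
    # The estimate follows by ordinary inference from the assumptions:
    # area(quarter circle) / area(square) = pi / 4, so pi ~ 4 * hits / n.
    return 4.0 * hits / n_samples

print(estimate_pi(1_000_000))  # ~3.14, within sampling error
```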
My purpose in this chapter is to survey some of the principal approaches to inductive inference in the philosophy of science literature. My first concern will be the general principles that underlie the many accounts of induction in this literature. When these accounts are considered in isolation, as is commonly the case, it is easy to overlook that virtually all accounts depend on one of very few basic principles and that the proliferation of accounts can be understood as efforts to ameliorate the weaknesses of those few principles. In the earlier sections, I will lay out three inductive principles and the families of accounts of induction they engender. In later sections I will review standard problems in the philosophical literature that have supported some pessimism about induction and suggest that their import has been greatly overrated. In the final sections I will return to the proliferation of accounts of induction that frustrates efforts at a final codification. I will suggest that this proliferation appears troublesome only as long as we expect inductive inference to be subsumed under a single formal theory. If we adopt a material theory of induction in which individual inductions are licensed by particular facts that prevail only in local domains, then the proliferation is expected and not problematic.
In this first part of a two-part paper, we describe efforts in the early decades of the twentieth century to restrict the extent of violations of the Second Law of thermodynamics that were brought to light by the rise of the kinetic theory and the identification of fluctuation phenomena. We show how these efforts mutated into Szilard’s proposal that Maxwell’s Demon is exorcised by proper attention to the entropy costs associated with the Demon’s memory and information acquisition. In the second part we will argue that the information theoretic exorcisms of the Demon provide largely illusory benefits. Depending on the case, they either return a presupposition that can be had without information theoretic consideration or they postulate a broader connection between information and entropy than can be sustained.
Standard descriptions of thermodynamically reversible processes attribute contradictory properties to them: they are in equilibrium yet still change their state. Or they are composed of non-equilibrium states that are so close to equilibrium that the difference does not matter. One cannot have states that both change and do not change at the same time. In place of this internally contradictory characterization, the term “thermodynamically reversible process” is here construed as a label for a set of real processes of change involving only non-equilibrium states. The properties usually attributed to a thermodynamically reversible process are recovered as the limiting properties of this set. No single process, that is, no system undergoing change, equilibrium or otherwise, carries those limiting properties. The paper concludes with an historical survey of characterizations of thermodynamically reversible processes and a critical analysis of their shortcomings.
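A standard textbook illustration of this construal (my sketch, not drawn from the paper):

```latex
% Isothermal expansion of an ideal gas against a piston whose external
% pressure is held slightly below the gas pressure, P_ext = P - \epsilon.
% Each process in the set (one for each \epsilon > 0) is a genuine,
% non-equilibrium change. The work it delivers is
\[
W_\epsilon = \int_{V_1}^{V_2} (P - \epsilon)\, dV
           = \int_{V_1}^{V_2} P\, dV \;-\; \epsilon\,(V_2 - V_1),
\]
% and the reversible value \int P dV is the limiting property of the set
% as \epsilon -> 0. No single member of the set attains it: as
% \epsilon -> 0 the driving imbalance vanishes and the time needed for
% completion diverges.
```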
Proponents of Bayesian confirmation theory believe that they have the solution to a significant, recalcitrant problem in philosophy of science: the identification of the logic that governs evidence and its inductive bearing in science. That is the logic that lets us say that our catalog of planetary observations strongly confirms Copernicus’ heliocentric hypothesis; or that the fossil record is good evidence for the theory of evolution; or that the 3°K cosmic background radiation supports big bang cosmology. The definitive solution to this problem would be a significant achievement. The problem is of central importance to philosophy of science, for, in the end, what distinguishes science from myth-making is that we have good evidence for the content of science, or at least of mature sciences, whereas myths are evidentially ungrounded fictions. The core ideas shared by all versions of Bayesian confirmation theory are, at a good first approximation, that a scientist’s beliefs are or should conform to a probability measure; and that the incorporation of new evidence is through conditionalization using Bayes’ theorem. While the burden of this chapter will be to inventory why critics believe this theory may not be the solution after all, it is worthwhile first to summarize here the most appealing virtues of this simple account. There are three. First, the theory reduces the often nebulous notion of a logic of…
In a material theory of induction, inductive inferences are warranted by facts that prevail locally. This approach, it is urged, is preferable to formal theories of induction in which the good inductive inferences are delineated as those conforming to some universal schema. An inductive inference problem concerning indeterministic, non-probabilistic systems in physics is posed and it is argued that Bayesians cannot responsibly analyze it, thereby demonstrating that the probability calculus is not the universal logic of induction.
Mathias Frisch has argued that the requirement that electromagnetic dispersion processes be causal adds empirical content not found in electrodynamic theory. I urge that this attempt to reconstitute a local principle of causality in physics fails. An independent principle is not needed to recover the results of dispersion theory. The use of ‘causality conditions’ proves to be the mere adding of causal labels to an already presumed fact. If instead one seeks a broader, independently formulated grounding for the conditions, that grounding either fails or dissolves into vagueness and ambiguity, as has traditionally been the fate of candidate principles of causality.
1 Introduction
2 Scattering in Classical Electrodynamics
3 Sufficiency of the Physics
4 Failure of the Principle of Causality Proposed
4.1 A sometimes principle
4.2 The conditions of applicability are obscure
4.3 Effects can come before their causes
4.4 Vagueness of the relata and of the notion of causal process
5 Conclusion
The standard theory of computation excludes computations whose completion requires an infinite number of steps. Malament-Hogarth spacetimes admit observers whose pasts contain entire future-directed, timelike half-curves of infinite proper length. We investigate the physical properties of these spacetimes and ask whether they and other spacetimes allow the observer to know the outcome of a computation with infinitely many steps.
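For reference, the defining condition of a Malament-Hogarth spacetime, in the formulation standard in this literature:

```latex
% A spacetime (M, g) is Malament-Hogarth if it contains a future-directed
% timelike half-curve \lambda and a point p such that
\[
\int_{\lambda} d\tau \;=\; \infty
\qquad\text{and}\qquad
\lambda \subset I^{-}(p),
\]
% i.e. \lambda has infinite proper length, yet lies entirely within the
% chronological past of p. A computer carried along \lambda can complete
% infinitely many steps, while an observer at p has all of \lambda in
% their past and so can, in principle, learn the outcome.
```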
It is common to dismiss the passage of time as illusory since its passage has not been captured within modern physical theories. I argue that this is a mistake. Other than the awkward fact that it does not appear in our physics, there is no indication that the passage of time is an illusion.
Must a Maxwell demon fail to reverse the second law of thermodynamics? Standard attempts to show it must fail make use of notions of information and computation. None of these attempts has succeeded. Worse, they have distracted both supporters and opponents of these attempts from a much simpler demonstration of the necessary failure of a Maxwell demon that employs no notions of information or computation. It requires only Liouville's theorem and its quantum analog.
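The shape of that simpler demonstration, sketched under my reading of it (details are in the paper):

```latex
% Liouville's theorem: Hamiltonian time evolution preserves phase volume.
% If the combined demon-plus-gas system starts anywhere in a region
% V_init of "unsorted" states and a successful demon must carry it into
% the region V_fin of "sorted," lower-entropy states, then the dynamics
% must map V_init into V_fin with
\[
\mathrm{vol}(V_{\mathrm{fin}}) \;<\; \mathrm{vol}(V_{\mathrm{init}}),
\]
% which contradicts volume preservation. No appeal to information or
% computation is needed; the quantum analog runs on the corresponding
% preservation under unitary evolution.
```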
An infinite lottery machine is used as a foil for testing the reach of inductive inference, since inferences concerning it require novel extensions of probability. Its use is defensible if there is some sense in which the lottery is physically possible, even if exotic physics is needed. I argue that exotic physics is needed and describe several proposals that fail and at least one that succeeds well enough.
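For reference, the core difficulty (standard, not specific to this paper): no countably additive probability measure can be uniform over infinitely many tickets.

```latex
% Suppose each ticket n = 1, 2, 3, ... has the same probability \epsilon.
% Countable additivity requires
\[
1 \;=\; P\Big(\bigvee_{n=1}^{\infty} \text{ticket } n\Big)
  \;=\; \sum_{n=1}^{\infty} \epsilon
  \;=\;
  \begin{cases}
    \infty, & \epsilon > 0,\\
    0, & \epsilon = 0,
  \end{cases}
\]
% so no choice of \epsilon works. Inductive inference about the machine
% therefore requires some novel extension of the probability calculus.
```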
How can we reconcile two claims that are now both widely accepted: Kretschmann's claim that a requirement of general covariance is physically vacuous and the standard view that the general covariance of general relativity expresses the physically important diffeomorphism gauge freedom of general relativity? I urge that both claims can be held without contradiction if we attend to the context in which each is made.
In the sixth section of his light quantum paper of 1905, Einstein presented the miraculous argument, as I shall call it. Pointing out an analogy with ideal gases and dilute solutions, he showed that the macroscopic, thermodynamic properties of high frequency heat radiation carry a distinctive signature of finitely many, spatially localized, independent components and so inferred that it consists of quanta. I describe how Einstein’s other statistical papers of 1905 had already developed and exploited the idea that the ideal gas law is another macroscopic signature of finitely many, spatially localized, independent components and that these papers in turn drew on his first two, “worthless” papers of 1901 and 1902 on intermolecular forces. However, while the ideal gas law was a secure signature of independence, it was harder to use as an indicator that there are finitely many components and that they are spatially localized. Further, since his analysis of the ideal gas law depended on the assumption that the number of components was fixed, its use was precluded for heat radiation, whose component quanta vary in number in most processes. So Einstein needed and found another, more powerful signature of discreteness applicable to heat radiation and which indicated all these properties. It used one of the few processes, volume fluctuation, in which heat radiation does not alter the number of quanta.
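In outline, the miraculous argument runs on Einstein's fluctuation signature (my compression, in modern notation):

```latex
% For n independent, spatially localized components in a volume V_0, the
% probability that all fluctuate into a subvolume V is W = (V/V_0)^n, so
% Boltzmann's principle S = k ln W gives the entropy change
\[
S - S_0 \;=\; k \ln \left(\frac{V}{V_0}\right)^{\!n} \;=\; n k \ln \frac{V}{V_0}.
\]
% Einstein showed that high-frequency (Wien regime) heat radiation of
% energy E at frequency \nu exhibits exactly this volume dependence with
\[
n \;=\; \frac{E}{h\nu},
\]
% and inferred that the radiation consists of that many independent,
% localized quanta, each of energy h\nu.
```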
1. Approximations of arbitrarily large but finite systems are often mistaken for infinite idealizations in statistical and thermal physics. The problem is illustrated by thermodynamically reversible processes. They are approximations of processes requiring arbitrarily long, but finite, times to complete, not processes requiring an actual infinity of time.
2. The present debate over whether phase transitions comprise a failure of reduction is confounded by a confusion of two senses of “level”: the molecular versus the thermodynamic level, and the few-component versus the many-component level.
In this second part of our two-part paper we review and analyse attempts since 1950 to use information theoretic notions to exorcise Maxwell’s Demon. We argue through a simple dilemma that these attempted exorcisms are ineffective, whether they follow Szilard in seeking a compensating entropy cost in information acquisition or Landauer in seeking that cost in memory erasure. In so far as the Demon is a thermodynamic system already governed by the Second Law, no further supposition about information and entropy is needed to save the Second Law. In so far as the Demon fails to be such a system, no supposition about the entropy cost of information acquisition and processing can save the Second Law from the Demon.
Preface: This volume originated in a conference on "The Place of Thought Experiments in Science and Philosophy" which was organized by us and held at the Center for Philosophy of Science at the University of Pittsburgh, April 18-20, 1986. The idea behind this conference was to encourage philosophers and scientists to talk to each other about the role of thought experiments in their various disciplines. These papers were either written for the conference, or were written after it by commentators and other participants.... We hope that this volume will be of use to other philosophers and scientists who are interested in thought experiments, as well as inspire more work in this area....
That quantum mechanical measurement processes are indeterministic is widely known. The time evolution governed by the differential Schrödinger equation can also be indeterministic under the extreme conditions of a quantum supertask, the quantum analogue of a classical supertask. Determinism can be restored by requiring normalizability of the supertask state vector, but it must be imposed as an additional constraint on the differential Schrödinger equation.
The thermodynamics of computation assumes that computational processes at the molecular level can be brought arbitrarily close to thermodynamic reversibility and that thermodynamic entropy creation is unavoidable only in data erasure or the merging of computational paths, in accord with Landauer’s principle. The no-go result shows that fluctuations preclude completion of thermodynamically reversible processes. Completion can be achieved only by irreversible processes that create thermodynamic entropy in excess of the Landauer limit.
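As I understand the result, its quantitative core is an Einstein-style fluctuation estimate (a sketch under my reading; see the paper for the exact statement):

```latex
% For a molecular-scale process whose stages are in thermal equilibrium
% at temperature T, the probabilities of fluctuating between initial and
% final stages satisfy (Einstein's fluctuation formula)
\[
\frac{P_{\mathrm{fin}}}{P_{\mathrm{init}}}
  \;=\; \exp\!\left(\frac{S_{\mathrm{fin}} - S_{\mathrm{init}}}{k}\right).
\]
% A thermodynamically reversible process has S_fin = S_init, whence
% P_fin = P_init: fluctuations carry it backward as readily as forward,
% and it cannot be expected to complete. Driving completion with odds of,
% say, 20:1 requires creating entropy
\[
\Delta S \;=\; k \ln 20 \;\approx\; 3k \;>\; k \ln 2,
\]
% already in excess of the Landauer limit for the erasure of one bit.
```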
Einstein proclaimed that we could discover true laws of nature by seeking those with the simplest mathematical formulation. He came to this viewpoint later in his life. In his early years and work he was quite hostile to this idea. Einstein did not develop his later Platonism from a priori reasoning or aesthetic considerations. He learned the canon of mathematical simplicity from his own experiences in the discovery of new theories, most importantly, his discovery of general relativity. He came to realise that, through his neglect of the canon, he had delayed the completion of general relativity by three years and nearly lost priority in the discovery of its gravitational field equations.
In eternally inflating cosmology, infinitely many pocket universes are seeded. Attempts to show that universes like our observable universe are probable amongst them have failed, since no unique probability measure is recoverable. This lack of definite probabilities is taken to reveal a complete predictive failure. Inductive inference over the pocket universes, it would seem, is impossible. I argue that this conclusion of impossibility mistakes the nature of the problem. It confuses the case in which no inductive inference is possible with another in which a weaker inductive logic applies. The alternative, applicable inductive logic is determined by background conditions and is the same, non-probabilistic logic as applies to an infinite lottery. This inductive logic does not preclude all predictions, but it does affirm that predictions useful to deciding for or against eternal inflation are precluded.
That past patterns may continue in many different ways has long been identified as a problem for accounts of induction. The novelty of Goodman’s “new riddle of induction” lies in a meta-argument that purports to show that no account of induction can discriminate between incompatible continuations. That meta-argument depends on the perfect symmetry of the definitions of grue/bleen and green/blue, so that any evidence that favors the ordinary continuation must equally favor the grue-ified continuation. I argue that this very dependence on the perfect symmetry defeats the novelty of the new riddle. The symmetry can be obtained in contrived circumstances, such as when we grue-ify our total science. However, in all such cases, we cannot preclude the possibility that the original and grue-ified descriptions are merely notationally variant descriptions of the same physical facts; or, if there are facts that separate them, these facts are ineffable, so that no account of induction should be expected to pick between them. In ordinary circumstances, there are facts that distinguish the regular and grue-ified descriptions. Since accounts of induction can and do call upon these facts, Goodman’s meta-argument cannot provide principled grounds for the failure of all accounts of induction. It assures us only of the failure of accounts of induction, such as unaugmented enumerative induction, that cannot exploit these symmetry-breaking facts.
In Norton (2003), it was urged that the world does not conform at a fundamental level to some robust principle of causality. To defend this view, I now argue that the causal notions and principles of modern physics do not express some universal causal principle, brought to light by discoveries in physics. Rather, they merely assert that, according to relativity theory, spacetime has an invariant velocity, that of light; and that theories of matter admit no propagations faster than light.
The objection that Einstein's principle of general covariance is not a relativity principle and has no physical content is reviewed. The principal escapes offered for Einstein's viewpoint are evaluated.
The duality of truth and falsity in a Boolean algebra of propositions is used to generate a duality of belief and disbelief. To each additive probability measure that represents belief there corresponds a dual additive measure that represents disbelief. The dual measure has its own peculiar calculus, in which, for example, measures are added when propositions are combined under conjunction. A Venn diagram of the measure has the contradiction as its total space. While additive measures are not self-dual, the epistemic state of complete ignorance is represented by the unique, monotonic, non-additive measure that is self-dual in its contingent propositions. Convex sets of additive measures fail to represent complete ignorance since they are not self-dual.
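A compact rendering of the dual calculus; the explicit definition below is my reconstruction of the construction the abstract describes:

```latex
% Given an additive belief measure m, define the dual disbelief measure
\[
m^{*}(A) \;=\; m(\lnot A).
\]
% Then m*(contradiction) = m(tautology) = 1, so the contradiction plays
% the role of the total space in a Venn diagram of m*. And for any
% propositions A, B whose negations are mutually exclusive (equivalently,
% A \lor B is the tautology), additivity of m gives
\[
m^{*}(A \land B) \;=\; m(\lnot A \lor \lnot B)
                 \;=\; m(\lnot A) + m(\lnot B)
                 \;=\; m^{*}(A) + m^{*}(B):
\]
% measures add when propositions are combined under conjunction.
```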