The fundamental problem on which Ilya Prigogine and the Brussels–Austin Group have focused can be stated briefly as follows. Our observations indicate that there is an arrow of time in our experience of the world (e.g., decay of unstable radioactive atoms like uranium, or the mixing of cream in coffee). Most of the fundamental equations of physics are time reversible, however, presenting an apparent conflict between our theoretical descriptions and experimental observations. Many have thought that the observed arrow of time was either an artifact of our observations or due to very special initial conditions. An alternative approach, followed by the Brussels–Austin Group, is to consider the observed direction of time to be a basic physical phenomenon due to the dynamics of physical systems. This essay focuses mainly on recent developments in the Brussels–Austin Group after the mid-1980s. The fundamental concerns are the same as in their earlier approaches (subdynamics, similarity transformations), but the contemporary approach utilizes rigged Hilbert space (whereas the older approaches used Hilbert space). While the emphasis on nonequilibrium statistical mechanics remains the same, their more recent approach addresses the physical features of large Poincaré systems, nonlinear dynamics and the mathematical tools necessary to analyze them.
Statistical mechanics is a strange theory. Its aims are debated, its methods are contested, its main claims have never been fully proven, and their very truth is challenged, yet at the same time, it enjoys huge empirical success and gives us the feeling that we understand important phenomena. What is this weird theory, exactly? Statistical mechanics is the name of the ongoing attempt to apply mechanics, together with some auxiliary hypotheses, to explain and predict certain phenomena, above all those described by thermodynamics. This paper shows what parts of this objective can be achieved with mechanics by itself. It thus clarifies what roles remain for the auxiliary assumptions that are needed to achieve the rest of the desiderata. Those auxiliary hypotheses are described in another paper in this journal, Foundations of statistical mechanics: The auxiliary hypotheses.
One finds, in Maxwell's writings on thermodynamics and statistical physics, a conception of the nature of these subjects that differs in interesting ways from the way that they are usually conceived. In particular, though—in agreement with the currently accepted view—Maxwell maintains that the second law of thermodynamics, as originally conceived, cannot be strictly true, the replacement he proposes is different from the version accepted by most physicists today. The modification of the second law accepted by most physicists is a probabilistic one: although statistical fluctuations will result in occasional spontaneous differences in temperature or pressure, there is no way to predictably and reliably harness these to produce large violations of the original version of the second law. Maxwell advocates a version of the second law that is strictly weaker; the validity of even this probabilistic version is of limited scope, limited to situations in which we are dealing with large numbers of molecules en masse and have no ability to manipulate individual molecules. Connected with this is his conception of the thermodynamic concepts of heat, work, and entropy; on the Maxwellian view, these are concepts that must be relativized to the means we have available for gathering information about and manipulating physical systems. The Maxwellian view is one that deserves serious consideration in discussions of the foundations of statistical mechanics. It has relevance for the project of recovering thermodynamics from statistical mechanics because, in such a project, it matters which version of the second law we are trying to recover.
Statistical mechanics is the name of the ongoing attempt to explain and predict certain phenomena, above all those described by thermodynamics, on the basis of the fundamental theories of physics, in particular mechanics, together with certain auxiliary assumptions. In another paper in this journal, Foundations of statistical mechanics: Mechanics by itself, I have shown that some of the thermodynamic regularities, including the probabilistic ones, can be described in terms of mechanics by itself. But in order to prove those regularities, in particular the time-asymmetric ones, it is necessary to add to mechanics assumptions of three kinds, all of which are under debate in the contemporary literature. One kind of assumption concerns measure and probability, and here a major debate is around the notion of "typicality." A second concerns initial conditions, and here the debate is about the nature and status of the so-called past hypothesis. The third kind concerns the dynamics, and here the possibility and significance of "Maxwell's Demon" is the main topic of discussion. This article describes these assumptions and examines the justification for introducing them, emphasizing the contemporary debates around them.
In "Counterfactual Dependence and Time's Arrow", David Lewis defends an analysis of counterfactuals intended to yield the asymmetry of counterfactual dependence: that later affairs depend counterfactually on earlier ones, and not the other way around. I argue that careful attention to the dynamical properties of thermodynamically irreversible processes shows that in many ordinary cases, Lewis's analysis fails to yield this asymmetry. Furthermore, the analysis fails in an instructive way: it teaches us something about the connection between the asymmetry of overdetermination (...) and the asymmetry of entropy. (shrink)
This chapter will review selected aspects of the terrain of discussions about probabilities in statistical mechanics (with no pretensions to exhaustiveness, though the major issues will be touched upon), and will argue for a number of claims. None of the claims to be defended is entirely original, but all deserve emphasis. The first, and least controversial, is that probabilistic notions are needed to make sense of statistical mechanics. The reason for this is the same reason that convinced Maxwell, Gibbs, and Boltzmann that probabilities would be needed, namely, that the second law of thermodynamics, which in its original formulation says that certain processes are impossible, must, on the kinetic theory, be replaced by a weaker formulation according to which what the original version deems impossible is merely improbable. Second is that we ought not take the standard measures invoked in equilibrium statistical mechanics as giving, in any sense, the correct probabilities about microstates of the system. We can settle for a much weaker claim: that the probabilities for outcomes of experiments yielded by the standard distributions are effectively the same as those yielded by any distribution that we should take as representing probabilities over microstates. Lastly (and most controversially): in asking about the status of probabilities in statistical mechanics, the familiar dichotomy between epistemic probabilities (credences, or degrees of belief) and ontic (physical) probabilities is insufficient; the concept of probability that is best suited to the needs of statistical mechanics is one that combines epistemic and physical considerations.
In this paper I propose an interpretation of classical statistical mechanics that centers on taking seriously the idea that probability measures represent complete states of statistical mechanical systems. I show how this leads naturally to the idea that the stochasticity of statistical mechanics is associated directly with the observables of the theory rather than with the microstates (as traditional accounts would have it). The usual assumption that microstates are representationally significant in the theory is therefore dispensable, a consequence which suggests interesting possibilities for developing non-equilibrium statistical mechanics and investigating inter-theoretic answers to the foundational questions of statistical mechanics.
Statistical mechanics is one of the crucial fundamental theories of physics, and in his new book Lawrence Sklar, one of the pre-eminent philosophers of physics, offers a comprehensive, non-technical introduction to that theory and to attempts to understand its foundational elements. Among the topics treated in detail are: probability and statistical explanation, the basic issues in both equilibrium and non-equilibrium statistical mechanics, the role of cosmology, the reduction of thermodynamics to statistical mechanics, and the alleged foundation of the very notion of time asymmetry in the entropic asymmetry of systems in time. The book emphasises the interaction of scientific and philosophical modes of reasoning, and in this way will interest all philosophers of science as well as those in physics and chemistry concerned with philosophical questions. The book could also be read by an informed general reader interested in the foundations of modern science.
I give a brief account of the way in which thermodynamics and statistical mechanics actually work as contemporary scientific theories, and in particular of what statistical mechanics contributes to thermodynamics over and above any supposed underpinning of the latter's general principles. In doing so, I attempt to illustrate that statistical mechanics should not be thought of wholly or even primarily as itself a foundational project for thermodynamics, and that conceiving of it this way potentially distorts the foundational study of statistical mechanics itself.
In the last quarter of the nineteenth century, Ludwig Boltzmann explained how irreversible macroscopic laws, in particular the second law of thermodynamics, originate in the time-reversible laws of microscopic physics. Boltzmann's analysis, the essence of which I shall review here, is basically correct. The most famous criticisms of Boltzmann's later work on the subject have little merit. Most twentieth century innovations – such as the identification of the state of a physical system with a probability distribution on its phase space, of its thermodynamic entropy with the Gibbs entropy of that distribution, and the invocation of the notions of ergodicity and mixing for the justification of the foundations of statistical mechanics – are thoroughly misguided.
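For orientation, the two entropy notions contrasted here can be written out (a standard textbook formulation, not a quotation from the paper). The Gibbs entropy is a functional of a probability distribution \rho on phase space \Gamma,

    S_G[\rho] = -k_B \int_\Gamma \rho(x) \ln \rho(x) \, dx,

whereas the Boltzmann entropy is a property of an individual microstate X, fixed by the phase-space volume of the macrostate region M(X) containing it:

    S_B(X) = k_B \ln |\Gamma_{M(X)}|.

The dispute in the abstract is precisely over which of these deserves to be identified with thermodynamic entropy.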
There are two theoretical approaches in statistical mechanics, one associated with Boltzmann and the other with Gibbs. The theoretical apparatuses of the two approaches offer distinct descriptions of the same physical system, with no obvious way to translate the concepts of one formalism into those of the other. This raises the question of the status of one approach vis-à-vis the other. We answer this question by arguing that the Boltzmannian approach is a fundamental theory while Gibbsian statistical mechanics (GSM) is an effective theory, and we describe circumstances under which Gibbsian calculations coincide with the Boltzmannian results. We then point out that regarding GSM as an effective theory has important repercussions for a number of projects, in particular attempts to turn GSM into a nonequilibrium theory.
In Boltzmannian statistical mechanics macro-states supervene on micro-states. This leads to a partitioning of the state space of a system into regions of macroscopically indistinguishable micro-states. The largest of these regions is singled out as the equilibrium region of the system. What justifies this association? We review currently available answers to this question and find them wanting both for conceptual and for technical reasons. We propose a new conception of equilibrium and prove a mathematical theorem which establishes in full generality -- i.e. without making any assumptions about the system's dynamics or the nature of the interactions between its components -- that the equilibrium macro-region is the largest macro-region. We then turn to the question of the approach to equilibrium, for which no satisfactory general answer exists so far. In our account, this question is replaced by the question of when an equilibrium state exists. We prove another -- again fully general -- theorem providing necessary and sufficient conditions for the existence of an equilibrium state. This theorem changes the way in which the question of the approach to equilibrium should be discussed: rather than launching a search for a crucial factor, the focus should be on finding triplets of macro-variables, dynamical conditions, and effective state spaces that satisfy the conditions of the theorem.
Statistical Mechanics (SM) involves probabilities. At the same time, most approaches to the foundations of SM—programs whose goal is to understand the macroscopic laws of thermal physics from the point of view of microphysics—are classical; they begin with the assumption that the underlying dynamical laws that govern the microscopic furniture of the world are (or can without loss of generality be treated as) deterministic. This raises some potential puzzles about the proper interpretation of these probabilities. It also raises, more generally, the question of what kinds, if any, of objective probabilities can exist in a deterministic world.
The paper reviews statistical models for money, wealth, and income distributions developed in the econophysics literature since the late 1990s. By analogy with the Boltzmann-Gibbs distribution of energy in physics, it is shown that the probability distribution of money is exponential for certain classes of models with interacting economic agents. Alternative scenarios are also reviewed. Data analysis of the empirical distributions of wealth and income reveals a two-class distribution. The majority of the population belongs to the lower class, characterized by the exponential ("thermal") distribution, whereas a small fraction of the population in the upper class is characterized by the power-law ("superthermal") distribution. The lower part is very stable, stationary in time, whereas the upper part is highly dynamical and out of equilibrium.
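A minimal simulation conveys the mechanism behind the exponential result (a sketch under assumed rules: random pairwise transfers of a fixed amount with total money conserved and no negative balances; the model class is the kind the review describes, but the code and parameter names here are illustrative, not taken from the paper):

    import random

    # Random-exchange economy: N agents start with equal balances; at each
    # step a randomly chosen payer transfers a fixed amount DELTA to a
    # randomly chosen payee, unless the payer cannot afford it.
    N, STEPS, DELTA = 1000, 500_000, 1.0
    money = [100.0] * N
    for _ in range(STEPS):
        i, j = random.randrange(N), random.randrange(N)
        if i != j and money[i] >= DELTA:
            money[i] -= DELTA
            money[j] += DELTA

    # In the stationary state the balances follow an approximately
    # exponential (Boltzmann-Gibbs) law P(m) ~ exp(-m/T), with effective
    # "temperature" T equal to the average money per agent.
    money.sort()
    mean = sum(money) / N
    print("mean:", mean)
    print("median/mean:", money[N // 2] / mean)

For an exponential distribution the median is ln 2 ≈ 0.69 times the mean, so the printed ratio drifting toward 0.69 is a quick check that the simulated economy has "thermalized".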
In two recent papers, Barry Loewer (2001, 2004) has suggested interpreting probabilities in statistical mechanics as Humean chances in David Lewis' (1994) sense. I first give a precise formulation of this proposal, then raise two fundamental objections, and finally conclude that these can be overcome only at the price of interpreting these probabilities epistemically.
This paper explores some connections between competing conceptions of scientific laws on the one hand, and a problem in the foundations of statistical mechanics on the other. I examine two proposals for understanding the time asymmetry of thermodynamic phenomena: David Albert's recent proposal and a proposal that I outline based on Hans Reichenbach's "branch systems". I sketch an argument against the former, and mount a defense of the latter by showing how to accommodate statistical mechanics to recent developments in the philosophy of scientific laws.
Consider a gas that is adiabatically isolated from its environment and confined to the left half of a container. Then remove the wall separating the two parts. The gas will immediately start spreading and soon be evenly distributed over the entire available space. The gas has approached equilibrium. Thermodynamics (TD) characterizes this process in terms of an increase of thermodynamic entropy, which attains its maximum value at equilibrium. The second law of thermodynamics captures the irreversibility of this process by positing that in an isolated system such as the gas, entropy cannot decrease. The aim of statistical mechanics (SM) is to explain the behavior of the gas and, in particular, its conformity with the second law in terms of the dynamical laws governing the individual molecules of which the gas is made up. In what follows these laws are assumed to be those of Hamiltonian classical mechanics. We should not, however, ask for an explanation of the second law literally construed. This law is a universal law and as such cannot be explained by a statistical theory. But this is not a problem because we...
Discussions of the foundations of Classical Equilibrium Statistical Mechanics (SM) typically focus on the problem of justifying the use of a certain probability measure (the microcanonical measure) to compute average values of certain functions. One would like to be able to explain why the equilibrium behavior of a wide variety of distinct systems (different sorts of molecules interacting with different potentials) can be described by the same averaging procedure. A standard approach is to appeal to ergodic theory to justify this choice of measure. A different approach, eschewing ergodicity, was initiated by A. I. Khinchin. Both explanatory programs have been subjected to severe criticisms. This paper argues that the Khinchin-type program deserves further attention in light of relatively recent results in understanding the physics of universal behavior.
Interventionism is an approach to the foundations of statistical mechanics which says that to explain and predict some of the thermodynamic phenomena we need to take into account the inescapable effect of environmental perturbations on the system of interest, in addition to the system's internal dynamics. The literature on interventionism suffers from a curious dual attitude: the approach is often mentioned as a possible framework for understanding statistical mechanics, only to be quickly and decidedly dismissed. The present paper is an attempt to understand this attraction-repulsion story. It offers a version of interventionism that appears to be defensible, and shows that this version can meet the main objections raised against it. It then investigates some of the philosophical ideas underlying interventionism, and proposes that these may be the source of the resentment interventionism encounters. This paves the way to seeing some features and consequences of interventionism, often taken to be shortcomings, as philosophically advantageous.
Classical statistical mechanics posits probabilities for various events to occur, and these probabilities seem to be objective chances. This does not seem to sit well with the fact that the theory's time evolution is deterministic. We argue that the tension between the two is only apparent. We present a theory of Humean objective chance and show that chances thus understood are compatible with underlying determinism and provide an interpretation of the probabilities we find in Boltzmannian statistical mechanics.
The conspicuous similarities between interpretive strategies in classical statistical mechanics and in quantum mechanics may be grounded on their employment of common implementations of probability. The objective probabilities which represent the underlying stochasticity of these theories can be naturally associated with three of their common formal features: initial conditions, dynamics, and observables. Various well-known interpretations of the two theories line up with particular choices among these three ways of implementing probability. This perspective has significant application to debates on primitive ontology and to the quantum measurement problem.
To complete our ontological interpretation of quantum theory we have to include a treatment of quantum statistical mechanics. The basic concepts in the ontological approach are the particle and the wave function. The density matrix cannot play a fundamental role here. Therefore quantum statistical mechanics will require a further statistical distribution over wave functions, in addition to the distribution of particles that have a specified wave function. Ultimately the wave function of the universe will be required, but we show that if the universe is not in thermodynamic equilibrium then it can be treated in terms of weakly interacting large-scale constituents that are very nearly independent of each other. In this way we obtain the same results as those of the usual approach within the framework of the ontological interpretation.
This paper addresses the question of how we should regard the probability distributions introduced into statistical mechanics. It will be argued that it is problematic to take them either as purely ontic or purely epistemic. I will propose a third alternative: they are almost objective probabilities, or epistemic chances. The definition of such probabilities involves an interweaving of epistemic and physical considerations, and thus they cannot be classified as either purely epistemic or purely ontic. This conception, it will be argued, resolves some of the puzzles associated with statistical mechanical probabilities: it explains how probabilistic posits introduced on the basis of incomplete knowledge can yield testable predictions, and it also bypasses the problem of disastrous retrodictions, that is, the fact that the standard equilibrium measures yield a high probability of the system being in equilibrium in the recent past, even when we know otherwise. As the problem does not arise on the conception of probabilities considered here, there is no need to invoke a Past Hypothesis as a special posit to avoid it.
This paper is a discussion of David Albert's approach to the foundations of classical statistical mechanics. I point out a respect in which his account makes a stronger claim about the statistical mechanical probabilities than is usually made, and I suggest what might be the motivation for this. I outline a less radical approach, which I attribute to Boltzmann, and I give some reasons for thinking that this approach is all we need, and also the most we are likely to get. The issue between the two accounts turns out to be one about the explanatory role probabilities play in statistical mechanics.
In discussions of the foundations of statistical mechanics, it is widely held that the Gibbsian and Boltzmannian approaches are incompatible but empirically equivalent; the Gibbsian approach may be calculationally preferable but only the Boltzmannian approach is conceptually satisfactory. I argue against both assumptions. Gibbsian statistical mechanics is applicable to a wide variety of problems and systems, such as the calculation of transport coefficients and the statistical mechanics and thermodynamics of mesoscopic systems, in which the Boltzmannian approach is inapplicable. And the supposed conceptual problems with the Gibbsian approach are either misconceived, or apply only to certain versions of the Gibbsian approach, or apply with equal force to both approaches. I conclude that Boltzmannian statistical mechanics is best seen as a special case of, and not an alternative to, Gibbsian statistical mechanics.
Statistical mechanics attempts to explain the behaviour of macroscopic physical systems in terms of the mechanical properties of their constituents. Although it is one of the fundamental theories of physics, it has received little attention from philosophers of science. Nevertheless, it raises philosophical questions of fundamental importance on the nature of time, chance and reduction. Most philosophical issues in this domain relate to the question of the reduction of thermodynamics to statistical mechanics. This book addresses issues inherent in this reduction: the time-asymmetry of thermodynamics and its absence in statistical mechanics; the role and essential nature of chance and probability in this reduction when thermodynamics is non-probabilistic; and how, if at all, the reduction is possible. Compiling contributions on current research by experts in the field, this is an invaluable survey of the philosophy of statistical mechanics for academic researchers and graduate students interested in the foundations of physics.
Let us begin with a characteristic example. Consider a gas that is confined to the left half of a box. Now we remove the barrier separating the two halves of the box. As a result, the gas quickly disperses, and it continues to do so until it homogeneously fills the entire box. This is illustrated in Figure 1.
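A back-of-the-envelope calculation (standard textbook reasoning, added here for illustration rather than taken from the excerpt) shows why the reverse process is never observed. For N non-interacting molecules, the probability that all of them happen to sit in the left half at a given moment is (1/2)^N, which for N on the order of 10^23 is vanishingly small. Correspondingly, the free expansion raises the Boltzmann entropy by

    \Delta S = N k_B \ln 2,

since the volume accessible to each molecule doubles.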
Why does classical equilibrium statistical mechanics work? Malament and Zabell (1980) noticed that, for ergodic dynamical systems, the unique absolutely continuous invariant probability measure is the microcanonical. Earman and Rédei (1996) replied that systems of interest are very probably not ergodic, so that absolutely continuous invariant probability measures very distant from the microcanonical exist. In response I define the generalized properties of epsilon-ergodicity and epsilon-continuity, I review computational evidence indicating that systems of interest are epsilon-ergodic, I adapt Malament and Zabell's defense of absolute continuity to support epsilon-continuity, and I prove that, for epsilon-ergodic systems, every epsilon-continuous invariant probability measure is very close to the microcanonical.
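Roughly, and as the notion is standardly used in this literature (a paraphrase, not the paper's own definitions): a system is epsilon-ergodic if its dynamics is ergodic on an invariant subset of phase space of measure at least 1 − ε, so that ordinary ergodicity is recovered as the special case ε = 0. The computational evidence cited concerns showing that realistic systems satisfy this weaker condition for very small ε.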
Huw Price (1996, 2002, 2003) argues that causal-dynamical theories that aim to explain thermodynamic asymmetry in time are misguided. He points out that in seeking a dynamical factor responsible for the general tendency of entropy to increase, these approaches fail to appreciate the true nature of the problem in the foundations of statistical mechanics (SM). I argue that it is Price who is guilty of misapprehension of the issue at stake. When properly understood, causal-dynamical approaches in the foundations of SM offer a solution to a different problem; a problem that unfortunately receives no attention in Price's celebrated work.
I present in detail the case for regarding black hole thermodynamics as having a statistical-mechanical explanation in exact parallel with the statistical-mechanical explanation believed to underlie the thermodynamics of other systems. I focus on three lines of argument: zero-loop and one-loop calculations in quantum general relativity, understood as a quantum field theory, using the path-integral formalism; calculations in string theory of the leading-order terms, higher-derivative corrections, and quantum corrections, in the black hole entropy formula for extremal and near-extremal black holes; and recovery of the qualitative and quantitative structure of black hole statistical mechanics via the AdS/CFT correspondence. In each case I briefly review the content of, and arguments for, the form of quantum gravity being used, at an introductory level: the paper is aimed at students and non-specialists and does not presume advanced knowledge of quantum gravity. My conclusion is that the evidence for black hole statistical mechanics is as solid as we could reasonably expect it to be in the absence of a directly empirically verified theory of quantum gravity.
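The target quantity in all three lines of argument is the Bekenstein–Hawking entropy (a standard result, stated here only for orientation):

    S_{BH} = \frac{k_B c^3 A}{4 G \hbar},

where A is the area of the event horizon; the statistical-mechanical programs the paper reviews aim to recover this value by counting the microstates of the black hole.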
The aging of the two brothers in the “twins paradox” is analyzed through the space-time evolution of the densities that correspond to their internal complex structure. Taking into account their relative motion, it is shown that the traveling brother evolves over a shorter interval of time than his twin, which makes him younger than his brother.
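The differential aging in question is the standard proper-time effect of special relativity (background fact, not the paper's density-based derivation): along a leg travelled at constant speed v, the moving twin accumulates proper time

    \Delta\tau = \Delta t \sqrt{1 - v^2/c^2},

which is strictly less than the coordinate time Δt elapsed for the stay-at-home twin.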
We argue that, contrary to some analyses in the philosophy of science literature, ergodic theory falls short in explaining the success of classical equilibrium statistical mechanics. Our claim is based on the observations that dynamical systems for which statistical mechanics works are most likely not ergodic, and that ergodicity is both too strong and too weak a condition for the required explanation: one needs only ergodic-like behaviour for the finite set of observables that matter, but the behaviour must ensure that the approach to equilibrium for these observables is on the appropriate time-scale.
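The "ergodic-like behaviour" at issue is the matching of time averages with phase averages for an observable f, which for fully ergodic systems is guaranteed by Birkhoff's theorem (stated here for reference):

    \lim_{T \to \infty} \frac{1}{T} \int_0^T f(\phi_t(x)) \, dt = \int_\Gamma f \, d\mu

for almost every initial condition x. The authors' point is that such behaviour is needed only for the handful of observables f that are actually measured, and that the theorem by itself says nothing about how quickly the time average converges.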
I discuss a broad critique of the classical approach to the foundations of statistical mechanics (SM) offered by N. S. Krylov. He claims that the classical approach is in principle incapable of providing the foundations for interpreting the "laws" of statistical physics. Most intriguing are his arguments against adopting a de facto attitude towards the problem of irreversibility. I argue that the best way to understand his critique is as setting the stage for a positive theory which treats SM as a theory in its own right, involving a completely different conception of a system's state. As the orthodox approach treats SM as an extension of the classical or quantum theories (one which deals with large systems), Krylov is advocating a major break with the traditional view of statistical physics.
This article distinguishes two different senses of information-theoretic approaches to statistical mechanics that are often conflated in the literature: those relating to the thermodynamic cost of computational processes and those that offer an interpretation of statistical mechanics where the probabilities are treated as epistemic. This distinction is then investigated through Earman and Norton's ([1999]) 'sound' and 'profound' dilemma for information-theoretic exorcisms of Maxwell's demon. It is argued that Earman and Norton fail to countenance a 'sound' information-theoretic interpretation, and this paper describes how the latter inferential interpretations can escape the criticisms of Earman and Norton ([1999]) and Norton ([2005]) by adopting this 'sound' horn. This article considers a standard model of Maxwell's demon to illustrate how one might adopt an information-theoretic approach to statistical mechanics without a reliance on Landauer's principle, where the incompressibility of the probability distribution due to Liouville's theorem is taken as the central feature of such an interpretation.
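The two formal ingredients named at the end can be stated compactly (standard formulations, included for reference rather than drawn from the article). Landauer's principle says that erasing one bit of information must dissipate at least k_B T ln 2 of heat into the environment. Liouville's theorem says that the phase-space density of a Hamiltonian system is conserved along trajectories,

    \frac{d\rho}{dt} = \frac{\partial \rho}{\partial t} + \{\rho, H\} = 0,

which is the "incompressibility" the article takes as central.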
The book explores several open questions in the philosophy of statistical mechanics. Each chapter is written by a leading expert in the field. Here is a list of some questions that are addressed in the book: 1) Boltzmann showed how the phenomenological gas laws of thermodynamics can be derived from statistical mechanics. Since classical mechanics is a deterministic theory there are no probabilities in it. Since statistical mechanics is based on classical mechanics, all the probabilities statistical mechanics talks about cannot be fundamental. However, if probabilities are epistemic, how can they play a role, as they seem to do, in laws, explanation, and prediction? 2) Many physicists use the notion of typicality instead of that of probability when discussing statistical mechanics. What is the connection between the two notions? 3) How can one extend Boltzmann's analysis to the quantum domain, where some theories are indeterministic? 4) Boltzmann's explanation fundamentally involves cosmology: for the explanation to go through, the Big Bang needs to have had extremely low entropy. Does the fact that the Big Bang was a low entropy state imply that it was, in some sense, "highly improbable" and requires an explanation? 5) What exactly is the connection between statistical and classical mechanics? Is it one of theory reduction, or is there no such thing? 6) Statistical mechanics has two main formulations: one due to Boltzmann and the other due to Gibbs. What is the connection between the two formulations?
Philosophers of physics are very familiar with foundational problems in quantum mechanics and in the theory of relativity. In both fields, the puzzles, if not solved, are at least reasonably well formulated and possess well-characterized solution strategies. Sklar's book Physics and Chance focuses on a pair of theories, thermodynamics and statistical mechanics, for which puzzles and foundational paradoxes abound, but where there is very little agreement upon the means with which they may best be approached. As he notes in the introduction...