Citations of:
A field guide to recent work on the foundations of statistical mechanics
In Dean Rickles (ed.), The Ashgate Companion to Contemporary Philosophy of Physics. London, U.K.: Ashgate. pp. 99-196 (2008)
In this paper I will argue that the informational frameworks in thermal physics, mainly those that historically and conceptually derive from the work of Brillouin (1962) and Jaynes (1957a), are incapable of robustly explaining the approach of certain gaseous systems to their state of thermal equilibrium from the dynamics of their molecular components. I will further argue that, since their various interpretative, conceptual and technical-formal resources (e.g. epistemic interpretations of probabilities and entropy measures, identification of thermal entropy as Shannon information, (...)
I will argue, pace a great many of my contemporaries, that there’s something right about Boltzmann’s attempt to ground the second law of thermodynamics in a suitably amended deterministic time-reversal invariant classical dynamics, and that in order to appreciate what’s right about (what was at least at one time) Boltzmann’s explanatory project, one has to fully apprehend the nature of microphysical causal structure, time-reversal invariance, and the relationship between Boltzmann entropy and the work of Rudolf Clausius.
Boltzmannian statistical mechanics partitions the phase space of a system into macro-regions, and the largest of these is identified with equilibrium. What justifies this identification? Common answers focus on Boltzmann’s combinatorial argument, the Maxwell-Boltzmann distribution, and maximum entropy considerations. We argue that they fail and present a new answer. We characterise equilibrium as the macrostate in which a system spends most of its time and prove a new theorem establishing that equilibrium thus defined corresponds to the largest (...)
In Boltzmannian statistical mechanics macro-states supervene on micro-states. This leads to a partitioning of the state space of a system into regions of macroscopically indistinguishable micro-states. The largest of these regions is singled out as the equilibrium region of the system. What justifies this association? We review currently available answers to this question and find them wanting both for conceptual and for technical reasons. We propose a new conception of equilibrium and prove a mathematical theorem which establishes in full generality (...)
There are results which show that measure-theoretic deterministic models and stochastic models are observationally equivalent. Thus there is a choice between a deterministic and an indeterministic model and the question arises: Which model is preferable relative to evidence? If the evidence equally supports both models, there is underdetermination. This paper first distinguishes between different kinds of choice and clarifies the possible resulting types of underdetermination. Then a new answer is presented: the focus is on the choice between a Newtonian deterministic (...)
There are two main theoretical frameworks in statistical mechanics, one associated with Boltzmann and the other with Gibbs. Despite their well-known differences, there is a prevailing view that equilibrium values calculated in both frameworks coincide. We show that this is wrong. There are important cases in which the Boltzmannian and Gibbsian equilibrium concepts yield different outcomes. Furthermore, the conditions under which equilibria exist differ for Gibbsian and Boltzmannian statistical mechanics. There are, however, special circumstances under which it is true (...)
A popular view in contemporary Boltzmannian statistical mechanics is to interpret the measures as typicality measures. In measure-theoretic dynamical systems theory measures can similarly be interpreted as typicality measures. However, a justification of why these measures are a good choice of typicality measures is missing, and the paper attempts to fill this gap. The paper first argues that Pitowsky’s justification of typicality measures does not fit the bill. Then a first proposal of how to justify typicality measures is presented. The main (...)
It can be shown that certain kinds of classical deterministic and indeterministic descriptions are observationally equivalent. Then the question arises: which description is preferable relative to evidence? This paper looks at the main argument in the literature for the deterministic description by Winnie (The cosmos of science—essays of exploration. Pittsburgh University Press, Pittsburgh, pp 299–324, 1998). It is shown that this argument yields the desired conclusion relative to observations that are possible in principle, where there are no limits, in principle, on observational (...)
This paper develops a philosophical investigation of the merits and faults of a theorem by Lanford for the problem of the approach towards equilibrium in statistical mechanics. Lanford’s result shows that, under precise initial conditions, the Boltzmann equation can be rigorously derived from the Hamiltonian equations of motion for a hard-sphere gas in the Boltzmann-Grad limit, thereby proving the existence of a unique solution of the Boltzmann equation, at least for a very short amount of (...)
Explaining the emergence of stochastic irreversible macroscopic dynamics from time-reversible deterministic microscopic dynamics is one of the key problems in philosophy of physics. The Mori-Zwanzig (MZ) projection operator formalism, which is one of the most important methods of modern nonequilibrium statistical mechanics, allows for a systematic derivation of irreversible transport equations from reversible microdynamics and thus provides a useful framework for understanding this issue. However, discussions of the MZ formalism in philosophy of physics tend to focus on simple variants rather than (...)
The multiple-computations problem discovered by Hilary Putnam presents a deep difficulty for functionalism (of all sorts, computational and causal). We describe in outline why Putnam’s result, and likewise the more restricted result we call the Multiple-Computations Theorem, are in fact theorems of statistical mechanics. We show why the mere interaction of a computing system with its environment cannot single out a computation as the preferred one amongst the many computations implemented by the system. We explain why nonreductive (...)
The arrow of time is a familiar phenomenon we all know from our experience: we remember the past but not the future and control the future but not the past. However, it takes an effort to keep records of the past, and to affect the future. For example, it would take an immense effort to unmix coffee and milk, although we easily mix them. Such time-directed phenomena are subsumed under the Second Law of Thermodynamics. This law characterizes our (...)
Information, entropy, probability: these three terms are closely interconnected in the prevalent understanding of statistical mechanics, both when this field is taught to students at an introductory level and in advanced research into the field’s foundations. This paper examines the interconnection between these three notions in light of recent research in the foundations of statistical mechanics. It disentangles these concepts and highlights their differences, at the same time explaining why they came to be so closely linked in the literature. In (...)
Statistical mechanics is a strange theory. Its aims are debated, its methods are contested, its main claims have never been fully proven, and their very truth is challenged, yet at the same time, it enjoys huge empirical success and gives us the feeling that we understand important phenomena. What is this weird theory, exactly? Statistical mechanics is the name of the ongoing attempt to apply mechanics, together with some auxiliary hypotheses, to explain and predict certain phenomena, above all those described (...)
Statistical mechanics is the name of the ongoing attempt to explain and predict certain phenomena, above all those described by thermodynamics, on the basis of the fundamental theories of physics, in particular mechanics, together with certain auxiliary assumptions. In another paper in this journal, Foundations of statistical mechanics: Mechanics by itself, I have shown that some of the thermodynamic regularities, including the probabilistic ones, can be described in terms of mechanics by itself. But in order to prove those regularities, in (...)
I point out that some common folk wisdom about time reversal invariance in classical mechanics is strictly incorrect, by showing some explicit examples in which classical time reversal invariance fails, even among conservative systems. I then show that there is nevertheless a broad class of familiar classical systems that are time reversal invariant.
While the fundamental laws of physics are time-reversal invariant, most macroscopic processes are irreversible. Given that the fundamental laws are taken to underpin all other processes, how can the fundamental time-symmetry be reconciled with the asymmetry manifest elsewhere? In statistical mechanics, progress can be made with this question. What I dub the ‘Zwanzig–Zeh–Wallace framework’ can be used to construct the irreversible equations of statistical mechanics from the underlying microdynamics. Yet this framework uses coarse-graining, a procedure that has faced much criticism. I (...)
In his mature writings, Kuhn describes the process of specialisation as driven by a form of incommensurability, defined as a conceptual/linguistic barrier which promotes and guarantees the insularity of specialties. In this paper, we reject the idea that the incommensurability among scientific specialties is a linguistic barrier. We argue that the problem with Kuhn’s characterisation of the incommensurability among specialties is that he presupposes a rather abstract theory of semantic incommensurability, which he then tries to apply to his description of (...)
In the history of science, the birth of classical chemistry and thermodynamics produced an anomaly within the Newtonian mechanical paradigm: force and acceleration were no longer central concepts of these new sciences. Scholars tried to reintroduce them within mechanistic approaches, as in the case of the kinetic theory of gases. Nevertheless, thermodynamics in general, and its Second Law in particular, gradually affirmed their role as dominant, irreducible cognitive paradigms for various scientific disciplines: more than twenty formulations of the Second Law (a sort of indisputable intellectual wealth) are (...)
This paper investigates Jaynes’ “unbelievably short proof” of the 2nd law of thermodynamics. It assesses published criticisms of the proof and concludes that these criticisms miss the mark by demanding results that either import expectations of a proof not consistent with an information-theoretic approach, or would require assumptions not employed in the proof itself, as it looks only to establish a weaker conclusion. Finally, a weakness in the proof is identified and illustrated. This weakness stems from the fact that Jaynes’ (...)
This article distinguishes two different senses of information-theoretic approaches to statistical mechanics that are often conflated in the literature: those relating to the thermodynamic cost of computational processes and those that offer an interpretation of statistical mechanics where the probabilities are treated as epistemic. This distinction is then investigated through Earman and Norton’s ([1999]) ‘sound’ and ‘profound’ dilemma for information-theoretic exorcisms of Maxwell’s demon. It is argued that Earman and Norton fail to countenance a ‘sound’ information-theoretic interpretation and this paper (...)
In this paper, I compare the use of the thermodynamic limit in the theory of phase transitions with the infinite-time limit in the explanation of equilibrium statistical mechanics. In the case of phase transitions, I will argue that the thermodynamic limit can be justified pragmatically since the limit behavior also arises before we get to the limit and for values of N that are physically significant. However, I will contend that the justification of the infinite-time limit is less straightforward. In (...)
We often use symmetries to infer outcomes’ probabilities, as when we infer that each side of a fair coin is equally likely to come up on a given toss. Why are these inferences successful? I argue against answering this with an a priori indifference principle. Reasons to reject that principle are familiar, yet instructive. They point to a new, empirical explanation for the success of our probabilistic predictions. This has implications for indifference reasoning in general. I argue that a priori (...)
This pair of articles provides a critical commentary on contemporary approaches to statistical mechanical probabilities. These articles focus on the two ways of understanding these probabilities that have received the most attention in the recent literature: the epistemic indifference approach, and the Lewis-style regularity approach. These articles describe these approaches, highlight the main points of contention, and make some attempts to advance the discussion. The first of these articles provides a brief sketch of statistical mechanics, and discusses the indifference approach (...)
This pair of articles provides a critical commentary on contemporary approaches to statistical mechanical probabilities. These articles focus on the two ways of understanding these probabilities that have received the most attention in the recent literature: the epistemic indifference approach, and the Lewis-style regularity approach. These articles describe these approaches, highlight the main points of contention, and make some attempts to advance the discussion. The second of these articles discusses the regularity approach to statistical mechanical probabilities, and describes some areas (...)
I discuss the formal implementation, interpretation, and justification of likelihood attributions in cosmology. I show that likelihood arguments in cosmology suffer from significant conceptual and formal problems that undermine their applicability in this context.
The conspicuous similarities between interpretive strategies in classical statistical mechanics and in quantum mechanics may be grounded on their employment of common implementations of probability. The objective probabilities which represent the underlying stochasticity of these theories can be naturally associated with three of their common formal features: initial conditions, dynamics, and observables. Various well-known interpretations of the two theories line up with particular choices among these three ways of implementing probability. This perspective has significant application to debates on primitive ontology (...)
In this paper I propose an interpretation of classical statistical mechanics that centers on taking seriously the idea that probability measures represent complete states of statistical mechanical systems. I show how this leads naturally to the idea that the stochasticity of statistical mechanics is associated directly with the observables of the theory rather than with the microstates (as traditional accounts would have it). The usual assumption that microstates are representationally significant in the theory is therefore dispensable, a consequence which suggests (...)
It is commonly maintained that neuroplastic mechanisms in the brain provide empirical support for the hypothesis of multiple realizability. We show in various case studies that neuroplasticity stems from preexisting mechanisms and processes inherent in the neural structure of the brain. We argue that not only does neuroplasticity fail to provide empirical evidence of multiple realization, its inability to do so strengthens the mind-body identity theory. Finally, we argue that a recently proposed identity theory called Flat Physicalism can be enlisted (...)
I highlight that the aim of using statistical mechanics to underpin irreversible processes is, strictly speaking, ambiguous. Traditionally, however, the task of underpinning irreversible processes has been thought to be synonymous with underpinning the Second Law of thermodynamics. I claim that contributors to the foundational discussion are best interpreted as aiming to provide a microphysical justification of the Minus First Law, despite the ways their aims are often stated. I suggest that contributors should aim at accounting for both the Minus (...)
Despite its formal precision and its great many applications, Shannon’s theory still offers an active terrain of debate when the interpretation of its main concepts is the task at issue. In this article we try to analyze certain points that still remain obscure or subject to discussion, and whose elucidation contributes to the assessment of the different interpretative proposals about the concept of information. In particular, we argue for a pluralist position, according to which the different views about information (...)
Christopher Timpson proposes a deflationary view about information, according to which the term ‘information’ is an abstract noun and, as a consequence, information is not part of the material contents of the world. The main purpose of the present article is to supply a critical analysis of this proposal, which will lead us to conclude that information is an item even more abstract than Timpson claims. From this view, we embrace a pluralist stance that recognizes the legitimacy of different (...)
This paper presents an in-depth analysis of the anatomy of both thermodynamics and statistical mechanics, together with the relationships between their constituent parts. Based on this analysis, using the renormalization group and finite-size scaling, we give a definition of a large but finite system and argue that phase transitions are represented correctly, as incipient singularities in such systems. We describe the role of the thermodynamic limit. And we explore the implications of this picture of critical phenomena for the questions of (...)
The relationship between statistical mechanics and population genetics has a long history. Both take advantage of statistics to address the behavior of large groups of entities. The main objective of this article is to assess the obstacles population genetics faces in its claim to explain biological phenomena with the conceptual apparatus of statistical mechanics, according to two recent articles. Several tools available to the latter are missing in the former. Thus, in the absence of an adequate justification of the (...)
Must a theory of quantum gravity have some truth to it if it can recover general relativity in some limit of the theory? This paper answers this question in the negative by indicating that general relativity is multiply realizable in quantum gravity. The argument is inspired by spacetime functionalism—multiple realizability being a central tenet of functionalism—and proceeds via three case studies: induced gravity, thermodynamic gravity, and entanglement gravity. In these, general relativity in the form of the Einstein field equations can (...)
There is a longstanding debate on the metaphysical relation between quantum states and the systems they describe. A series of relatively recent ψ-ontology theorems have been taken to show that, provided one accepts certain assumptions, “quantum states are real”. In this paper I investigate the question of what that claim might be taken to mean in light of these theorems. It is argued that, even if one accepts the framework and assumptions (...)
We start by very briefly describing the measurement problem in quantum mechanics and its solution by the Many Worlds Interpretation. We then describe the preferred basis problem, and the role of decoherence in the MWI. We discuss a number of approaches to the preferred basis problem and argue that contrary to the received wisdom, decoherence by itself does not solve the problem. We address Wallace’s emergentist approach based on what he calls Dennett’s criterion, and we compare the logical structure of (...)
This paper makes a novel linkage between the multiple-computations theorem in philosophy of mind and Landauer’s principle in physics. The multiple-computations theorem implies that certain physical systems implement simultaneously more than one computation. Landauer’s principle implies that the physical implementation of “logically irreversible” functions is accompanied by minimal entropy increase. We show that the multiple-computations theorem is incompatible with, or at least challenges, the universal validity of Landauer’s principle. To this end we provide accounts of both ideas in terms of (...)
Can the second law of thermodynamics explain our mental experience of the direction of time? According to an influential approach, the past hypothesis of universal low entropy also explains how the psychological arrow comes about. We argue that although this approach has many attractive features, it cannot explain the psychological arrow after all. In particular, we show that the past hypothesis is neither necessary nor sufficient to explain the psychological arrow on the basis of current physics. We propose two necessary (...)
Special sciences (such as biology, psychology, economics) describe various regularities holding at some high macroscopic level. One of the central questions concerning these macroscopic regularities is how they are related to the laws of physics governing the underlying microscopic physical reality. In this paper we show how a macroscopic regularity may emerge from an underlying microscopic structure, and how the appearance of multiple realizability of the special sciences by physics comes about in a reductionist-physicalist framework. On this basis we (...)
According to influential views the probabilities in classical statistical mechanics and other special sciences are objective chances, although the underlying mechanical theory is deterministic, since the deterministic low level is inadmissible or unavailable from the high level. Here two intuitions pull in opposite directions: One intuition is that if the world is deterministic, probability can only express subjective ignorance. The other intuition is that probability of high-level phenomena, especially thermodynamic ones, is dictated by the state of affairs in the world. (...)
In a previous article, we have demonstrated by a general phase space argument that a Maxwellian Demon is compatible with statistical mechanics. In this article, we show how this idea can be put to work in the prevalent model of the Demon, namely, a particle-in-a-box, used, for example, by Szilard and Bennett. In the literature, this model is used in order to show that a Demon is incompatible with statistical mechanics, either classical or quantum. However, we show that a detailed (...)
In our book The Road to Maxwell’s Demon (RMD) (Cambridge University Press 2012) we proposed a new outline for a reductive account of statistical mechanics in which thermodynamics is reduced to classical mechanics. In a recent review, Valia Allori says that we misunderstood Boltzmann’s account of statistical mechanics with respect to two issues: (1) the nature of typicality considerations in Boltzmann’s explanation of the Second Law - and here she provides no argument whatsoever; and (2) Boltzmann’s notion of probability. As (...)
This paper describes a version of type identity physicalism, which we call Flat Physicalism, and shows how it meets several objections often raised against identity theories. This identity theory is informed by recent results in the conceptual foundations of physics, and in particular clarifies the notion of ‘physical kinds’ in light of a conceptual analysis of the paradigmatic case of reducing thermodynamics to statistical mechanics. We show how Flat Physicalism is compatible with the appearance of multiple realisation (...)
The histories interpretation provides a consistent realistic ontology for quantum mechanics, based on two main ideas. First, a logic is employed which is compatible with the Hilbert-space structure of quantum mechanics as understood by von Neumann: quantum properties and their negations correspond to subspaces and their orthogonal complements. It employs a special syntactical rule to construct meaningful quantum expressions, quite different from the quantum logic of Birkhoff and von Neumann. Second, quantum time development is treated as an inherently stochastic process (...)
Classical statistical mechanics posits probabilities for various events to occur, and these probabilities seem to be objective chances. This does not seem to sit well with the fact that the theory’s time evolution is deterministic. We argue that the tension between the two is only apparent. We present a theory of Humean objective chance and show that chances thus understood are compatible with underlying determinism and provide an interpretation of the probabilities we find in Boltzmannian statistical mechanics.
There are two theoretical approaches in statistical mechanics, one associated with Boltzmann and the other with Gibbs. The theoretical apparatus of the two approaches offers distinct descriptions of the same physical system, with no obvious way to translate the concepts of one formalism into those of the other. This raises the question of the status of one approach vis-à-vis the other. We answer this question by arguing that the Boltzmannian approach is a fundamental theory while Gibbsian statistical mechanics is an (...)