Quantum mechanics is generally regarded as the physical theory that is our best candidate for a fundamental and universal description of the physical world. The conceptual framework employed by this theory differs drastically from that of classical physics. Indeed, the transition from classical to quantum physics marks a genuine revolution in our understanding of the physical world.
A recent argument by Hawthorne and Lasonen-Aarnio purports to show that we can uphold the principle that competently forming conjunctions is a knowledge-preserving operation only at the cost of a rampant skepticism about the future. A key premise of their argument is that, in light of quantum-mechanical considerations, future contingents never quite have chance 1 of being true. We argue, by drawing attention to the order of magnitude of the relevant quantum probabilities, that the skeptical threat of Hawthorne and Lasonen-Aarnio’s argument is illusory.
Roughly speaking, classical statistical physics is the branch of theoretical physics that aims to account for the thermal behaviour of macroscopic bodies in terms of a classical mechanical model of their microscopic constituents, with the help of probabilistic assumptions. In the last century and a half, a fair number of approaches have been developed to meet this aim. This study of their foundations assesses their coherence and analyzes the motivations for their basic assumptions and the interpretations of their central concepts. The most outstanding foundational problems are the explanation of time-asymmetry in thermal behaviour, the relative autonomy of thermal phenomena from their microscopic underpinning, and the meaning of probability. A more or less historical survey is given of the work of Maxwell, Boltzmann and Gibbs in statistical physics, and of the problems and objections to which their work gave rise. Next, we review some modern approaches to (i) equilibrium statistical mechanics, such as ergodic theory and the theory of the thermodynamic limit; and to (ii) non-equilibrium statistical mechanics, as provided by Lanford's work on the Boltzmann equation, the so-called Bogolyubov-Born-Green-Kirkwood-Yvon approach, and stochastic approaches such as 'coarse-graining' and the 'open systems' approach. In all cases, we focus on the subtle interplay between probabilistic assumptions, dynamical assumptions, initial conditions and other ingredients used in these approaches.
The aim of this article is to analyse the relation between the second law of thermodynamics and the so-called arrow of time. For this purpose, a number of different aspects of this arrow of time are distinguished, in particular those of time-reversal (non-)invariance and of (ir)reversibility. Next, I review versions of the second law in the work of Carnot, Clausius, Kelvin, Planck, Gibbs, Carathéodory, and Lieb and Yngvason, and investigate their connection with these aspects of the arrow of time. It is shown that this connection varies a great deal across these formulations of the second law. According to the famous formulation by Planck, the second law expresses the irreversibility of natural processes. But in many other formulations irreversibility, or even time-reversal non-invariance, plays no role. I therefore argue for the view that the second law has nothing to do with the arrow of time.
I consider the problem of extending Reichenbach's principle of the common cause to more than two events, vis-à-vis an example posed by Bernstein. It is argued that the only reasonable extension of Reichenbach's principle stands in conflict with a recent proposal due to Horwich. I also discuss the prospects of the principle of the common cause in the light of these and other difficulties known in the literature, and argue that a more viable version of the principle is the one provided by Penrose and Percival (1962).
Bohr and Heisenberg suggested that the thermodynamical quantities of temperature and energy are complementary in the same way as position and momentum in quantum mechanics. Roughly speaking, their idea was that a definite temperature can be attributed to a system only if it is submerged in a heat bath, in which case energy fluctuations are unavoidable. On the other hand, a definite energy can be assigned only to systems in thermal isolation, thus excluding the simultaneous determination of their temperature. Rosenfeld extended this analogy with quantum mechanics and obtained a quantitative uncertainty relation of the form ΔU Δ(1/T) ≥ k, where k is Boltzmann's constant. The two “extreme” cases of this relation would then characterize this complementarity between isolation (U definite) and contact with a heat bath (T definite). Other formulations of the thermodynamical uncertainty relations were proposed by Mandelbrot (1956, 1989), Lindhard (1986), and Lavenda (1987, 1991). This work, however, has not led to a consensus in the literature. It is shown here that the uncertainty relation for temperature and energy in the version of Mandelbrot is indeed exactly analogous to modern formulations of the quantum mechanical uncertainty relations. However, his relation holds only for the canonical distribution, describing a system in contact with a heat bath. There is, therefore, no complementarity between this situation and a thermally isolated system.
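Mandelbrot's version of the relation can be read as an instance of the Cramér-Rao bound of estimation theory: for the canonical distribution, the spread of any estimate of the inverse temperature β = 1/kT is bounded below by the reciprocal of the energy spread. The following numerical sketch is my illustration, not taken from the paper; it assumes a hypothetical two-level system with energy gap ε, in units where k = 1, and estimates β by maximum likelihood from N simulated energy measurements.

```python
import numpy as np

rng = np.random.default_rng(0)

eps, beta = 1.0, 1.0                                  # two-level system: energies 0 and eps
p = np.exp(-beta * eps) / (1 + np.exp(-beta * eps))   # canonical probability of the excited level
sigma_U = eps * np.sqrt(p * (1 - p))                  # energy spread ΔU per measurement

N, trials = 400, 2000
beta_hats = []
for _ in range(trials):
    E = rng.binomial(1, p, size=N) * eps              # N simulated energy measurements
    Ubar = min(max(E.mean(), 1e-9), eps - 1e-9)       # keep the estimator finite
    # maximum-likelihood estimate: invert U(beta) = eps / (exp(beta*eps) + 1)
    beta_hats.append(np.log(eps / Ubar - 1) / eps)

# Cramér-Rao: Δβ ≥ 1 / (ΔU √N), so this product is ≥ 1, ≈ 1 for the MLE
product = np.std(beta_hats) * sigma_U * np.sqrt(N)
```

For the maximum-likelihood estimator the product ΔU · Δβ̂ · √N approaches the Cramér-Rao limit of 1, which in conventional units is Mandelbrot's ΔU Δ(1/T) ≥ k for a single observation.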
The concepts of uncertainty in prediction and inference are introduced and illustrated using the diffraction of light as an example. The close relationship between the concepts of uncertainty in inference and resolving power is noted. A general quantitative measure of uncertainty in inference can be obtained by means of the so-called statistical distance between probability distributions. When applied to quantum mechanics, this distance leads to a measure of the distinguishability of quantum states, which is essentially the absolute value of the matrix element between the states. The importance of this result for the quantum mechanical uncertainty principle is noted. The second part of the paper provides a derivation of the statistical distance on the basis of the so-called method of support.
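The statistical distance in question is the angle arccos Σᵢ √(pᵢqᵢ) between two probability distributions p and q; maximized over measurements, it reduces for pure quantum states to the arccosine of the absolute value of their inner product. The following sketch is my own illustration of that fact, not the paper's derivation; it assumes two real qubit states for simplicity and scans measurement bases.

```python
import numpy as np

def statistical_distance(p, q):
    # statistical distance between probability distributions p and q:
    # the arccosine of the Bhattacharyya coefficient sum_i sqrt(p_i q_i)
    return np.arccos(np.clip(np.sum(np.sqrt(p * q)), 0.0, 1.0))

# two non-orthogonal (real) qubit states, 0.4 radians apart
psi = np.array([1.0, 0.0])
phi = np.array([np.cos(0.4), np.sin(0.4)])
quantum_angle = np.arccos(abs(psi @ phi))     # arccos |<psi|phi>| = 0.4

# measuring in many orthonormal bases: the classical statistical distance
# of the outcome distributions never exceeds the quantum angle, and the
# best measurement attains it
dists = []
for theta in np.linspace(0.0, np.pi, 2000):
    basis = np.array([[np.cos(theta), np.sin(theta)],
                      [-np.sin(theta), np.cos(theta)]])
    p = (basis @ psi) ** 2                    # Born-rule outcome probabilities
    q = (basis @ phi) ** 2
    dists.append(statistical_distance(p, q))
```

Here `max(dists)` coincides with `quantum_angle`, while every individual measurement yields a distance at most that large: distinguishability of the quantum states is the optimized statistical distance.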