The second law of thermodynamics is traditionally interpreted as a coarse-grained result of classical mechanics. Recently its relation to quantum mechanical processes such as decoherence and measurement has been revealed in the literature. In this paper we formulate the second law and the associated time irreversibility following Everett’s idea: systems entangled with an object come to know the branch in which they live. Accounting for this self-locating knowledge, we obtain two forms of entropy: objective entropy, measuring the uncertainty of the state of the object alone, and subjective entropy, measuring the information carried by the self-locating knowledge. By showing that the sum of the two forms of entropy is a conserved and perspective-free quantity, we interpret the second law as a statement of irreversibility in knowledge acquisition. This in effect derives the thermodynamic arrow of time from the subjective arrow of time, and provides a unified explanation for the varieties of the second law, as well as for the past hypothesis.
Ruetsche argues that the occurrence of unitarily inequivalent representations in quantum theories with infinitely many degrees of freedom poses a novel interpretational problem. According to Ruetsche, such theories compel us to reject the so-called ideal of pristine interpretation; she puts forward the ‘coalescence approach’ as an alternative. In this paper I offer a novel defence of the coalescence approach. The defence rests on the claim that the ideal of pristine interpretation already fails before one considers the peculiarities of QM∞: there are pre-QM∞ parallels to coalescence. Despite this departure from pristinism, the ‘modest’ view that emerges poses no threat to scientific realism.
If there are fundamental laws of nature, can they fail to be exact? In this paper, I consider the possibility that some fundamental laws are vague. I call this phenomenon 'fundamental nomic vagueness.' I characterize fundamental nomic vagueness as the existence of borderline lawful worlds and the presence of several other accompanying features. Under certain assumptions, such vagueness prevents the fundamental physical theory from being completely expressible in mathematical language. Moreover, I suggest that such vagueness can be regarded as 'vagueness in the world.' For a case study, we turn to the Past Hypothesis, a postulate that (partially) explains the direction of time in our world. We have reasons to take it seriously as a candidate fundamental law of nature. Yet it is vague: it admits borderline (nomologically) possible worlds. An exact version would lead to an untraceable arbitrariness absent from any other fundamental law. However, the dilemma between fundamental nomic vagueness and untraceable arbitrariness is dissolved in a new quantum theory of time's arrow.
Entanglement measures quantify the amount of quantum entanglement contained in quantum states. In general, different entanglement measures need not be partially ordered. The presence of a definite partial order between two entanglement measures for all quantum states, however, allows for a meaningful notion of sensitivity to entanglement, which is greater for the entanglement measure that produces the larger numerical values. Here, we have investigated the partial order between the normalized versions of four entanglement measures based on the Schmidt decomposition of bipartite pure quantum states, namely concurrence, tangle, entanglement robustness, and Schmidt number. We have shown that among these four measures, the concurrence and the Schmidt number have the highest and the lowest sensitivity to quantum entanglement, respectively. Further, we have demonstrated how these measures can be used to track the dynamics of quantum entanglement in a simple quantum toy model composed of two qutrits. Lastly, we have employed state-dependent entanglement statistics to compute measurable correlations between the outcomes of quantum observables in agreement with the uncertainty principle. The presented results could be helpful in quantum applications that require monitoring of the available quantum resources for sharp identification of temporal points of maximal entanglement or system separability.
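As a quick companion to this abstract, here is a minimal sketch (Python/NumPy) of how the four pure-state measures can be computed from the squared Schmidt coefficients. The normalizations are common textbook conventions, not necessarily the ones the paper adopts:

```python
import numpy as np

def schmidt_probs(psi, dA, dB):
    """Squared Schmidt coefficients of a bipartite pure state in C^dA (x) C^dB."""
    s = np.linalg.svd(psi.reshape(dA, dB), compute_uv=False)
    return s**2  # these sum to 1 for a normalized state

def measures(p):
    """Common pure-state entanglement measures from squared Schmidt coefficients p."""
    purity = np.sum(p**2)
    concurrence = np.sqrt(2.0 * (1.0 - purity))   # generalized concurrence
    tangle = concurrence**2
    robustness = np.sum(np.sqrt(p))**2 - 1.0      # Vidal-Tarrach pure-state robustness
    schmidt_number = 1.0 / purity                 # effective (inverse-participation) Schmidt number
    return concurrence, tangle, robustness, schmidt_number

# Two-qutrit example, echoing the toy model mentioned in the abstract
state = np.zeros(9)
state[[0, 4, 8]] = np.sqrt([0.6, 0.3, 0.1])      # |00>, |11>, |22> components
print(measures(schmidt_probs(state, 3, 3)))
```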
Conventional wisdom holds that the von Neumann entropy corresponds to thermodynamic entropy, but Hemmo and Shenker (2006) have recently argued against this view by attacking von Neumann's (1955) argument. I argue that Hemmo and Shenker's arguments fail due to several misunderstandings: about the statistical-mechanical and thermodynamic domains of applicability, about the nature of mixed states, and about the role of approximations in physics. As a result, their arguments fail in all cases: the single-particle case, the finite-particle case, and the infinite-particle case.
The paper investigates the understanding of quantum indistinguishability after quantum information, in comparison with the “classical” quantum mechanics based on the separable complex Hilbert space. The two oppositions, “distinguishability / indistinguishability” and “classical / quantum”, implicit in the concept of quantum indistinguishability, can be interpreted as two “missing” bits of classical information, which are to be added after teleportation of quantum information in order to restore the initial state unambiguously. That new understanding of quantum indistinguishability is linked to the distinction between classical (Maxwell-Boltzmann) and quantum (either Fermi-Dirac or Bose-Einstein) statistics. The latter can be generalized to classes of wave functions (“empty” qubits) and represented exhaustively in Hilbert arithmetic, and is therefore connectible to the foundations of mathematics, more precisely to the interrelations of propositional logic and set theory, which share the structure of Boolean algebra, and two anti-isometric copies of Peano arithmetic.
Maxwell’s Demon is a thought experiment devised by J. C. Maxwell in 1867 in order to show that the Second Law of thermodynamics is not universal, since it has a counter-example. Since the Second Law is taken by many to provide an arrow of time, the threat to its universality threatens that account of temporal directionality as well. Various attempts to “exorcise” the Demon, by proving that it is impossible for one reason or another, have been made throughout the years, but none of them has been successful. We have shown (in a number of publications) by a general state-space argument that Maxwell’s Demon is compatible with classical mechanics, and that the most recent solutions, based on Landauer’s thesis, are not general. In this paper we demonstrate that Maxwell’s Demon is also compatible with quantum mechanics. We do so by analyzing a particular (but highly idealized) experimental setup and proving that it violates the Second Law. Our discussion is in the framework of standard quantum mechanics; we give two separate arguments, with and without the projection postulate. We address in our analysis the connection between measurement and erasure interactions, and we show how these notions apply in the microscopic quantum mechanical structure. We discuss what might be the quantum mechanical counterpart of the classical notion of “macrostates”, thus explaining why our Quantum Demon setup works not only at the micro level but also at the macro level, properly understood. One implication of our analysis is that the Second Law cannot provide a universal lawlike basis for an account of the arrow of time; this account has to be sought elsewhere.
Following the experimental discovery of the Higgs boson, physicists explained the discovery to the public by appealing to analogies with condensed matter physics. The historical root of these analogies is the analogies to models of superconductivity that inspired the introduction of spontaneous symmetry breaking (SSB) into particle physics in the early 1960s. We offer a historical and philosophical analysis of the analogies between the Higgs model of the electroweak (EW) interaction and the Ginzburg-Landau (GL) and Bardeen-Cooper-Schrieffer (BCS) models of superconductivity, respectively. The conclusion of our analysis is that both sets of analogies are purely formal, in virtue of the fact that they are accompanied by substantial physical disanalogies. In particular, the formal analogies do not map the temporal, causal, or modal structures of SSB in superconductivity to temporal, causal, or modal structures in the Higgs model. These substantial physical disanalogies mean that analogies to models of superconductivity cannot supply the basis for the physical interpretation of EW SSB; however, an appreciation of the contrast between the physical interpretations of SSB in superconductivity and in the Higgs model does help to clarify some foundational issues. Unlike SSB in superconductivity, SSB in the Higgs sector of the Standard Model is neither a temporal nor a causal process. We discuss the implications for the `eating' metaphor for mass gain in the Higgs model. Furthermore, the distinction between the phenomenological GL model and the dynamical BCS model does not carry over to EW models, which clarifies the desiderata for so-called `dynamical' models of EW SSB. Finally, the development of the Higgs model is an illuminating case study for philosophers of science because it illustrates how purely formal analogies can play a fruitful heuristic role in physics.
Entanglement is one of the most striking features of quantum mechanics, and yet it is not specifically quantum. More specific to quantum mechanics is the connection between entanglement and thermodynamics, which leads to an identification between entropies and measures of pure state entanglement. Here we search for the roots of this connection, investigating the relation between entanglement and thermodynamics in the framework of general probabilistic theories. We first address the question whether an entangled state can be transformed into another by means of local operations and classical communication. Under two operational requirements, we prove a general version of the Lo-Popescu theorem, which lies at the foundations of the theory of pure-state entanglement. We then consider a resource theory of purity where free operations are random reversible transformations, modelling the scenario where an agent has limited control over the dynamics of a closed system. Our key result is a duality between the resource theory of entanglement and the resource theory of purity, valid for every physical theory where all processes arise from pure states and reversible interactions at the fundamental level. As an application of the main result, we establish a one-to-one correspondence between entropies and measures of pure bipartite entanglement and exploit it to define entanglement measures in the general probabilistic framework. In addition, we show a duality between the task of information erasure and the task of entanglement generation, whereby the existence of entropy sinks (systems that can absorb arbitrary amounts of information) becomes equivalent to the existence of entanglement sources (correlated systems from which arbitrary amounts of entanglement can be extracted).
In quantum theory every state can be diagonalized, i.e. decomposed as a convex combination of perfectly distinguishable pure states. This elementary structure plays a ubiquitous role in quantum mechanics, quantum information theory, and quantum statistical mechanics, where it provides the foundation for the notions of majorization and entropy. A natural question then arises: can we reconstruct these notions from purely operational axioms? We address this question in the framework of general probabilistic theories, presenting a set of axioms that guarantee that every state can be diagonalized. The first axiom is Causality, which ensures that the marginal of a bipartite state is well defined. Then, Purity Preservation states that the set of pure transformations is closed under composition. The third axiom is Purification, which allows one to assign a pure state to the composition of a system with its environment. Finally, we introduce the axiom of Pure Sharpness, stating that for every system there exists at least one pure effect occurring with unit probability on some state. For theories satisfying our four axioms, we exhibit a constructive algorithm for diagonalizing every given state. The diagonalization result allows us to formulate a majorization criterion that captures the convertibility of states in the operational resource theory of purity, where random reversible transformations are regarded as free operations.
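In the familiar quantum case, the diagonalization these axioms are designed to recover is just the spectral decomposition of the density matrix; a short NumPy check (an illustration of the target structure, not of the paper's operational algorithm) makes the point:

```python
import numpy as np

# Spectral decomposition: rho = sum_i p_i |e_i><e_i|, with orthonormal
# (hence perfectly distinguishable) pure states and convex weights p_i.
rho = np.array([[0.7, 0.2], [0.2, 0.3]])  # a valid qubit density matrix
p, V = np.linalg.eigh(rho)                # weights p and eigenvector columns V
rebuilt = sum(p[i] * np.outer(V[:, i], V[:, i].conj()) for i in range(len(p)))
assert np.allclose(rho, rebuilt)
print("convex weights:", p, "sum:", p.sum())  # nonnegative, summing to 1
```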
Muller and Saunders purport to demonstrate that, surprisingly, bosons and fermions are discernible; this article disputes their arguments, then derives a similar conclusion in a more satisfactory fashion. After briefly explicating their proof and indicating how it escapes earlier indiscernibility results, we note that the observables which Muller and Saunders argue discern particles are (i) non-symmetric in the case of bosons and (ii) trivial multiples of the identity in the case of fermions. Both problems undermine the claim that they have shown particles to be physically discernible. We then prove two results concerning observables that are truly physical: one showing when particles are discernible and one showing when they are not (categorically) discernible. Along the way we clarify some frequently misunderstood issues concerning the interpretation of quantum observables.
In the sixth section of his light quantum paper of 1905, Einstein presented the miraculous argument, as I shall call it. Pointing out an analogy with ideal gases and dilute solutions, he showed that the macroscopic, thermodynamic properties of high frequency heat radiation carry a distinctive signature of finitely many, spatially localized, independent components and so inferred that it consists of quanta. I describe how Einstein’s other statistical papers of 1905 had already developed and exploited the idea that the ideal gas law is another macroscopic signature of finitely many, spatially localized, independent components and that these papers in turn drew on his first two, “worthless” papers of 1901 and 1902 on intermolecular forces. However, while the ideal gas law was a secure signature of independence, it was harder to use as an indicator that there are finitely many components and that they are spatially localized. Further, since his analysis of the ideal gas law depended on the assumption that the number of components was fixed, its use was precluded for heat radiation, whose component quanta vary in number in most processes. So Einstein needed and found another, more powerful signature of discreteness applicable to heat radiation and which indicated all these properties. It used one of the few processes, volume fluctuation, in which heat radiation does not alter the number of quanta.
There have been attempts to derive anti-haecceitistic conclusions from the fact that quantum mechanics (QM) appeals to non-standard statistics. Since in fact QM acknowledges two kinds of such statistics, Bose-Einstein and Fermi-Dirac, I argue that we could in the same vein derive the sharper anti-haecceitistic conclusion that bosons are bundles of tropes and fermions are bundles of universals. Moreover, since standard statistics is still appropriate at the macrolevel, we could also venture to say that no anti-haecceitistic conclusion is warranted for ordinary objects, which could then tentatively be identified with substrates. In contrast to this, however, there has so far been no acknowledgement of the possibility of inclusivism, according to which ontological accounts of particulars as widely different as these can coexist in one world picture. The success of the different statistics in physics at least calls for a revision in this respect.
Two approaches toward the arrow of time for scattering processes have been proposed in rigged Hilbert space quantum mechanics. One, due to Arno Bohm, involves preparations and registrations in laboratory operations and results in two semigroups oriented in the forward direction of time. The other, employed by the Brussels-Austin group, is more general, involving excitations and de-excitations of systems, and apparently results in two semigroups oriented in opposite directions of time. It turns out that these two time arrows can be related to each other via Wigner's extensions of the spacetime symmetry group. Furthermore, there are subtle differences in causality as well as in the possibilities for the existence and creation of time-reversed states, depending on which time arrow is chosen.
In this paper the quantum covariant relativistic dynamics of many bodies is reconsidered. It is emphasized that this is an event dynamics. The events are quantum statistically correlated by the global parameter τ. The derivation of an event Boltzmann equation emphasizes this. It is shown that this Boltzmann equation may be viewed as exact in a dilute event limit ignoring three-event correlations. A quantum entropy principle is obtained for the marginal Wigner distribution function. By means of event linking (concatenations), particle properties such as the equation of state may be obtained. We further reconsider the generalized quantum equilibrium ensemble theory and the free event case of the Fermi-Dirac and Bose-Einstein distributions, and some consequences. The ultra-relativistic limit differs from the non-covariant theory and is a test of this point of view.
Arno Bohm and Ilya Prigogine's Brussels-Austin Group have been working on the quantum mechanical arrow of time and irreversibility in rigged Hilbert space quantum mechanics. A crucial notion in Bohm's approach is the so-called preparation/registration arrow. An analysis of this arrow and its role in Bohm's theory of scattering is given. Similarly, the Brussels-Austin Group uses an excitation/de-excitation arrow for ordering events, which is also analyzed. The relationship between the two approaches is discussed, focusing on their semigroup operators and time arrows. Finally, a possible realist interpretation of the rigged Hilbert space formulation of quantum mechanics is considered.
Classical mechanics is empirically successful because the probabilistic mean values of quantum mechanical observables follow the classical equations of motion to a good approximation (Messiah 1970, 215). We examine this claim for the one-dimensional motion of a particle in a box, and extend the idea by deriving a special case of the ideal gas law in terms of the mean value of a generalized force used to define "pressure." The examples illustrate the importance of probabilistic averaging as a method of abstracting away from the messy details of microphenomena, not only in physics, but in other sciences as well.
One hundred years ago, in 1916, Einstein rederived Planck's blackbody radiation law, which had originated the idea of quantized energy. For this purpose, Einstein introduced the concept of transition probability, which had a profound influence on the development of quantum theory. In this article, we adopt Einstein's assumptions with two exceptions and seek the statistical condition for the thermal equilibrium of matter without referring to the inner details of either statistical thermodynamics or quantum theory. It is shown that the conditions of thermodynamic equilibrium of electromagnetic radiation and the energy balance of thermal radiation by the matter, between any two of its energy-states, not only result in Planck's radiation law and the Bohr frequency condition, but remarkably also yield the law of the statistical thermal equilibrium of matter: the Maxwell–Boltzmann distribution. Since the transition probabilities of the modern quantum theory of radiation coincide with their definition in Einstein's theory of blackbody radiation, the presented deduction of the Maxwell–Boltzmann distribution is equally valid within the bounds of modern quantum theory. Consequently, within the framework of the fundamental assumptions, the Maxwell–Boltzmann distribution of energy-states is not only a sufficient but a necessary condition for thermal equilibrium between matter and radiation.
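The balance condition at issue can be written compactly. The following is a standard textbook reconstruction (notation mine, not necessarily the paper's): for two energy states with E_n − E_m = hν, stationarity of the radiation field requires

```latex
N_m\,B_{m\to n}\,\rho(\nu) \;=\; N_n\left[A_{n\to m} + B_{n\to m}\,\rho(\nu)\right].
```

With B_{m→n} = B_{n→m}, inserting Planck's law ρ(ν) = (A/B)(e^{hν/kT} − 1)^{−1} forces N_n/N_m = e^{−hν/kT}, the Maxwell–Boltzmann ratio; run the other way, assuming the Boltzmann ratio yields Planck's law, which is the direction Einstein took in 1916.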
If a classical system has infinitely many degrees of freedom, its Hamiltonian quantization need not be unique up to unitary equivalence. I sketch different approaches (Hilbert space and algebraic) to understanding the content of quantum theories in light of this non‐uniqueness, and suggest that neither approach suffices to support explanatory aspirations encountered in the thermodynamic limit of quantum statistical mechanics.
The thermodynamic behavior of a single classical charged particle in thermal equilibrium with classical electromagnetic thermal radiation, while electrostatically bound by a fixed charge distribution of opposite sign, is analyzed. A quasistatic displacement of this system in an applied electrostatic potential is investigated. Treating the system nonrelativistically, the change in internal energy, the work done, and the change in caloric entropy are all shown to be expressible in terms of averages involving the distribution of the position coordinates alone. A convenient representation for the probability distribution is shown to be the ensemble average of the absolute square of an expansion over the eigenstates of a Schrödinger-like equation, since the heat flow is shown to vanish for each hypothetical “state.” Subject to key assumptions highlighted here, the demand that the entropy be a function of state results in statistical averages in agreement with the form in quantum statistical mechanics. Examining the very low and very high temperature situations yields Planck's and Boltzmann's constants. The blackbody radiation spectrum is then deduced. From the viewpoint of the theory explored here, the method in quantum statistical mechanics of statistically counting the “states” at thermal equilibrium by using the energy eigenvalue structure is simply a convenient counting scheme, rather than one actually representing averages involving physically discrete energy states.
In quantum mechanics, the expression for entropy is usually taken to be −k Tr(ρ ln ρ), where ρ is the density matrix. The convention first appears in von Neumann's Mathematical Foundations of Quantum Mechanics. The argument given there to justify this convention is the only one hitherto offered. All the arguments in the field refer to it at one point or another. Here this argument is shown to be invalid. Moreover, it is shown that, if entropy is −k Tr(ρ ln ρ), then perpetual motion machines are possible. This and other considerations support the conclusion that this expression is not the quantum-mechanical correlate of thermodynamic entropy. Its usefulness in quantum-statistical mechanics can be explained by its being a convenient quantification of information, but information and entropy are not synonymous. As the present paper shows, one can change while the other is conserved.
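For concreteness, a minimal sketch of the quantity under dispute, S = −k Tr(ρ ln ρ), computed from the spectrum of ρ (with the standard 0 ln 0 → 0 convention):

```python
import numpy as np

def von_neumann_entropy(rho, k=1.0):
    """S = -k Tr(rho ln rho), evaluated via the eigenvalues of rho."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]                    # drop zero eigenvalues: 0*ln(0) -> 0
    return -k * np.sum(p * np.log(p))

print(von_neumann_entropy(np.eye(2) / 2))                       # maximally mixed qubit: ln 2
print(von_neumann_entropy(np.array([[1.0, 0.0], [0.0, 0.0]])))  # pure state: 0
```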
Quasi-probability distribution functions f_j^WW and f_j^MM for quantum spin-j systems are derived based on the Wigner-Weyl and Margenau-Hill approaches, respectively. A probability distribution f_j^sph, which is nonzero only on the surface of the sphere of radius √(j(j+1)), is obtained by expressing the characteristic function in terms of the spherical moments. It is shown that the Wigner-Weyl distribution function turns out to be a distribution over the sphere in the classical limit.
I discuss the question: Is it possible to prepare, by purely thermodynamic means, an ensemble described by a quantum state having a definite phase relation between two component states which have never been in direct contact? Resolution of this question requires us to take explicit account of the nature of the correlations between the system and its thermal environment.
A fundamental problem in understanding the nature of time is explaining its directionality. This 1990 PhD thesis re-examines the concepts of time flow, the physical directionality of time, and the semantics of tensed language. Several novel results are argued for that contradict the orthodox anti-realist views still dominant in the subject. Specifically, the concept of "metaphysical time flow" is supported as a valid scientific concept, and argued to be intrinsic to the directionality of objective probabilities in quantum mechanics; the common claim that quantum probability theory is time-reversible is shown to rest on an analytic error, stemming from a false choice of criterion for the reversibility of probabilistic theories (recognized by Satosi Watanabe in the 1950s but ignored in all philosophical discussions); and a consistent semantics for tensed language (adapted from the tree model of Storrs McCall) is constructed, showing that the common rejection of "time flow" as having no meaningful semantics is false. These debates are still ongoing in almost exactly the same state they were in pre-1990, and there appears to be no visible progress in the subject. Critical points made against errors in the orthodox account (which has been sustained for 70 years by the anti-realist philosophy of time, typified by the "Pittsburgh School" of Grünbaum-Earman-Norton-Roberts) are still not recognized in the philosophy of time or physics. Some key technical proofs in this thesis have been published in physics proper. See p.ii-iii for the full original abstract. (This pdf is uploaded from the Massey University archive.)
Using Schrödinger's generalized probability relations of quantum mechanics, it is possible to generate a canonical ensemble, the ensemble normally associated with thermodynamic equilibrium, by at least two methods that do not involve thermodynamic equilibration: statistical mixing and subensemble selection. Thus the question arises as to whether an observer making measurements upon systems from a canonical ensemble can determine whether the systems were prepared by mixing, equilibration, or selection. Investigation of this issue exposes antinomies in quantum statistical thermodynamics. It is conjectured that resolution of these paradoxes may involve a new law of motion in quantum dynamics.
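A small NumPy illustration of why the question has teeth: two of the preparation methods the abstract names, statistical mixing and subensemble selection (here modelled, as an assumption, by reducing a thermofield-double-style purification), yield literally the same density matrix, so no measurement statistics can tell them apart:

```python
import numpy as np

beta = 1.0
E = np.array([0.0, 1.0])                    # two-level spectrum (illustrative)
p = np.exp(-beta * E); p /= p.sum()         # Boltzmann weights

# Preparation 1: statistical mixing of energy eigenstates with weights p
rho_mix = np.diag(p)

# Preparation 2: subensemble selection -- reduce an entangled pure state
psi = np.diag(np.sqrt(p))                   # coefficient matrix of a purification
rho_sel = psi @ psi.conj().T                # partial trace over the partner system

print(np.allclose(rho_mix, rho_sel))        # True: the ensembles are indistinguishable
```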
It is shown that, in the quantum theory of systems with a finite number of degrees of freedom that employs a set of algebraic states, a statistical element, introduced by averaging the mean values of operators over the distribution of continuous quantities (a spectral point of a canonical operator, and time), is conserved under the limiting transition to the δ-distribution. On that basis, quantum statistical dynamics, i.e., a theory in which dynamics (time evolution) includes a statistical element, is advanced. The theory is equivalent to orthodox quantum mechanics as regards the orthodox states, but differs essentially with respect to coherence properties in a continuous spectrum. A theory of the measurement process, including the statistical interpretation of quantum mechanics, and a theory of irreversibility are constructed, and the law of increasing chaos, which is a strengthening of the law of entropy increase, is obtained. In our theory, mechanics and statistics are organically connected, whereby the fundamental nature of probabilities in quantum physics manifests itself.
In an earlier paper by one of us [K.-E. Hellwig (1981)], elements of discrete quantum stochastic processes, which arise when the classical probability space is replaced by quantum theory, were considered. In the present paper a general formulation is given and its properties are compared with those of classical stochastic processes. In particular, it is asked whether such processes can be Markovian. An example is given, and similarities to methods in quantum statistical thermodynamics are pointed out.
The aim of the present paper is to show that the formalism of equilibrium quantum statistical mechanics can be fully incorporated into Ludwig's embedding scheme for classical theories in many-body quantum mechanics. A construction procedure, based on a recently developed reconstruction procedure for the so-called macro-observable, is presented which leads to the explicit determination of the set of classical ensembles compatible with the embedding scheme.
It is shown that the traditional formalism of equilibrium quantum statistical mechanics may be fully incorporated into a general macro-observable approach to quantum statistical mechanics recently proposed by the same author. (1,2) In particular, the partition functions, which in the traditional approach are assumed to connect nonnormalized density operators with thermodynamic functions, are reinterpreted as functions connecting so-called quantum mechanical effect operators with state parameters. It is argued that these functions, although only part of a much richer internal structure of the macro-observable, are sufficient to cope with all problems one usually encounters in equilibrium quantum statistical mechanics. "For really we undertake in vain to express the essence of a thing. We become aware of effects, and a complete history of these effects would at best encompass the essence of that thing." (Johann W. v. Goethe, Farbenlehre)
Various formalisms for recasting quantum mechanics in the framework of classical mechanics on phase space are reviewed and compared. Recent results in stochastic quantum mechanics are shown to avoid the difficulties encountered by the earlier approach of Wigner, as well as to avoid the well-known incompatibilities of relativity and ordinary quantum theory. Specific mappings among the various formalisms are given.
The aim of this paper is to reconcile the two modes of description of macrosystems, i.e., to remove certain inconsistencies between the classical phenomenological and the quantum-theoretical descriptions of a macrosystem. Starting from Ludwig's formulation of a general framework for classical theories and his ansatz for a compatibility condition between the quantum theoretical and the classical mode of description for a macrosystem, we try to make clear what the “classical content” of many-body quantum theory really is. It is shown that this classical content may be described by a certain “observable,” i.e., an operator-valued measure over the Borel sets of the classical trajectory space. There exists a reduced time evolution within this classical content, which has the structure of a true semigroup in the case of an irreversible classical time evolution.
If the ordinary quantal Liouville equation ℒρ = ρ̇ is generalized by discarding the customary stricture that ℒ be of the standard Hamiltonian commutator form, the new quantum dynamics that emerges has sufficient theoretical fertility to permit the description even of a thermodynamically irreversible process in an isolated system, i.e., a motion ρ(t) in which entropy increases but energy is conserved. For a two-level quantum system, the complete family of time-independent linear superoperators ℒ that generate such motions is derived, and a physically interesting example is presented in detail.
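A numerical toy in the same spirit (an assumption-laden sketch of my own, not the paper's derived family): pure dephasing of a two-level system is a linear, non-commutator generator under which the energy expectation stays constant while the entropy grows:

```python
import numpy as np

def entropy(rho):
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return -np.sum(p * np.log(p))

# d(rho)/dt = gamma * (Z rho Z - rho): populations (hence energy, for H ~ Z)
# are untouched while the off-diagonal coherences decay.
Z = np.diag([1.0, -1.0])
gamma, dt = 1.0, 0.01
rho = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)  # pure state |+><+|
for step in range(301):
    if step % 100 == 0:
        print(f"<Z> = {np.trace(rho @ Z).real:+.3f}   S = {entropy(rho):.4f}")
    rho = rho + dt * gamma * (Z @ rho @ Z - rho)          # Euler step
```

The printout shows ⟨Z⟩ pinned at 0 while S climbs from 0 toward ln 2, i.e. entropy increase at fixed energy in an isolated (but non-Hamiltonian) evolution.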
In previous publications we have criticized the usual application of information theory to quantal situations and proposed a new version of information-theoretic quantum statistics. This paper is the first in a two-part series in which our new approach is applied to the fundamental problem of thermodynamic equilibrium. Part I deals in particular with informational definitions of equilibrium and the identification of thermodynamic analogs in our modified quantum statistics formalism.
This communication is part I of a series of papers which explore the theoretical possibility of generalizing quantum dynamics in such a way that the predicted motions of an isolated system would include the irreversible (entropy-increasing) state evolutions that seem essential if the second law of thermodynamics is ever to become a theorem of mechanics. In this first paper, the general mathematical framework for describing linear but not necessarily Hamiltonian mappings of the statistical operator is reviewed, with particular attention to detailed representations of the Kossakowski conditions for the case of a two-level system.
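For orientation, the Kossakowski conditions characterize the generators now usually written in Gorini–Kossakowski–Sudarshan–Lindblad form (quoted here from general knowledge, not from the paper):

```latex
\dot\rho \;=\; -\frac{i}{\hbar}\,[H,\rho]
\;+\; \sum_k \gamma_k \left( L_k\,\rho\,L_k^{\dagger} - \tfrac{1}{2}\{L_k^{\dagger}L_k,\;\rho\} \right),
\qquad \gamma_k \ge 0,
```

which guarantees that the mapping of the statistical operator is linear, trace-preserving, and completely positive.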
A universal, unified theory of transformations of physical systems based on the propositions of probabilistic physics is developed. This is applied to the treatment of decay processes and intramolecular rearrangements. Some general features of decay processes are elucidated. A critical analysis of the conventional quantum theories of decay and of Slater's quantum theory of intramolecular rearrangements is given. It is explained why, despite the in-principle incorrectness of the decay theories, they can give correct estimates of decay rate constants. The reasons for the validity of the Arrhenius formula for the temperature dependence of an intramolecular rearrangement rate constant are discussed. A criterion for the possibility of a proper intramolecular rearrangement is given. The issue of causality in quantum physics is settled.
The use of joint distribution functions for noncommuting observables in quantum thermodynamics is investigated in the light of L. Cohen's proof that such distributions are not determined by the quantum state. Cohen's proof is irrelevant to uses of the functions that do not depend on interpreting them as distributions. An example of this, from quantum Onsager theory, is discussed. Other uses presuppose that correlations between p and q values depend at least on the state. But correlations may be fixed by the state even though the distribution varies from one ensemble to another represented by that state. Taking covariance as a measure of correlation, it is shown that the different commonly used joint distributions yield the same correlations for a given state. A general characterization is given for a family of distributions with this same covariance.
The Jarzynski equality equates the mean of the exponential of the negative of the work (per fixed temperature) done by a changing Hamiltonian on a system, initially in thermal equilibrium at that temperature, to the ratio of the final to the initial equilibrium partition functions of the system at that fixed temperature. It thus relates two thermal equilibrium quantum states.
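Spelled out as a formula (a direct transcription of the verbal statement above, with β = 1/kT):

```latex
\left\langle e^{-\beta W} \right\rangle \;=\; \frac{Z_{\mathrm{final}}(\beta)}{Z_{\mathrm{initial}}(\beta)} \;=\; e^{-\beta\,\Delta F}.
```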
The recent renewed interest in the foundation of quantum statistical mechanics and in the dynamics of isolated quantum systems has led to a revival of the old approach by von Neumann to investigate the problem of thermalization only in terms of quantum dynamics in an isolated system [1, 2]. It has been demonstrated in some general or concrete settings that a pure initial state evolving under quantum dynamics indeed approaches an equilibrium state [3–9]. The underlying idea that a single pure quantum state can fully describe thermal equilibrium has also become much more concrete [10–12].
In Bohmian mechanics the distribution |ψ|² is regarded as the equilibrium distribution. We consider its uniqueness, finding that it is the unique equivariant distribution that is also a local functional of the wave function ψ.
We discuss the content and significance of John von Neumann’s quantum ergodic theorem (QET) of 1929, a strong result arising from the mere mathematical structure of quantum mechanics. The QET is a precise formulation of what we call normal typicality, i.e., the statement that, for typical large systems, every initial wave function ψ_0 from an energy shell is “normal”: it evolves in such a way that |ψ_t⟩⟨ψ_t| is, for most t, macroscopically equivalent to the micro-canonical density matrix. The QET was mostly forgotten after it was criticized as a dynamically vacuous statement in several papers in the 1950s. However, we point out that this criticism does not apply to the actual QET, a correct statement of which does not appear in those papers, but to a different (indeed weaker) statement. Furthermore, we formulate a stronger statement of normal typicality, based on the observation that the bound on the deviations from the average specified by von Neumann is unnecessarily coarse, and that a much tighter (and more relevant) bound actually follows from his proof.
We consider an isolated, macroscopic quantum system. Let H be a microcanonical “energy shell,” i.e., a subspace of the system’s Hilbert space spanned by the (finitely many) energy eigenstates with energies between E and E + δE. The thermal equilibrium macro-state at energy E corresponds to a subspace H_eq of H such that dim H_eq / dim H is close to 1. We say that a system with state vector ψ ∈ H is in thermal equilibrium if ψ is “close” to H_eq. We show that for “typical” Hamiltonians with given eigenvalues, all initial state vectors ψ_0 evolve in such a way that ψ_t is in thermal equilibrium for most times t. This result is closely related to von Neumann’s quantum ergodic theorem of 1929.
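The flavor of the result is easy to reproduce numerically. The sketch below (a toy illustration under my own choices of dimension and observable, not the theorem's proof) draws a "typical" random Hamiltonian and shows that the weight of ψ_t in a fixed half-dimensional subspace relaxes from 1 toward its equilibrium value, the analogue of dim H_eq / dim H:

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 200, 100                                   # total dimension, subspace dimension
A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
H = (A + A.conj().T) / 2                          # a "typical" (GUE-like) Hamiltonian
E, V = np.linalg.eigh(H)

psi0 = np.zeros(d, dtype=complex); psi0[0] = 1.0  # start inside the subspace
c = V.conj().T @ psi0                             # amplitudes in the energy basis

for t in [0.0, 5.0, 50.0, 500.0]:
    psi_t = V @ (c * np.exp(-1j * E * t))         # unitary evolution
    weight = np.sum(np.abs(psi_t[:k])**2)         # <psi_t|P|psi_t>, P = first-k projector
    print(f"t={t:6.1f}   weight = {weight:.3f}   (equilibrium ~ {k/d})")
```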
The renewed interest in the foundations of quantum statistical mechanics in recent years has led us to study John von Neumann’s 1929 article on the quantum ergodic theorem. We have found this almost forgotten article, which until now has been available only in German, to be a treasure chest, and also to be much misunderstood. In it, von Neumann studied the long-time behavior of macroscopic quantum systems. While one of the two theorems announced in his title, the one he calls the “quantum H-theorem,” is actually a much weaker statement than Boltzmann’s classical H-theorem, the other theorem, which he calls the “quantum ergodic theorem,” is a beautiful and very non-trivial result. It expresses a fact we call “normal typicality” and can be summarized as follows: for a “typical” finite family of commuting macroscopic observables, every initial wave function ψ_0 from a micro-canonical energy shell evolves in such a way that, for most times t in the long run, the joint probability distribution of these observables obtained from ψ_t is close to their micro-canonical distribution.
Using linear invariant operators in a constructive way we find the most general thermal density operator and Wigner function for time-dependent generalized oscillators. The general Wigner function has five free parameters and describes the thermal Wigner function about a classical trajectory in phase space. The contour of the Wigner function depicts an elliptical orbit with a constant area moving about the classical trajectory, whose eccentricity determines the squeezing of the initial vacuum.
The utilisation of quantum theories within social science and biology is often reasonably met with dubiety. It would be even more controversial should such theories be applied to concepts under the domain of eugenics. Nonetheless, this can open up a fresh and unique understanding of theories that are usually understood through their classical structure. We will provide quantum interpretations of dysgenics and dysgenic traits from different scopes and procedures. The way dysgenic traits are in flux with the environment they interact with will be analysed, as well as how they interact with each other under quantum conditions. We will also take into account factors such as intelligence, genetic heritability, and other biological and cognitive factors, and try to study their frameworks in non-classical ways. Using what we have theorised, we will also attempt to create numerical and empirical analyses of some of the theories that we have proposed.
A major part of Einstein’s 1905 light quantum paper is devoted to arguing that high frequency heat radiation bears the characteristic signature of a microscopic energy distribution of independent, spatially localized components. The content of his light quantum proposal was precarious in that it contradicted the great achievement of nineteenth century physics, the wave theory of light and its accommodation in electrodynamics. However, the methods used to arrive at it were both secure and familiar to Einstein in 1905. A mainstay of Einstein’s research in statistical physics, extending back to his earliest publications of 1901 and 1902, had been the inferring of the microscopic constitution of systems from their macroscopic properties. In his statistical work of 1905, Einstein dealt with several thermal systems consisting of many independent, spatially localized components. They were the dilute sugar solutions of his doctoral dissertation and the suspended particles of his Brownian motion paper.
Textbooks in quantum mechanics frequently claim that quantum mechanics explains the success of classical mechanics because “the mean values [of quantum mechanical observables] follow the classical equations of motion to a good approximation,” provided “the dimensions of the wave packet be small with respect to the characteristic dimensions of the problem.” The equations in question are Ehrenfest’s famous equations. We examine this case for the one-dimensional motion of a particle in a box, and extend the idea by deriving a special case of the ideal gas law in terms of the mean value of a generalized force, which has been used in statistical mechanics to define ‘pressure’. The example may be an important test case for recent philosophical theories about the relationship between micro-theories and macro-theories in science.
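A compact reconstruction of the kind of derivation described here (standard particle-in-a-box formulas; identifying ⟨E⟩ with kT/2 is the usual one-dimensional thermal mean, supplied by me for illustration): for a particle in a box of width L,

```latex
E_n=\frac{n^2\pi^2\hbar^2}{2mL^2},\qquad
\langle F\rangle=\Big\langle -\frac{\partial \hat H}{\partial L}\Big\rangle=\frac{2\langle E\rangle}{L},
\qquad\text{so}\qquad
\langle F\rangle\,L \;=\; 2\langle E\rangle \;=\; kT \quad\text{if }\ \langle E\rangle=\tfrac{1}{2}kT,
```

a one-particle, one-dimensional analogue of PV = NkT, with the mean generalized force ⟨F⟩ playing the role of pressure.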