Entropy is ubiquitous in physics, and it plays important roles in numerous other disciplines ranging from logic and statistics to biology and economics. However, a closer look reveals a complicated picture: entropy is defined differently in different contexts, and even within the same domain different notions of entropy are at work. Some of these are defined in terms of probabilities, others are not. The aim of this chapter is to arrive at an understanding of some of the most important notions of entropy and to clarify the relations between them. After setting the stage by introducing the thermodynamic entropy, we discuss notions of entropy in information theory, statistical mechanics, dynamical systems theory and fractal geometry.
Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this paper is to give the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., qudits of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates that are distinguished by the measurement. Both the classical and quantum versions of logical entropy have simple interpretations as “two-draw” probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of information for quantum information theory focusing on the distinguishing of quantum states.
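The “two-draw” interpretation mentioned in the abstract above can be made concrete in the classical case. The following is a minimal illustrative sketch (the partition probabilities are invented for the example, not taken from the paper): for a partition whose blocks have probabilities p_i, the logical entropy h(p) = 1 − Σ p_i² is the probability that two independent draws land in *different* blocks, while the Shannon entropy is −Σ p_i log₂ p_i.

```python
from math import log2

def logical_entropy(p):
    # h(p) = 1 - sum(p_i^2): probability that two independent
    # draws fall in different blocks of the partition
    return 1 - sum(pi ** 2 for pi in p)

def shannon_entropy(p):
    # H(p) = -sum(p_i * log2(p_i)), in bits, for comparison
    return -sum(pi * log2(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]          # illustrative block probabilities
print(logical_entropy(p))      # 1 - (0.25 + 0.0625 + 0.0625) = 0.625
print(shannon_entropy(p))      # 1.5 bits
```

Note that a partition with a single block has logical entropy 0: two draws can never be distinguished.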
Daniel R. Brooks and E. O. Wiley have proposed a theory of evolution in which fitness is merely a rate determining factor. Evolution is driven by non-equilibrium processes which increase the entropy and information content of species together. Evolution can occur without environmental selection, since increased complexity and organization result from the likely capture at the species level of random variations produced at the chemical level. Speciation can occur as the result of variation within the species which decreases the probability of sharing genetic information. Critics of the Brooks-Wiley theory argue that they have abused terminology from information theory and thermodynamics. In this paper I review the essentials of the theory, and give an account of hierarchical physical information systems within which the theory can be interpreted. I then show how the major conceptual objections can be answered.
For the first time, Entropy has been completely revised and updated, with a new subtitle reflecting the expanded focus on the greenhouse effect--the largest crisis ever to face mankind.
The paper tries to demonstrate that the process of the increase of entropy does not explain the asymmetry of time itself because it is unable to account for its fundamental asymmetries, that is, the asymmetry of traces (we have traces of the past and no traces of the future), the asymmetry of causation (we have an impact on future events with no possibility of having an impact on the past), and the asymmetry between the fixed past and the open future. To this end, the approaches of Boltzmann, Reichenbach (and his followers), and Albert are analysed. It is argued that we should look for alternative approaches: we should either consider a temporally asymmetrical physical theory or seek a source of the asymmetry of time in metaphysics. This second approach may even turn out to be complementary if only we accept that metaphysics can complement scientific research programmes.
Both a method of therapy and an exploration of psychic reality, free association is a fundamental element of psychoanalytical practices that refers to the way a patient is asked to describe what comes spontaneously to mind in the therapeutic setting. This paper examines the role of free association from the point of view of psychoanalysis and neuroscience in order to improve our understanding of therapeutic effects induced by psychoanalytic therapies and psychoanalysis. In this regard, we first propose a global overview of the historical origins of the concept of free association in psychoanalysis and examine how Freud established its principles. Then, from Freud’s distinction between primary and secondary processes, we proceed to compare the psychoanalytic model with research originating from cognitive psychology and neuroscience. The notions of entropy and free energy appear particularly relevant at the intersection of these different domains. Finally, we propose the notion of symbolizing transmodality to describe certain specificities of symbolization processes within free association and we summarize the main functions of free association in psychoanalytic practices.
The present paper is a survey of the economics of survival, a branch of ecological economics that stresses the preservation of the opportunities of future generations over an extended time horizon. It outlines the main analytical foundation of the branch – in which the concept of entropy is a major building block – and its analysis of the interaction between the economic system and the environment. Regarding its outlook of the future, we see that the founders of the branch were mainly concerned with the consequences of a serious depletion of natural resources – particularly the energetic capital of the earth. More recently, however, emphasis is being placed on problems that stem from the fragility of the global ecosystem in face of the disturbances caused by the entropic acceleration imposed by mankind. It is feared that the ongoing expansion of the scale of the economy may bring about irreversible damages to vital environmental functions, such as protection against undesirable consequences of solar radiation, maintenance of temperature within a range that will support life, and preservation of ecosystem resiliency.
Although the laws of thermodynamics are well established for black hole horizons, much less has been said in the literature to support the extension of these laws to more general settings such as an asymptotic de Sitter horizon or a Rindler horizon (the event horizon of an asymptotic uniformly accelerated observer). In the present paper we review the results that have been previously established and argue that the laws of black hole thermodynamics, as well as their underlying statistical mechanical content, extend quite generally to what we call here “causal horizons.” The root of this generalization is the local notion of horizon entropy density.
The Madelung equations map the non-relativistic time-dependent Schrödinger equation into hydrodynamic equations of a virtual fluid. While the von Neumann entropy remains constant, we demonstrate that an increase of the Shannon entropy, associated with this Madelung fluid, is proportional to the expectation value of its velocity divergence. Hence, the Shannon entropy may grow due to an expansion of the Madelung fluid. These effects result from the interference between solutions of the Schrödinger equation. Growth of the Shannon entropy due to expansion is common in diffusive processes. However, in the latter the process is irreversible while the processes in the Madelung fluid are always reversible. The relations between interference, compressibility and variation of the Shannon entropy are then examined in several simple examples. Furthermore, we demonstrate that for classical diffusive processes, the “force” accelerating diffusion has the form of the positive gradient of the quantum Bohm potential. Expressing then the diffusion coefficient in terms of the Planck constant reveals the lower bound given by the Heisenberg uncertainty principle in terms of the product between the gas mean free path and the Brownian momentum.
This essay is an attempt to reconcile the disturbing contradiction between the striving for order in nature and in man and the principle of entropy implicit in the second law of thermodynamics - between the tendency toward greater organization and the general trend of the material universe toward death and disorder.
This essay is, primarily, a discussion of four results about the principle of maximizing entropy (MAXENT) and its connections with Bayesian theory. Result 1 provides a restricted equivalence between the two: where the Bayesian model for MAXENT inference uses an "a priori" probability that is uniform, and where all MAXENT constraints are limited to 0-1 expectations for simple indicator-variables. The other three results report on an inability to extend the equivalence beyond these specialized constraints. Result 2 establishes a sensitivity of MAXENT inference to the choice of the algebra of possibilities even though all empirical constraints imposed on the MAXENT solution are satisfied in each measure space considered. The resulting MAXENT distribution is not invariant over the choice of measure space. Thus, old and familiar problems with the Laplacian principle of Insufficient Reason also plague MAXENT theory. Result 3 builds upon the findings of Friedman and Shimony (1971; 1973) and demonstrates the absence of an exchangeable, Bayesian model for predictive MAXENT distributions when the MAXENT constraints are interpreted according to Jaynes's (1978) prescription for his (1963) Brandeis Dice problem. Lastly, Result 4 generalizes the Friedman and Shimony objection to cross-entropy (Kullback-information) shifts subject to a constraint of a new odds-ratio for two disjoint events.
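Result 3 refers to Jaynes's Brandeis Dice problem: find the MAXENT distribution for a six-sided die constrained to a given mean. As a rough illustrative sketch (not the paper's own computation), the maximizer has the exponential form p_i ∝ exp(λi), and λ can be found numerically by bisection on the mean:

```python
from math import exp

def maxent_die(target_mean, lo=-5.0, hi=5.0, tol=1e-12):
    # MAXENT distribution over faces 1..6 subject to a mean constraint;
    # p_i is proportional to exp(lam * i), with lam fixed by the mean
    def mean_for(lam):
        w = [exp(lam * i) for i in range(1, 7)]
        z = sum(w)
        return sum(i * wi for i, wi in zip(range(1, 7), w)) / z
    # mean_for is monotone increasing in lam, so bisection works
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [exp(lam * i) for i in range(1, 7)]
    z = sum(w)
    return [wi / z for wi in w]

# Jaynes's example: mean 4.5 instead of the fair-die value 3.5
p = maxent_die(4.5)
print([round(pi, 4) for pi in p])
```

For a mean above 3.5 the resulting probabilities increase monotonically from face 1 to face 6, as expected.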
Ethics is an unconventional field of research for a surgeon, as ethics in surgery owns several specificities and surgery is considered an aggressive specialty. Therefore, the interest of research in medical ethics is sometimes unclear. In this short essay, we discussed the interest of research in medical ethics using a comparison to thermodynamics and, mainly, entropy. During the transformation of a system from one state to another, some energy is released or absorbed; yet a part of this energy is wasted because of “unordered” reactions: this is entropy. This “wasted energy” exists in medical practice and justifies research in medical ethics.
One well-known objection to the principle of maximum entropy is the so-called Judy Benjamin problem, first introduced by van Fraassen. The problem turns on the apparently puzzling fact that, on the basis of information relating an event’s conditional probability, the maximum entropy distribution will almost always assign to the event conditionalized on a probability strictly less than that assigned to it by the uniform distribution. In this article, I present an analysis of the Judy Benjamin problem that can help to make sense of this seemingly odd feature of maximum entropy inference. My analysis is based on the claim that, in applying the principle of maximum entropy, Judy Benjamin is not acting out of a concern to maximize uncertainty in the face of new evidence, but is rather exercising a certain brand of epistemic charity towards her informant. This epistemic charity takes the form of an assumption on the part of Judy Benjamin that her informant’s evidential report leaves out no relevant information. Such a reconceptualization of the motives underlying Judy Benjamin’s appeal to the principle of maximum entropy can help to further our understanding of the true epistemological grounds of this principle and correct a common misapprehension regarding its relationship to the principle of insufficient reason. 1 Introduction 2 The Principle of Maximum Entropy 3 An Apologia for Judy Benjamin 4 Conclusion: Entropy and Insufficient Reason.
I discuss the statistical mechanics of gravitating systems and in particular its cosmological implications, and argue that many conventional views on this subject in the foundations of statistical mechanics embody significant confusion; I attempt to provide a clearer and more accurate account. In particular, I observe that (i) the role of gravity in entropy calculations must be distinguished from the entropy of gravity, that (ii) although gravitational collapse is entropy-increasing, this is not usually because the collapsing matter itself increases in entropy, and that (iii) the Second Law of thermodynamics does not owe its validity to the statistical mechanics of gravitational collapse.
Entropy and information are both emerging as currencies of interdisciplinary dialogue, most recently in evolutionary theory. If this dialogue is to be fruitful, there must be general agreement about the meaning of these terms. That this is not presently the case owes principally to the supposition of many information theorists that information theory has succeeded in generalizing the entropy concept. The present paper will consider the merits of the generalization thesis, and make some suggestions for restricting both entropy and information to specific arenas of discourse.
I assess the thesis that counterfactual asymmetries are explained by an asymmetry of the global entropy at the temporal boundaries of the universe, by developing a method of evaluating counterfactuals that includes, as a background assumption, the low entropy of the early universe. The resulting theory attempts to vindicate the common practice of holding the past mostly fixed under counterfactual supposition while at the same time allowing the counterfactual's antecedent to obtain by a natural physical development. Although the theory has some success in evaluating a wide variety of ordinary counterfactuals, it fails as an explanation of counterfactual asymmetry.
A new axiomatic characterization with a minimum of conditions for entropy as a function on the set of states in quantum mechanics is presented. Traditionally unspoken assumptions are unveiled and replaced by proven consequences of the axioms. First the Boltzmann–Planck formula is derived. Building on this formula, using the Law of Large Numbers—a basic theorem of probability theory—the von Neumann formula is deduced. Axioms used in older theories on the foundations are now derived facts.
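For orientation, the two formulas the axioms are said to recover can be written in standard notation (the symbols below are the conventional ones, not necessarily those of the paper):

```latex
% Boltzmann–Planck entropy of a uniform ensemble of W microstates:
S = k_B \ln W

% von Neumann entropy of a quantum state with density operator \rho:
S(\rho) = -k_B \,\mathrm{Tr}(\rho \ln \rho)
```

The von Neumann formula reduces to the Boltzmann–Planck formula when \rho is the maximally mixed state on a W-dimensional space, since then \mathrm{Tr}(\rho \ln \rho) = \ln(1/W).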
Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (pme) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey’s updating principle (jup) contradicts pme? Majerník shows that pme provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether pme also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates that in the special case introduced by Wagner pme does not contradict jup, but elegantly generalizes it and offers a more integrated approach to probability updating.
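Jeffrey's updating principle, as standardly stated, can be sketched in a few lines: given new marginal weights q_i for a partition {E_i}, the posterior is P'(A) = Σ_i P(A|E_i)·q_i. The numbers below are illustrative assumptions, not Wagner's or Majerník's cases:

```python
def jeffrey_update(joint, q):
    # joint[i] = (P(A & E_i), P(E_i)); q[i] = new weight assigned to E_i.
    # Jeffrey's rule: P'(A) = sum_i P(A | E_i) * q_i
    return sum((a_and_e / e) * qi for (a_and_e, e), qi in zip(joint, q))

# Illustrative prior: P(E1)=0.6 with P(A|E1)=0.5, P(E2)=0.4 with P(A|E2)=0.25
joint = [(0.30, 0.6), (0.10, 0.4)]
print(jeffrey_update(joint, [0.6, 0.4]))  # unchanged weights recover P(A)=0.4
print(jeffrey_update(joint, [0.2, 0.8]))  # weight shifted toward E2: 0.3
```

When the new weights equal the old marginals, Jeffrey's rule returns the prior probability, as it should; ordinary conditionalization is the special case where one q_i equals 1.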
In this essay I critically examine the role of entropy of mixing in articulating a macroscopic criterion for the sameness and difference of chemical substances. Consider three cases of mixing in which entropy change occurs: isotopic variants, spin isomers, and populations of atoms in different orthogonal quantum states. Using these cases I argue that entropy of mixing tracks differences between physical states, differences that may or may not correspond to a difference of substance. It does not provide a criterion for the sameness and difference of substance that is appropriate to chemistry.
Neural complexity and brain entropy have gained greater interest in recent years. The dynamics of neural signals and their relations with information processing continue to be investigated through different measures in a variety of noteworthy studies. The brain entropy (BEN) of spontaneous neural activity decreases during states of reduced consciousness. This evidence has been shown in primary consciousness states, such as psychedelic states, under the name of “the entropic brain hypothesis.” In this manuscript we propose an extension of this hypothesis to physiological and pathological aging. We review this particular facet of the complexity of the brain, mentioning studies that have investigated BEN in primary consciousness states, and extending this view to the field of neuroaging with a focus on resting-state functional Magnetic Resonance Imaging. We first introduce historic and conceptual ideas about entropy and neural complexity, treating the mind-brain as a complex nonlinear dynamic adaptive system, in light of the free energy principle. Then, we review the studies in this field, analyzing the idea that the aim of the neurocognitive system is to maintain a dynamic state of balance between order and chaos, both in terms of dynamics of neural signals and functional connectivity. In our exploration we will review studies both on acute psychedelic states and more chronic psychotic states and traits, such as those in schizophrenia, in order to show the increase of entropy in those states. Then we extend our exploration to physiological and pathological aging, where BEN is reduced. Finally, we propose an interpretation of these results, defining a general trend of BEN in primary states and cognitive aging.
In theories of gravity with a positive cosmological constant, we consider product solutions with flux, of the form (A)dS_p × S^q. Most solutions are shown to be perturbatively unstable, including all uncharged dS_p × S^q spacetimes. For dimensions greater than four, the stable class includes universes whose entropy exceeds that of de Sitter space, in violation of the conjectured “N-bound.” Hence, if quantum gravity theories with finite-dimensional Hilbert space exist, the specification of a positive cosmological constant will not suffice to characterize the class of spacetimes they describe.
The Anthropocene crisis is frequently described as the rarefaction of resources or resources per capita. However, both energy and minerals correspond to fundamentally conserved quantities from the perspective of physics. A specific concept is required to understand the rarefaction of available resources. This concept, entropy, pertains to energy and matter configurations and not just to their sheer amount. However, the physics concept of entropy is insufficient to understand biological and social organizations. Biological phenomena display both historicity and systemic properties. A biological organization, the ability of a specific living being to last over time, results from history, expresses itself by systemic properties, and may require generating novelties. The concept of anti-entropy stems from the combination of these features. We propose that Anthropocene changes disrupt biological organizations by randomizing them, that is, decreasing anti-entropy. Moreover, second-order disruptions correspond to the decline of the ability to produce functional novelties, that is, to produce anti-entropy.
Integrating concepts of maintenance and of origins is essential to explaining biological diversity. The unified theory of evolution attempts to find a common theme linking production rules inherent in biological systems, explaining the origin of biological order as a manifestation of the flow of energy and the flow of information on various spatial and temporal scales, with the recognition that natural selection is an evolutionarily relevant process. Biological systems persist in space and time by transforming energy from one state to another in a manner that generates structures that allow the system to continue to persist. Two classes of energetic transformations allow this: heat-generating transformations, resulting in a net loss of energy from the system, and conservative transformations, changing unusable energy into states that can be stored and used subsequently. All conservative transformations in biological systems are coupled with heat-generating transformations; hence, inherent biological production, or genealogical processes, is positively entropic. There is a self-organizing phenomenology common to genealogical phenomena, which imparts an arrow of time to biological systems. Natural selection, which by itself is time-reversible, contributes to the organization of the self-organized genealogical trajectories. The interplay of genealogical (diversity-promoting) and selective (diversity-limiting) processes produces biological order to which the primary contribution is genealogical history. Dynamic changes occurring on time scales shorter than speciation rates are microevolutionary; those occurring on time scales longer than speciation rates are macroevolutionary. Macroevolutionary processes are neither reducible to, nor autonomous from, microevolutionary processes.
We investigate uncertain reasoning with quantified sentences of the predicate calculus treated as the limiting case of maximum entropy inference applied to finite domains.
The idea that the changing entropy of a system is relevant to explaining why we know more about the system's past than about its future has been criticized on several fronts. This paper assesses the criticisms and clarifies the epistemology of the inference problem. It deploys a Markov process model to investigate the relationship between entropy and temporally asymmetric inference.
A probability distribution can be given to the set of isomorphism classes of models with universe {1, ..., n} of a sentence in first-order logic. We study the entropy of this distribution and derive a result from the 0–1 law for first-order sentences.
The cross-entropy measure is one of the best ways to calculate the divergence of a variable from a prior one. We define a new cross-entropy measure in an interval neutrosophic set environment.
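The classical baseline that such measures generalize is the Kullback–Leibler divergence of a distribution p from a prior q. The sketch below is that classical case only (the interval neutrosophic generalization of the abstract is not reproduced here), with illustrative distributions:

```python
from math import log

def kl_divergence(p, q):
    # D(p || q) = sum_i p_i * log(p_i / q_i)
    # assumes q_i > 0 wherever p_i > 0
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]   # observed distribution (illustrative)
q = [0.9, 0.1]   # prior distribution (illustrative)
print(kl_divergence(p, p))      # 0.0: no divergence from itself
print(kl_divergence(p, q) > 0)  # True: positive divergence from the prior
```

D(p‖q) is zero exactly when p = q and positive otherwise, which is what qualifies it as a divergence from the prior.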
We review some notions for general quantum entropies. The entropy of compound systems is discussed and a numerical computation of the quantum dynamical systems is carried out for the noisy optical channel.
This paper argues that striving is a cardinal virtue in sport and life. It is an overlooked virtue that is an important component of human happiness and a source of a sense of dignity. The human ps...
The language of entropy is examined for consistency with its mathematics and physics, and for its efficacy as a guide to what entropy means. Do common descriptors such as disorder, missing information, and multiplicity help or hinder understanding? Can the language of entropy be helpful in cases where entropy is not well defined? We argue in favor of the descriptor spreading, which entails space, time, and energy in a fundamental way. This includes spreading of energy spatially during processes and temporal spreading over accessible microstates in thermodynamic equilibrium. Various examples illustrate the value of the spreading metaphor. To provide further support for this metaphor’s utility, it is shown how a set of reasonable spreading properties can be used to derive the entropy function. A main conclusion is that it is appropriate to view entropy’s symbol S as shorthand for spreading.
The notion of conditional entropy is extended to noncomposite systems. The deformed entropic inequalities, which usually are associated with correlations of the subsystem degrees of freedom in bipartite systems, are found for the noncomposite systems. New entropic inequalities for quantum tomograms of qudit states including the single qudit states are obtained. The Araki–Lieb inequality is found for systems without subsystems.
The application of the maximum entropy principle to determine probabilities on finite domains is well-understood. Its application to infinite domains still lacks a well-studied comprehensive approach. There are two different strategies for applying the maximum entropy principle on first-order predicate languages: applying it to finite sublanguages and taking a limit; comparing finite entropies of probability functions defined on the language as a whole. The entropy-limit conjecture roughly says that these two strategies result in the same probabilities. While the conjecture is known to hold for monadic languages as well as for premiss sentences containing only existential or only universal quantifiers, its status for premiss sentences of greater quantifier complexity is, in general, unknown. I here show that the first approach fails to provide a sensible answer for some premiss sentences of greater quantifier complexity. I discuss implications of this failure for the first strategy and consequences for the entropy-limit conjecture.
In the sixth section of his light quantum paper of 1905, Einstein presented the miraculous argument, as I shall call it. Pointing out an analogy with ideal gases and dilute solutions, he showed that the macroscopic, thermodynamic properties of high frequency heat radiation carry a distinctive signature of finitely many, spatially localized, independent components and so inferred that it consists of quanta. I describe how Einstein’s other statistical papers of 1905 had already developed and exploited the idea that the ideal gas law is another macroscopic signature of finitely many, spatially localized, independent components and that these papers in turn drew on his first two, “worthless” papers of 1901 and 1902 on intermolecular forces. However, while the ideal gas law was a secure signature of independence, it was harder to use as an indicator that there are finitely many components and that they are spatially localized. Further, since his analysis of the ideal gas law depended on the assumption that the number of components was fixed, its use was precluded for heat radiation, whose component quanta vary in number in most processes. So Einstein needed and found another, more powerful signature of discreteness applicable to heat radiation and which indicated all these properties. It used one of the few processes, volume fluctuation, in which heat radiation does not alter the number of quanta.
It is shown that entropy increase in thermodynamic systems can plausibly be accounted for by the random action of vacuum radiation. A recent calculation by Rueda using stochastic electrodynamics (SED) shows that vacuum radiation causes a particle to undergo a rapid Brownian motion about its average dynamical trajectory. It is shown that the magnitude of spatial drift calculated by Rueda can also be predicted by assuming that the average magnitudes of random shifts in position and momentum of a particle correspond to the lower limits of the uncertainty relation. The latter analysis yields a plausible expression for the shift in momentum caused by vacuum radiation. It is shown that when the latter shift in momentum is magnified in particle interactions, the fractional change in each momentum component is on the order of unity within a few collision times, for gases and (plausibly) for denser systems over a very broad range of physical conditions. So any system of particles in this broad range of conditions would move to maximum entropy, subject to its thermodynamic constraints, within a few collision times. It is shown that the spatial drift caused by vacuum radiation, as predicted by the above SED calculation, can be macroscopic in some circumstances, and an experimental test of this effect is proposed. Consistency of the above results with quantum mechanics is discussed, and it is shown that the diffusion constant associated with the above Brownian drift is the same as that used in stochastic interpretations of the Schrödinger equation.
It has been shown by several authors that in operations involving information a quantity appears which is the negative of the quantity usually defined as entropy in similar situations. This quantity ℜ = −KI has been termed “negentropy” and it has been shown that the negentropy of information and the physical entropy S are mirrorlike representations of the same train of events. In physical terminology the energy is degraded by an increase in entropy due to an increased randomness in the positions or velocities of components, wave functions, complexions in phase space; in informational terminology some information about the same components has been lost or the negentropy has been decreased. In equilibrium the system has for a given energy content maximum randomness. One consequence of this dual aspect was the idea to apply the methods of statistical mechanics to problems of communication and Brillouin showed that Fermi-Dirac statistics or generalized Fermi statistics are applicable for example to a transmission of signals such as telegrams.
A gas relaxing into equilibrium is often taken to be a process in which a system moves from an “improbable” to a “probable” state. Given that the thermodynamic entropy increases during such a process, it is natural to conjecture that the thermodynamic entropy is a measure of the probability of a macrostate. For nonideal classical gases, however, I claim that there is no clear sense in which the thermodynamic entropy of a macrostate measures its probability. We must therefore reject the idea that thermodynamic entropy and probability are connected in a deep and general way.
Entropy is proposed as a concept which in its broader scope can contribute to the study of the General Information System. This paper attempts to identify a few fundamental subconcepts and lemmas which will serve to facilitate further study of system order. The paper discusses: partitioning order into logical and arbitrary kinds; the relationship of order to pattern; and suggested approaches to evaluating and improving the General Information System.
Markov models of evolution describe changes in the probability distribution of the trait values a population might exhibit. In consequence, they also describe how entropy and conditional entropy values evolve, and how the mutual information that characterizes the relation between an earlier and a later moment in a lineage’s history depends on how much time separates them. These models therefore provide an interesting perspective on questions that usually are considered in the foundations of physics—when and why does entropy increase and at what rates do changes in entropy take place? They also throw light on an important epistemological question: are there limits on what your observations of the present can tell you about the evolutionary past?
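The setting described above can be sketched in a few lines: a distribution over trait values is pushed forward by a transition matrix, and its entropy is tracked over time. The two-state chain below is an illustrative toy, not a model from the paper; for a doubly stochastic matrix like this one, entropy is non-decreasing as the chain runs.

```python
from math import log2

def entropy(p):
    # Shannon entropy of a distribution, in bits
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def step(p, T):
    # push the distribution forward one step:
    # T[i][j] = probability of moving from state i to state j
    return [sum(p[i] * T[i][j] for i in range(len(p))) for j in range(len(T[0]))]

T = [[0.9, 0.1],
     [0.1, 0.9]]   # doubly stochastic, so entropy cannot decrease
p = [1.0, 0.0]     # start in a known state: zero entropy
for t in range(5):
    print(t, round(entropy(p), 4))
    p = step(p, T)
```

As the distribution approaches the uniform stationary distribution, the entropy approaches its maximum of 1 bit, and the mutual information between the initial and current state decays, which is one way of picturing the epistemological limit mentioned in the abstract.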