Results for 'Entropy'

1000+ found
  1. Entropy - A Guide for the Perplexed.Roman Frigg & Charlotte Werndl - 2011 - In Claus Beisbart & Stephan Hartmann (eds.), Probabilities in Physics. Oxford University Press. pp. 115-142.
    Entropy is ubiquitous in physics, and it plays important roles in numerous other disciplines ranging from logic and statistics to biology and economics. However, a closer look reveals a complicated picture: entropy is defined differently in different contexts, and even within the same domain different notions of entropy are at work. Some of these are defined in terms of probabilities, others are not. The aim of this chapter is to arrive at an understanding of some of the (...)
    17 citations
  2. Entropy in Evolution.John Collier - 1986 - Biology and Philosophy 1 (1):5-24.
    Daniel R. Brooks and E. O. Wiley have proposed a theory of evolution in which fitness is merely a rate determining factor. Evolution is driven by non-equilibrium processes which increase the entropy and information content of species together. Evolution can occur without environmental selection, since increased complexity and organization result from the likely capture at the species level of random variations produced at the chemical level. Speciation can occur as the result of variation within the species which decreases the (...)
    20 citations
  3. Logical Entropy: Introduction to Classical and Quantum Logical Information Theory.David Ellerman - 2018 - Entropy 20 (9):679.
    Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this (...)
    4 citations
  4. Entropy and Insufficient Reason: A Note on the Judy Benjamin Problem.Anubav Vasudevan - 2020 - British Journal for the Philosophy of Science 71 (3):1113-1141.
    One well-known objection to the principle of maximum entropy is the so-called Judy Benjamin problem, first introduced by van Fraassen. The problem turns on the apparently puzzling fact that, on the basis of information relating an event’s conditional probability, the maximum entropy distribution will almost always assign to the event conditionalized on a probability strictly less than that assigned to it by the uniform distribution. In this article, I present an analysis of the Judy Benjamin problem that can (...)
    3 citations
  5. Gravity, Entropy, and Cosmology: In Search of Clarity.David Wallace - 2010 - British Journal for the Philosophy of Science 61 (3):513-540.
    I discuss the statistical mechanics of gravitating systems and in particular its cosmological implications, and argue that many conventional views on this subject in the foundations of statistical mechanics embody significant confusion; I attempt to provide a clearer and more accurate account. In particular, I observe that (i) the role of gravity in entropy calculations must be distinguished from the entropy of gravity, that (ii) although gravitational collapse is entropy-increasing, this is not usually because the collapsing matter (...)
    16 citations
  6. Horizon Entropy.Ted Jacobson & Renaud Parentani - 2003 - Foundations of Physics 33 (2):323-348.
    Although the laws of thermodynamics are well established for black hole horizons, much less has been said in the literature to support the extension of these laws to more general settings such as an asymptotic de Sitter horizon or a Rindler horizon (the event horizon of an asymptotic uniformly accelerated observer). In the present paper we review the results that have been previously established and argue that the laws of black hole thermodynamics, as well as their underlying statistical mechanical content, (...)
    10 citations
  7. Entropy and Uncertainty.Teddy Seidenfeld - 1986 - Philosophy of Science 53 (4):467-491.
    This essay is, primarily, a discussion of four results about the principle of maximizing entropy (MAXENT) and its connections with Bayesian theory. Result 1 provides a restricted equivalence between the two: where the Bayesian model for MAXENT inference uses an "a priori" probability that is uniform, and where all MAXENT constraints are limited to 0-1 expectations for simple indicator-variables. The other three results report on an inability to extend the equivalence beyond these specialized constraints. Result 2 establishes a sensitivity (...)
    46 citations
  8. The Entropy Law and the Economic Process.L. A. Boland - 1976 - Synthese 33 (2):371-391.
  9. The Entropy Theory of Counterfactuals.Douglas Kutach - 2002 - Philosophy of Science 69 (1):82-104.
    I assess the thesis that counterfactual asymmetries are explained by an asymmetry of the global entropy at the temporal boundaries of the universe, by developing a method of evaluating counterfactuals that includes, as a background assumption, the low entropy of the early universe. The resulting theory attempts to vindicate the common practice of holding the past mostly fixed under counterfactual supposition while at the same time allowing the counterfactual's antecedent to obtain by a natural physical development. Although the (...)
    23 citations
  10. Evolution as Entropy: Toward a Unified Theory of Biology.D. R. Brooks - 1986 - University of Chicago Press.
    "By combining recent advances in the physical sciences with some of the novel ideas, techniques, and data of modern biology, this book attempts to achieve a new and different kind of evolutionary synthesis. I found it to be challenging, fascinating, infuriating, and provocative, but certainly not dull."--James H, Brown, University of New Mexico "This book is unquestionably mandatory reading not only for every living biologist but for generations of biologists to come."--Jack P. Hailman, Animal Behaviour , review of the first (...)
    90 citations
  11. Psychological Entropy: A Framework for Understanding Uncertainty-Related Anxiety.Jacob B. Hirsh, Raymond A. Mar & Jordan B. Peterson - 2012 - Psychological Review 119 (2):304-320.
    28 citations
  12. Entropy and Information in Evolving Biological Systems.Daniel R. Brooks, John Collier, Brian A. Maurer, Jonathan D. H. Smith & E. O. Wiley - 1989 - Biology and Philosophy 4 (4):407-432.
    Integrating concepts of maintenance and of origins is essential to explaining biological diversity. The unified theory of evolution attempts to find a common theme linking production rules inherent in biological systems, explaining the origin of biological order as a manifestation of the flow of energy and the flow of information on various spatial and temporal scales, with the recognition that natural selection is an evolutionarily relevant process. Biological systems persist in space and time by transforming energy from one state (...)
    9 citations
  13. Is Entropy Relevant to the Asymmetry Between Retrodiction and Prediction?Martin Barrett & Elliott Sober - 1992 - British Journal for the Philosophy of Science 43 (2):141-160.
    The idea that the changing entropy of a system is relevant to explaining why we know more about the system's past than about its future has been criticized on several fronts. This paper assesses the criticisms and clarifies the epistemology of the inference problem. It deploys a Markov process model to investigate the relationship between entropy and temporally asymmetric inference.
    7 citations
  14. Entropy and Art: An Essay on Disorder and Order.Rudolf Arnheim - 1971 - Berkeley: University of California Press.
    Views the process of artistic creation in light of the conflict between man's quest for order and increasing universal disorder.
    6 citations
  15. Entropy and Information: Suggestions for Common Language.Jeffrey S. Wicken - 1987 - Philosophy of Science 54 (2):176-193.
    Entropy and information are both emerging as currencies of interdisciplinary dialogue, most recently in evolutionary theory. If this dialogue is to be fruitful, there must be general agreement about the meaning of these terms. That this is not presently the case owes principally to the supposition of many information theorists that information theory has succeeded in generalizing the entropy concept. The present paper will consider the merits of the generalization thesis, and make some suggestions for restricting both (...) and information to specific arenas of discourse.
    6 citations
  16. Atoms, Entropy, Quanta: Einstein's Miraculous Argument of 1905.John D. Norton - 2006 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 37 (1):71-100.
    In the sixth section of his light quantum paper of 1905, Einstein presented the miraculous argument, as I shall call it. Pointing out an analogy with ideal gases and dilute solutions, he showed that the macroscopic, thermodynamic properties of high frequency heat radiation carry a distinctive signature of finitely many, spatially localized, independent components and so inferred that it consists of quanta. I describe how Einstein’s other statistical papers of 1905 had already developed and exploited the idea that the ideal (...)
    15 citations
  17. Maximum Entropy Inference with Quantified Knowledge.Owen Barnett & Jeff Paris - 2008 - Logic Journal of the IGPL 16 (1):85-98.
    We investigate uncertain reasoning with quantified sentences of the predicate calculus treated as the limiting case of maximum entropy inference applied to finite domains.
    8 citations
  18. Entropy and Art: An Essay on Disorder and Order.Rudolf Arnheim - 2010 - University of California Press.
    This essay is an attempt to reconcile the disturbing contradiction between the striving for order in nature and in man and the principle of entropy implicit in the second law of thermodynamics - between the tendency toward greater organization and the general trend of the material universe toward death and disorder.
    3 citations
  19. Entropy.G. J. Whitrow - 1967 - In Paul Edwards (ed.), The Encyclopedia of Philosophy. New York: Macmillan.
  20. The Entropy Law and the Economic Process.L. A. Boland - 1972 - Philosophy of Science 39 (3):423-424.
    34 citations
  21. Information vs. entropy vs. probability.Orly Shenker - 2019 - European Journal for Philosophy of Science 10 (1):1-25.
    Information, entropy, probability: these three terms are closely interconnected in the prevalent understanding of statistical mechanics, both when this field is taught to students at an introductory level and in advanced research into the field’s foundations. This paper examines the interconnection between these three notions in light of recent research in the foundations of statistical mechanics. It disentangles these concepts and highlights their differences, at the same time explaining why they came to be so closely linked in the literature. (...)
    3 citations
  22. Entropy: A New World View.Jeremy Rifkin - 1980 - Viking Press.
  23. Atoms, Entropy, Quanta: Einstein’s Miraculous Argument of 1905.John D. Norton - 2005 - Studies in History and Philosophy of Modern Physics 37 (1):71-100.
    In the sixth section of his light quantum paper of 1905, Einstein presented the miraculous argument, as I shall call it. Pointing out an analogy with ideal gases and dilute solutions, he showed that the macroscopic, thermodynamic properties of high-frequency heat radiation carry a distinctive signature of finitely many, spatially localized, independent components and so inferred that it consists of quanta. I describe how Einstein's other statistical papers of 1905 had already developed and exploited the idea that the ideal gas (...)
    14 citations
  24. Entropy and Art: An Essay on Disorder and Order.Rudolf Arnheim - 1973 - Journal of Aesthetics and Art Criticism 32 (2):280-281.
    This essay is an attempt to reconcile the disturbing contradiction between the striving for order in nature and in man and the principle of entropy implicit in the second law of thermodynamics - between the tendency toward greater organization and the general trend of the material universe toward death and disorder.
    2 citations
  25. Maximum Entropy and Probability Kinematics Constrained by Conditionals.Stefan Lukits - 2015 - Entropy 17 (4):1690-1700.
    Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (pme) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey’s updating principle (jup) contradicts pme? Majerník shows that pme provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether pme also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates that (...)
    1 citation
  26. Striving, Entropy, and Meaning.J. S. Russell - 2020 - Journal of the Philosophy of Sport 47 (3):419-437.
    This paper argues that striving is a cardinal virtue in sport and life. It is an overlooked virtue that is an important component of human happiness and a source of a sense of dignity. The human ps...
    1 citation
  27. Unbounded Entropy in Spacetimes with Positive Cosmological Constant.Raphael Bousso, Oliver DeWolfe & Robert C. Myers - 2003 - Foundations of Physics 33 (2):297-321.
    In theories of gravity with a positive cosmological constant, we consider product solutions with flux, of the form (A)dS_p × S^q. Most solutions are shown to be perturbatively unstable, including all uncharged dS_p × S^q spacetimes. For dimensions greater than four, the stable class includes universes whose entropy exceeds that of de Sitter space, in violation of the conjectured “N-bound.” Hence, if quantum gravity theories with finite-dimensional Hilbert space exist, the specification of a positive cosmological constant (...)
    3 citations
  28. Entropy Increase and Information Loss in Markov Models of Evolution.Elliott Sober & Mike Steel - 2011 - Biology and Philosophy 26 (2):223-250.
    Markov models of evolution describe changes in the probability distribution of the trait values a population might exhibit. In consequence, they also describe how entropy and conditional entropy values evolve, and how the mutual information that characterizes the relation between an earlier and a later moment in a lineage’s history depends on how much time separates them. These models therefore provide an interesting perspective on questions that usually are considered in the foundations of physics—when and why does (...) increase and at what rates do changes in entropy take place? They also throw light on an important epistemological question: are there limits on what your observations of the present can tell you about the evolutionary past?
    4 citations
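As a rough illustration of the kind of model this abstract describes (a sketch of mine, not code from the paper, and the transition matrix is an arbitrary choice): iterating a symmetric two-state Markov chain from a concentrated starting distribution, the Shannon entropy of the distribution rises step by step toward its equilibrium value of 1 bit.

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def step(p, T):
    """One Markov step: p'_j = sum_i p_i * T[i][j]."""
    return [sum(p[i] * T[i][j] for i in range(len(p)))
            for j in range(len(T[0]))]

# Symmetric (doubly stochastic) two-state chain, started far from equilibrium.
T = [[0.9, 0.1],
     [0.1, 0.9]]
p = [1.0, 0.0]

entropies = []
for _ in range(30):
    entropies.append(shannon_entropy(p))
    p = step(p, T)
# entropies[0] is 0; each step moves p toward (0.5, 0.5),
# so the entropy increases toward log2(2) = 1 bit.
```

For this doubly stochastic chain the uniform distribution is stationary, so entropy increase is guaranteed; chains whose stationary distribution is non-uniform need not increase Shannon entropy, which is part of what makes the paper's question non-trivial.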
  29. Eschatology and Entropy: An Alternative to Robert John Russell's Proposal.Klaus Nürnberger - 2012 - Zygon 47 (4):970-996.
    Traditional eschatology clashes with the theory of entropy. Trying to bridge the gap, Robert John Russell assumes that theology and science are based on contradictory, yet equally valid, metaphysical assumptions, each one capable of questioning and impacting the other. The author doubts that Russell's proposal will convince empirically oriented scientists and attempts to provide a viable alternative. Historical‐critical analysis suggests that biblical future expectations were redemptive responses to changing human needs. Apocalyptic visions were occasioned by heavy suffering in postexilic (...)
    4 citations
  30. Striving, Entropy, and Meaning.J. S. Russell - forthcoming - Journal of the Philosophy of Sport:1-19.
  31. An Introduction to Logical Entropy and its Relation to Shannon Entropy.David Ellerman - 2013 - International Journal of Semantic Computing 7 (2):121-145.
    The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based on the Boolean (...)
    5 citations
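As an illustrative sketch of the definition quoted in this abstract (the function name and example partition are mine, not the paper's): for a partition of an n-element set, the logical entropy is the number of distinctions, i.e. ordered pairs of elements in distinct blocks, divided by n², which equals 1 − Σ_B (|B|/n)².

```python
from fractions import Fraction

def logical_entropy(partition, n):
    """Logical entropy h(pi) = |dit(pi)| / n^2: the fraction of ordered
    pairs (i, j) whose members lie in distinct blocks of the partition.
    Computed via the equivalent form 1 - sum over blocks of (|B|/n)^2."""
    return 1 - sum(Fraction(len(block), n) ** 2 for block in partition)

# Partition of {1, 2, 3} into blocks {1, 2} and {3}:
pi = [{1, 2}, {3}]
# Distinctions are (1,3), (3,1), (2,3), (3,2): 4 of the 9 ordered pairs.
h = logical_entropy(pi, 3)  # Fraction(4, 9)
```

The discrete partition (all singletons) maximizes this measure and the indiscrete partition (one block) gives 0, mirroring how Shannon entropy behaves on the corresponding block-size distribution.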
  32. Entropy in Relation to Incomplete Knowledge.Michael J. Zenzen - 1985
  33. Entropy, Information and Evolution: New Perspectives on Physical and Biological Evolution.Bruce H. Weber, David J. Depew, James D. Smith & C. Dyke - 1990 - Behavior and Philosophy 18 (2):79-84.
     
    15 citations
  34. Entropy and Vacuum Radiation.Jean E. Burns - 1998 - Foundations of Physics 28 (7):1191-1207.
    It is shown that entropy increase in thermodynamic systems can plausibly be accounted for by the random action of vacuum radiation. A recent calculation by Rueda using stochastic electrodynamics (SED) shows that vacuum radiation causes a particle to undergo a rapid Brownian motion about its average dynamical trajectory. It is shown that the magnitude of spatial drift calculated by Rueda can also be predicted by assuming that the average magnitudes of random shifts in position and momentum of a particle (...)
    2 citations
  35. Entropy in Relation to Incomplete Knowledge.K. G. Denbigh, J. S. Denbigh & H. D. Zeh - 1991 - British Journal for the Philosophy of Science 42 (1):111-144.
     
    16 citations
  36. Entropy, Its Language, and Interpretation.Harvey S. Leff - 2007 - Foundations of Physics 37 (12):1744-1766.
    The language of entropy is examined for consistency with its mathematics and physics, and for its efficacy as a guide to what entropy means. Do common descriptors such as disorder, missing information, and multiplicity help or hinder understanding? Can the language of entropy be helpful in cases where entropy is not well defined? We argue in favor of the descriptor spreading, which entails space, time, and energy in a fundamental way. This includes spreading of energy spatially (...)
    3 citations
  37. Maximum Entropy Inference as a Special Case of Conditionalization.Brian Skyrms - 1985 - Synthese 63 (1):55-74.
  38. Einstein, Entropy, and Anomalies.Daniel Sirtes & Eric Oberheim - 2006 - AIP Conference Proceedings 861:1147-1154.
    This paper strengthens and defends the pluralistic implications of Einstein's successful, quantitative predictions of Brownian motion for a philosophical dispute about the nature of scientific advance that began between two prominent philosophers of science in the second half of the twentieth century (Thomas Kuhn and Paul Feyerabend). Kuhn promoted a monistic phase-model of scientific advance, according to which a paradigm driven `normal science' gives rise to its own anomalies, which then lead to a crisis and eventually a scientific revolution. Feyerabend (...)
    7 citations
  39. Fuzzy Entropy for Pythagorean Fuzzy Sets with Application to Multicriterion Decision Making.Miin-Shen Yang & Zahid Hussain - 2018 - Complexity 2018:1-14.
    4 citations
  40. Can the Maximum Entropy Principle Be Explained as a Consistency Requirement?Jos Uffink - 1995 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 26 (3):223-261.
    The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, forms an extension of the classical principle of insufficient reason. It has been further generalized, both in mathematical formulation and in intended scope, into the principle of maximum relative entropy or of minimum information. It has been claimed that these principles are singled out as unique methods of statistical inference that (...)
    23 citations
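To make the principle concrete, here is a minimal sketch (my example, not from the paper) of a Jaynes-style maximum entropy assignment: a die whose only known constraint is a mean of 4.5. The entropy-maximizing distribution has Gibbs form p_i ∝ exp(λ·i), and λ can be found by bisection because the constrained mean is increasing in λ.

```python
import math

def maxent_die(target_mean, faces=(1, 2, 3, 4, 5, 6)):
    """Maximum-entropy distribution on the faces subject to a fixed mean.
    The maximizer has Gibbs form p_i proportional to exp(lam * x_i);
    lam is located by bisection, using that the mean increases with lam."""
    def mean(lam):
        w = [math.exp(lam * x) for x in faces]
        z = sum(w)
        return sum(x * wi for x, wi in zip(faces, w)) / z

    lo, hi = -50.0, 50.0
    for _ in range(200):          # bisect the bracket down to machine precision
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * x) for x in faces]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent_die(4.5)
# With the unconstrained mean of 3.5 the answer is uniform (1/6 each);
# raising the mean to 4.5 tilts probability monotonically toward high faces.
```

This is the sense in which maximum entropy extends the principle of insufficient reason mentioned in the abstract: with no constraint it returns the uniform distribution, and each added expectation constraint deforms it as little as the constraint allows.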
  41. Physical Entropy and the Senses.Kenneth H. Norwich - 2005 - Acta Biotheoretica 53 (3):167-180.
    With reference to two specific modalities of sensation, the taste of saltiness of chloride salts, and the loudness of steady tones, it is shown that the laws of sensation (logarithmic and power laws) are expressions of the entropy per mole of the stimulus. That is, the laws of sensation are linear functions of molar entropy. In partial verification of this hypothesis, we are able to derive an approximate value for the gas constant, a fundamental physical constant, directly from (...)
    3 citations
  42. Entropy and Chemical Substance.Robin Findlay Hendry - 2010 - Philosophy of Science 77 (5):921-932.
  43. Entropy and Counterfactual Asymmetry.Douglas Kutach - 2001 - Dissertation, Rutgers the State University of New Jersey - New Brunswick
    I assess the thesis that counterfactual asymmetries are explained by an asymmetry of the global entropy at the temporal boundaries of the universe by developing a new semantic element for counterfactuals called objective assertibility and a method of evaluating counterfactuals that constrains consideration to possibilities where the early universe has low entropy. The resulting theory vindicates the common practice of holding the past mostly fixed under counterfactual supposition while at the same time allowing the counterfactual's antecedent to obtain (...)
    3 citations
  44. Entropy and Evil.Robert John Russell - 1984 - Zygon 19 (4):449-468.
  45. How Does the Entropy/Information Bound Work?Jacob D. Bekenstein - 2005 - Foundations of Physics 35 (11):1805-1823.
    According to the universal entropy bound, the entropy of a complete weakly self-gravitating physical system can be bounded exclusively in terms of its circumscribing radius and total gravitating energy. The bound’s correctness is supported by explicit statistical calculations of entropy, gedanken experiments involving the generalized second law, and Bousso’s covariant holographic bound. On the other hand, it is not always obvious in a particular example how the system avoids having too many states for given energy, and hence (...)
    4 citations
  46. Entropy in Relation to Incomplete Knowledge. K. G. Denbigh, J. S. Denbigh.Michael J. Zenzen - 1986 - Philosophy of Science 53 (3):451-452.
  47. Sample Entropy, Univariate, and Multivariate Multi-Scale Entropy in Comparison with Classical Postural Sway Parameters in Young Healthy Adults.Jiann-Shing Shieh, Clint Hansen, Qin Wei, Paul Fourcade, Brice Isableu & Lina Majed - 2017 - Frontiers in Human Neuroscience 11.
  48. Entropy and Nonsense.Harold Morowitz - 1986 - Biology and Philosophy 1 (4):473-476.
  49. Boltzmann Entropy for Dense Fluids Not in Local Equilibrium.Sheldon Goldstein - manuscript
    Using computer simulations, we investigate the time evolution of the (Boltzmann) entropy of a dense fluid not in local equilibrium. The macrovariables M describing the system are the (empirical) particle density f = {f(x,v)} and the total energy E. We find that S(f_t, E) is monotone increasing in time even when its kinetic part is decreasing. We argue that for isolated Hamiltonian systems monotonicity of S(M_t) = S(M_{X_t}) should hold generally for “typical” (the overwhelming majority of) initial microstates (phase (...)
    1 citation
  50. Maximum Power and Maximum Entropy Production: Finalities in Nature.Stanley Salthe - 2010 - Cosmos and History 6 (1):114-121.
    I begin with the definition of power, and find that it is finalistic inasmuch as work directs energy dissipation in the interests of some system. The maximum power principle of Lotka and Odum implies an optimal energy efficiency for any work; optima are also finalities. I advance a statement of the maximum entropy production principle, suggesting that most work of dissipative structures is carried out at rates entailing energy flows faster than those that would associate with maximum power. This (...)
    5 citations
Showing results 1–50 of 1000+