Results for 'entropy'

999 found
  1. Carnap on Entropy.Abner Shimony - 1975 - In Jaakko Hintikka (ed.), Rudolf Carnap, Logical Empiricist: Materials and Perspectives. D. Reidel Pub. Co. pp. 381.
  2. Entropy - A Guide for the Perplexed.Roman Frigg & Charlotte Werndl - 2011 - In Claus Beisbart & Stephan Hartmann (eds.), Probabilities in Physics. Oxford University Press. pp. 115-142.
    Entropy is ubiquitous in physics, and it plays important roles in numerous other disciplines ranging from logic and statistics to biology and economics. However, a closer look reveals a complicated picture: entropy is defined differently in different contexts, and even within the same domain different notions of entropy are at work. Some of these are defined in terms of probabilities, others are not. The aim of this chapter is to arrive at an understanding of some of the (...)
    21 citations
  3. Gravity, Entropy, and Cosmology: in Search of Clarity.David Wallace - 2010 - British Journal for the Philosophy of Science 61 (3):513-540.
    I discuss the statistical mechanics of gravitating systems and in particular its cosmological implications, and argue that many conventional views on this subject in the foundations of statistical mechanics embody significant confusion; I attempt to provide a clearer and more accurate account. In particular, I observe that (i) the role of gravity in entropy calculations must be distinguished from the entropy of gravity, that (ii) although gravitational collapse is entropy-increasing, this is not usually because the collapsing matter (...)
    22 citations
  4. Logical Entropy: Introduction to Classical and Quantum Logical Information theory.David Ellerman - 2018 - Entropy 20 (9):679.
    Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this (...)
    4 citations
  5. Entropy in evolution.John Collier - 1986 - Biology and Philosophy 1 (1):5-24.
    Daniel R. Brooks and E. O. Wiley have proposed a theory of evolution in which fitness is merely a rate determining factor. Evolution is driven by non-equilibrium processes which increase the entropy and information content of species together. Evolution can occur without environmental selection, since increased complexity and organization result from the likely capture at the species level of random variations produced at the chemical level. Speciation can occur as the result of variation within the species which decreases the (...)
    24 citations
  6. Economics, Entropy and the Long Term Future: Conceptual Foundations and the Perspective of the Economics of Survival.Charles C. Mueller - 2001 - Environmental Values 10 (3):361-384.
    The present paper is a survey of the economics of survival, a branch of ecological economics that stresses the preservation of the opportunities of future generations over an extended time horizon. It outlines the main analytical foundation of the branch – in which the concept of entropy is a major building block – and its analysis of the interaction between the economic system and the environment. Regarding its outlook on the future, we see that the founders of the branch (...)
  7. Entropy and the Direction of Time.Jerzy Gołosz - 2021 - Entropy 23 (4):388.
    The paper tries to demonstrate that the process of the increase of entropy does not explain the asymmetry of time itself because it is unable to account for its fundamental asymmetries, that is, the asymmetry of traces (we have traces of the past and no traces of the future), the asymmetry of causation (we have an impact on future events with no possibility of having an impact on the past), and the asymmetry between the fixed past and the open (...)
    1 citation
  8. On Entropy of Quantum Compound Systems.Noboru Watanabe - 2015 - Foundations of Physics 45 (10):1311-1329.
    We review some notions for general quantum entropies. The entropy of compound systems is discussed, and a numerical computation for quantum dynamical systems is carried out for the noisy optical channel.
  9. Horizon Entropy.Ted Jacobson & Renaud Parentani - 2003 - Foundations of Physics 33 (2):323-348.
    Although the laws of thermodynamics are well established for black hole horizons, much less has been said in the literature to support the extension of these laws to more general settings such as an asymptotic de Sitter horizon or a Rindler horizon (the event horizon of an asymptotic uniformly accelerated observer). In the present paper we review the results that have been previously established and argue that the laws of black hole thermodynamics, as well as their underlying statistical mechanical content, (...)
    10 citations
  10. Entropy: Into the Greenhouse World.Jeremy Rifkin & Ted Howard - 1989 - Bantam.
    Entropy has been completely revised and updated for the first time, with a new subtitle reflecting its expanded focus on the greenhouse effect – the largest crisis ever to face mankind.
    5 citations
  11. From Randomness and Entropy to the Arrow of Time.Lena Zuchowski - 2024 - Cambridge University Press.
    The Element reconstructs, analyses and compares different derivational routes to a grounding of the Arrow of Time in entropy. It also evaluates the link between entropy and visible disorder, and the related claim of an alignment of the Arrow of Time with a development from order to visible disorder. The Element identifies three different entropy-groundings for the Arrow of Time: (i) the Empirical Arrow of Time, (ii) the Universal Statistical Arrow of Time, and (iii) the Local Statistical (...)
  12. On Entropy Production in the Madelung Fluid and the Role of Bohm’s Potential in Classical Diffusion.Eyal Heifetz, Roumen Tsekov, Eliahu Cohen & Zohar Nussinov - 2016 - Foundations of Physics 46 (7):815-824.
    The Madelung equations map the non-relativistic time-dependent Schrödinger equation into hydrodynamic equations of a virtual fluid. While the von Neumann entropy remains constant, we demonstrate that an increase of the Shannon entropy, associated with this Madelung fluid, is proportional to the expectation value of its velocity divergence. Hence, the Shannon entropy may grow due to an expansion of the Madelung fluid. These effects result from the interference between solutions of the Schrödinger equation. Growth of the Shannon (...) due to expansion is common in diffusive processes. However, in the latter the process is irreversible while the processes in the Madelung fluid are always reversible. The relations between interference, compressibility and variation of the Shannon entropy are then examined in several simple examples. Furthermore, we demonstrate that for classical diffusive processes, the “force” accelerating diffusion has the form of the positive gradient of the quantum Bohm potential. Expressing then the diffusion coefficient in terms of the Planck constant reveals the lower bound given by the Heisenberg uncertainty principle in terms of the product between the gas mean free path and the Brownian momentum.
    2 citations
  13. Entropy, Free Energy, and Symbolization: Free Association at the Intersection of Psychoanalysis and Neuroscience.Thomas Rabeyron & Claudie Massicotte - 2020 - Frontiers in Psychology 11.
    Both a method of therapy and an exploration of psychic reality, free association is a fundamental element of psychoanalytical practices that refers to the way a patient is asked to describe what comes spontaneously to mind in the therapeutic setting. This paper examines the role of free association from the point of view of psychoanalysis and neuroscience in order to improve our understanding of therapeutic effects induced by psychoanalytic therapies and psychoanalysis. In this regard, we first propose a global overview (...)
    2 citations
  14. Entropy and Insufficient Reason: A Note on the Judy Benjamin Problem.Anubav Vasudevan - 2020 - British Journal for the Philosophy of Science 71 (3):1113-1141.
    One well-known objection to the principle of maximum entropy is the so-called Judy Benjamin problem, first introduced by van Fraassen. The problem turns on the apparently puzzling fact that, on the basis of information relating an event’s conditional probability, the maximum entropy distribution will almost always assign to the event conditionalized on a probability strictly less than that assigned to it by the uniform distribution. In this article, I present an analysis of the Judy Benjamin problem that can (...)
    3 citations
  15. Entropy in Relation to Incomplete Knowledge.Michael J. Zenzen - 1985 - Cambridge University Press.
    This book is about an important issue which has arisen within two of the branches of physical science - namely thermodynamics and statistical mechanics - where the notion of entropy plays an essential role. A number of scientists and information theorists have maintained that entropy is a subjective concept and is a measure of human ignorance. Such a view, if it is valid, would create some profound philosophical problems and would tend to undermine the objectivity of the scientific (...)
    4 citations
  16. Entropy and Art: An Essay on Disorder and Order.Rudolf Arnheim - 1971 - Berkeley: University of California Press.
    This essay is an attempt to reconcile the disturbing contradiction between the striving for order in nature and in man and the principle of entropy implicit in the second law of thermodynamics - between the tendency toward greater organization and the general trend of the material universe toward death and disorder.
    4 citations
  17. The entropy theory of counterfactuals.Douglas N. Kutach - 2002 - Philosophy of Science 69 (1):82-104.
    I assess the thesis that counterfactual asymmetries are explained by an asymmetry of the global entropy at the temporal boundaries of the universe, by developing a method of evaluating counterfactuals that includes, as a background assumption, the low entropy of the early universe. The resulting theory attempts to vindicate the common practice of holding the past mostly fixed under counterfactual supposition while at the same time allowing the counterfactual's antecedent to obtain by a natural physical development. Although the (...)
    29 citations
  18. Entropy and information in evolving biological systems.Daniel R. Brooks, John Collier, Brian A. Maurer, Jonathan D. H. Smith & E. O. Wiley - 1989 - Biology and Philosophy 4 (4):407-432.
    Integrating concepts of maintenance and of origins is essential to explaining biological diversity. The unified theory of evolution attempts to find a common theme linking production rules inherent in biological systems, explaining the origin of biological order as a manifestation of the flow of energy and the flow of information on various spatial and temporal scales, with the recognition that natural selection is an evolutionarily relevant process. Biological systems persist in space and time by transforming energy from one state (...)
    9 citations
  19. Entropy and information: Suggestions for common language.Jeffrey S. Wicken - 1987 - Philosophy of Science 54 (2):176-193.
    Entropy and information are both emerging as currencies of interdisciplinary dialogue, most recently in evolutionary theory. If this dialogue is to be fruitful, there must be general agreement about the meaning of these terms. That this is not presently the case owes principally to the supposition of many information theorists that information theory has succeeded in generalizing the entropy concept. The present paper will consider the merits of the generalization thesis, and make some suggestions for restricting both (...) and information to specific arenas of discourse.
    11 citations
  20. Entropies and the Anthropocene crisis.Maël Montévil - 2021 - AI and Society:1-21.
    The Anthropocene crisis is frequently described as the rarefaction of resources or resources per capita. However, both energy and minerals correspond to fundamentally conserved quantities from the perspective of physics. A specific concept is required to understand the rarefaction of available resources. This concept, entropy, pertains to energy and matter configurations and not just to their sheer amount. However, the physics concept of entropy is insufficient to understand biological and social organizations. Biological phenomena display both historicity and systemic (...)
    4 citations
  21. Entropy and uncertainty.Teddy Seidenfeld - 1986 - Philosophy of Science 53 (4):467-491.
    This essay is, primarily, a discussion of four results about the principle of maximizing entropy (MAXENT) and its connections with Bayesian theory. Result 1 provides a restricted equivalence between the two: where the Bayesian model for MAXENT inference uses an "a priori" probability that is uniform, and where all MAXENT constraints are limited to 0-1 expectations for simple indicator-variables. The other three results report on an inability to extend the equivalence beyond these specialized constraints. Result 2 established a sensitivity (...)
    54 citations
  22. Entropy and Art: An Essay on Disorder and Order.Rudolf Arnheim - 1973 - Journal of Aesthetics and Art Criticism 32 (2):280-281.
    This essay is an attempt to reconcile the disturbing contradiction between the striving for order in nature and in man and the principle of entropy implicit in the second law of thermodynamics - between the tendency toward greater organization and the general trend of the material universe toward death and disorder.
    4 citations
  23. Entropy and Chemical Substance.Robin Findlay Hendry - 2010 - Philosophy of Science 77 (5):921-932.
    In this essay I critically examine the role of entropy of mixing in articulating a macroscopic criterion for the sameness and difference of chemical substances. Consider three cases of mixing in which entropy change occurs: isotopic variants, spin isomers, and populations of atoms in different orthogonal quantum states. Using these cases I argue that entropy of mixing tracks differences between physical states, differences that may or may not correspond to a difference of substance. It does not provide (...)
    9 citations
  24. Is entropy relevant to the asymmetry between retrodiction and prediction?Martin Barrett & Elliott Sober - 1992 - British Journal for the Philosophy of Science 43 (2):141-160.
    The idea that the changing entropy of a system is relevant to explaining why we know more about the system's past than about its future has been criticized on several fronts. This paper assesses the criticisms and clarifies the epistemology of the inference problem. It deploys a Markov process model to investigate the relationship between entropy and temporally asymmetric inference.
    9 citations
  25. Characterizing Entropy in Statistical Physics and in Quantum Information Theory.Bernhard Baumgartner - 2014 - Foundations of Physics 44 (10):1107-1123.
    A new axiomatic characterization with a minimum of conditions for entropy as a function on the set of states in quantum mechanics is presented. Traditionally unspoken assumptions are unveiled and replaced by proven consequences of the axioms. First the Boltzmann–Planck formula is derived. Building on this formula, using the Law of Large Numbers—a basic theorem of probability theory—the von Neumann formula is deduced. Axioms used in older theories on the foundations are now derived facts.
  26. IN-cross Entropy Based MAGDM Strategy under Interval Neutrosophic Set Environment.Shyamal Dalapati, Surapati Pramanik, Shariful Alam, Florentin Smarandache & Tapan Kumar Roy - 2017 - Neutrosophic Sets and Systems 18:43-57.
    Cross entropy measure is one of the best ways to calculate the divergence of one variable from a prior one. We define a new cross entropy measure under an interval neutrosophic set environment.
    5 citations
  27. Psychological entropy: A framework for understanding uncertainty-related anxiety.Jacob B. Hirsh, Raymond A. Mar & Jordan B. Peterson - 2012 - Psychological Review 119 (2):304-320.
    37 citations
  28. Maximum Entropy Inference with Quantified Knowledge.Owen Barnett & Jeff Paris - 2008 - Logic Journal of the IGPL 16 (1):85-98.
    We investigate uncertain reasoning with quantified sentences of the predicate calculus treated as the limiting case of maximum entropy inference applied to finite domains.
    9 citations
  29. Brain Entropy During Aging Through a Free Energy Principle Approach.Filippo Cieri, Xiaowei Zhuang, Jessica Z. K. Caldwell & Dietmar Cordes - 2021 - Frontiers in Human Neuroscience 15.
    Neural complexity and brain entropy (BEN) have gained greater interest in recent years. The dynamics of neural signals and their relations with information processing continue to be investigated through different measures in a variety of noteworthy studies. The BEN of spontaneous neural activity decreases during states of reduced consciousness. This evidence has been shown in primary consciousness states, such as psychedelic states, under the name of “the entropic brain hypothesis.” In this manuscript we propose an extension of this hypothesis to (...)
    2 citations
  30. Entropy of Polysemantic Words for the Same Part of Speech.Mihaela Colhon, Florentin Smarandache & Dan Valeriu Voinea - unknown
    In this paper, a special type of polysemantic words, that is, words with multiple meanings for the same part of speech, are analyzed under the name of neutrosophic words. These words represent the most difficult cases for disambiguation algorithms as they represent the most ambiguous natural language utterances. To approximate their meanings, we developed a semantic representation framework made by means of concepts from neutrosophic theory and entropy measure in which we incorporate sense related data. We show (...)
  31. Economics, Entropy and the Long Term Future: Conceptual Foundations and the Perspective of the Economics of Survival.Charles C. Mueller - 2001 - Environmental Values 10 (3):361-384.
    The present paper is a survey of the economics of survival, a branch of ecological economics that stresses the preservation of the opportunities of future generations over an extended time horizon. It outlines the main analytical foundation of the branch - in which the concept of entropy is a major building block -, and its analysis of the interaction between the economic system and the environment. Regarding its outlook of the future, we see that the founders of the branch (...)
    1 citation
  32. The Entropy Law and the Economic Process.L. A. Boland - 1976 - Synthese 33 (2):371-391.
  33. Entropy of eye movement during rapid automatized naming.Hongan Wang, Fulin Liu, Yuhong Dong & Dongchuan Yu - 2022 - Frontiers in Human Neuroscience 16.
    Numerous studies have focused on the understanding of rapid automatized naming (RAN), which can be applied to predict reading abilities and developmental dyslexia in children. Eye tracking, which characterizes the essential ocular activities, may be able to reveal the visual and cognitive features of RAN. However, traditional measures of eye movements ignore many dynamical details about the visual and cognitive processing of RAN, and are usually associated with the duration of time spent on some particular areas of interest, fixation counts, (...)
  34. Understanding entropy.Peter G. Nelson - 2021 - Foundations of Chemistry 24 (1):3-13.
    A new way of understanding entropy as a macroscopic property is presented. This is based on the fact that heat flows from a hot body to a cold one even when the hot one is smaller and has less energy. A quantity that determines the direction of flow is shown to be the increment of heat gained divided by the absolute temperature. The same quantity is shown to determine the direction of other processes taking place in isolated systems provided (...)
    1 citation
  35. Striving, entropy, and meaning.J. S. Russell - 2020 - Journal of the Philosophy of Sport 47 (3):419-437.
    This paper argues that striving is a cardinal virtue in sport and life. It is an overlooked virtue that is an important component of human happiness and a source of a sense of dignity. The human ps...
    6 citations
  36. Entropy and art.Rudolf Arnheim - 1971 - Berkeley: University of California Press.
    Views the process of artistic creation in light of the conflict between man's quest for order and increasing universal disorder.
    6 citations
  37. Maximum Entropy and Probability Kinematics Constrained by Conditionals.Stefan Lukits - 2015 - Entropy 17 (4):1690-1700.
    Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (pme) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey’s updating principle (jup) contradicts pme? Majerník shows that pme provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether pme also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates that (...)
    1 citation
  38. Entropy increase and information loss in Markov models of evolution.Elliott Sober & Mike Steel - 2011 - Biology and Philosophy 26 (2):223-250.
    Markov models of evolution describe changes in the probability distribution of the trait values a population might exhibit. In consequence, they also describe how entropy and conditional entropy values evolve, and how the mutual information that characterizes the relation between an earlier and a later moment in a lineage’s history depends on how much time separates them. These models therefore provide an interesting perspective on questions that usually are considered in the foundations of physics—when and why does (...) increase and at what rates do changes in entropy take place? They also throw light on an important epistemological question: are there limits on what your observations of the present can tell you about the evolutionary past?
    6 citations
  39. Entropy and evil.Robert John Russell - 1984 - Zygon 19 (4):449-468.
    This paper explores a possible relationship between entropy and evil in terms of metaphor. After presenting the various meanings of entropy in classical thermodynamics and statistical mechanics, and the Augustinian and Irenaean theodicies, several similarities and dissimilarities between entropy and evil are described. Underlying the concepts of evil and entropy is the assumption that time has a direction. After examining the scientific basis for this assumption, it is hypothesized that, if evil is real in nature, (...) is what one would expect to find at the level of physical processes, and conversely that, if entropy is coupled to a physical arrow of time, one could expect to find dissipative yet catalytic processes in history and religious experience.
    6 citations
  40. Atoms, entropy, quanta: Einstein's miraculous argument of 1905.John D. Norton - 2006 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 37 (1):71-100.
    In the sixth section of his light quantum paper of 1905, Einstein presented the miraculous argument, as I shall call it. Pointing out an analogy with ideal gases and dilute solutions, he showed that the macroscopic, thermodynamic properties of high frequency heat radiation carry a distinctive signature of finitely many, spatially localized, independent components and so inferred that it consists of quanta. I describe how Einstein’s other statistical papers of 1905 had already developed and exploited the idea that the ideal (...)
    21 citations
  41. Striving, entropy, and meaning.J. S. Russell - 2020 - Journal of the Philosophy of Sport 47 (3):419-437.
    This paper argues that striving is a cardinal virtue in sport and life. It is an overlooked virtue that is an important component of human happiness and a source of a sense of dignity. The human psychological capacity for striving emerged as a trait for addressing the entropic features of our existence, but it can be engaged and used for other purposes. Sport is one such example. Sport appears exceptional in being designed specifically to test and display our capacities (...)
    6 citations
  42. Entropy and Entropic Differences in the Work of Michel Serres.Lilian Kroth - 2024 - Theory, Culture and Society 41 (2):21-35.
    Michel Serres’s philosophy of entropy takes what he famously calls the ‘Northwest Passage’ between the sciences and the humanities. By contextualizing his approach to entropy and affirming the role of a philosophy of difference, this paper explores Serres’s approach by means of ‘entropic differences’. It claims that entropy – or rather, entropies – provide Serres with a paradigmatic case for critical translations between different domains of knowledge. From his early Hermès series, through to The Birth of Physics (...)
  43. Unbounded Entropy in Spacetimes with Positive Cosmological Constant.Raphael Bousso, Oliver DeWolfe & Robert C. Myers - 2003 - Foundations of Physics 33 (2):297-321.
    In theories of gravity with a positive cosmological constant, we consider product solutions with flux, of the form (A)dS_p × S^q. Most solutions are shown to be perturbatively unstable, including all uncharged dS_p × S^q spacetimes. For dimensions greater than four, the stable class includes universes whose entropy exceeds that of de Sitter space, in violation of the conjectured “N-bound.” Hence, if quantum gravity theories with finite-dimensional Hilbert space exist, the specification of a positive cosmological constant (...)
    3 citations
  44. The Entropy-Limit (Conjecture) for Σ2-Premisses.Jürgen Landes - 2020 - Studia Logica 109 (2):1-20.
    The application of the maximum entropy principle to determine probabilities on finite domains is well-understood. Its application to infinite domains still lacks a well-studied comprehensive approach. There are two different strategies for applying the maximum entropy principle on first-order predicate languages: applying it to finite sublanguages and taking a limit; comparing finite entropies of probability functions defined on the language as a whole. The entropy-limit conjecture roughly says that these two strategies result in the same probabilities. While (...)
    2 citations
  45. The Entropy-Limit (Conjecture) for Σ2-Premisses.Jürgen Landes - 2020 - Studia Logica 109 (2):423-442.
    The application of the maximum entropy principle to determine probabilities on finite domains is well-understood. Its application to infinite domains still lacks a well-studied comprehensive approach. There are two different strategies for applying the maximum entropy principle on first-order predicate languages: applying it to finite sublanguages and taking a limit; comparing finite entropies of probability functions defined on the language as a whole. The entropy-limit conjecture roughly says that these two strategies result in the same probabilities. While (...)
    2 citations
  46. Atoms, Entropy, Quanta: Einstein’s Miraculous Argument of 1905.John D. Norton - 2005 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 37 (1):71-100.
    In the sixth section of his light quantum paper of 1905, Einstein presented the miraculous argument, as I shall call it. Pointing out an analogy with ideal gases and dilute solutions, he showed that the macroscopic, thermodynamic properties of high-frequency heat radiation carry a distinctive signature of finitely many, spatially localized, independent components and so inferred that it consists of quanta. I describe how Einstein's other statistical papers of 1905 had already developed and exploited the idea that the ideal gas (...)
    20 citations
  47. Deformed Entropy and Information Relations for Composite and Noncomposite Systems.Vladimir N. Chernega, Olga V. Man’ko & Vladimir I. Man’ko - 2015 - Foundations of Physics 45 (7):783-798.
    The notion of conditional entropy is extended to noncomposite systems. The deformed entropic inequalities, which usually are associated with correlations of the subsystem degrees of freedom in bipartite systems, are found for the noncomposite systems. New entropic inequalities for quantum tomograms of qudit states including the single qudit states are obtained. The Araki–Lieb inequality is found for systems without subsystems.
  48. Entropy: a new world view.Jeremy Rifkin - 1980 - New York: Viking Press. Edited by Ted Howard.
  49. Entropy, Its Language, and Interpretation.Harvey S. Leff - 2007 - Foundations of Physics 37 (12):1744-1766.
    The language of entropy is examined for consistency with its mathematics and physics, and for its efficacy as a guide to what entropy means. Do common descriptors such as disorder, missing information, and multiplicity help or hinder understanding? Can the language of entropy be helpful in cases where entropy is not well defined? We argue in favor of the descriptor spreading, which entails space, time, and energy in a fundamental way. This includes spreading of energy spatially (...)
    4 citations
  50. The Entropy Law and the Economic Process.L. A. Boland - 1972 - Philosophy of Science 39 (3):423-424.
1 — 50 / 999