
30 entries found, ordered by date added.
  1. added 2019-05-20
    Bertrand’s Paradox and the Maximum Entropy Principle. Nicholas Shackel & Darrell P. Rowbottom - forthcoming - Philosophy and Phenomenological Research.
    An important suggestion of objective Bayesians is that the maximum entropy principle can replace a principle which is known to get into paradoxical difficulties: the principle of indifference. No one has previously determined whether the maximum entropy principle is better able to solve Bertrand’s chord paradox than the principle of indifference. In this paper I show that it is not. Additionally, the course of the analysis brings to light a new paradox, a revenge paradox of the chords, that is unique (...)
  2. added 2019-05-14
    Objective Bayesian Nets. Jon Williamson - manuscript.
    I present a formalism that combines two methodologies: objective Bayesianism and Bayesian nets. According to objective Bayesianism, an agent’s degrees of belief (i) ought to satisfy the axioms of probability, (ii) ought to satisfy constraints imposed by background knowledge, and (iii) should otherwise be as non-committal as possible (i.e. have maximum entropy). Bayesian nets offer an efficient way of representing and updating probability functions. An objective Bayesian net is a Bayesian net representation of the maximum entropy probability function.
    3 citations.
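The maximum entropy recipe that objective Bayesian nets build on can be illustrated numerically. The sketch below is my own toy code, not from the paper; the function name `maxent_mean` and the bisection bounds are illustrative assumptions. With only normalization as a constraint, the maximum entropy distribution is uniform; adding a mean constraint tilts it into an exponential family p_i ∝ exp(λ·x_i), whose Lagrange multiplier λ can be found by bisection on the resulting mean.

```python
import math

def maxent_mean(values, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Maximum entropy distribution over `values` with E[X] = target_mean.

    The solution has the exponential form p_i ∝ exp(lam * x_i); since the
    mean is increasing in lam, we find lam by bisection.
    """
    def mean_for(lam):
        weights = [math.exp(lam * v) for v in values]
        z = sum(weights)
        return sum(w * v for w, v in zip(weights, values)) / z

    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    weights = [math.exp(lam * v) for v in values]
    z = sum(weights)
    return [w / z for w in weights]

# A die example: constraining the mean to the fair value 3.5 recovers the
# uniform distribution, while a mean of 4.5 shifts weight to high faces.
p = maxent_mean([1, 2, 3, 4, 5, 6], 4.5)
```

With the mean constrained to 4.5 the resulting probabilities increase monotonically in the face value, which is the qualitative signature of the exponential-family solution.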
  3. added 2019-05-14
    How to Exploit Parametric Uniformity for Maximum Entropy Reasoning in a Relational Probabilistic Logic. Marc Finthammer & Christoph Beierle - 2012 - In Luis Fariñas del Cerro, Andreas Herzig & Jérôme Mengin (eds.), Logics in Artificial Intelligence. Springer. pp. 189-201.
  4. added 2019-05-14
    A Problem for Relative Information Minimizers in Probability Kinematics. Bas C. van Fraassen - 1981 - British Journal for the Philosophy of Science 32 (4):375-379.
  5. added 2019-05-14
    Probability Kinematics. Zoltan Domotor, Mario Zanotti & Henson Graves - 1980 - Synthese 44 (3):421-442.
    Probability kinematics is studied in detail within the framework of elementary probability theory. The merits and demerits of Jeffrey's and Field's models are discussed. In particular, the principle of maximum relative entropy and other principles are used in an epistemic justification of generalized conditionals. A representation of conditionals in terms of Bayesian conditionals is worked out in the framework of external kinematics.
    5 citations.
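Jeffrey's model of probability kinematics, which this paper compares with Field's and with maximum relative entropy, is easy to state concretely: when experience shifts the probabilities over a partition {E_i} to new values q(E_i), the updated joint is P_new(A, E_i) = P_old(A | E_i) · q(E_i). The numbers below are my own toy example, not taken from the paper.

```python
def jeffrey_update(joint, q):
    """Jeffrey conditioning on a partition.

    `joint` maps (a, e) pairs to prior probabilities; `q` maps each
    partition cell e to its new, experience-shifted probability.
    Returns the updated joint P_new(a, e) = P_old(a | e) * q(e).
    """
    # Prior marginals of the partition cells.
    marg = {}
    for (a, e), p in joint.items():
        marg[e] = marg.get(e, 0.0) + p
    return {(a, e): p / marg[e] * q[e] for (a, e), p in joint.items()}

# Toy example (hypothetical numbers): a glimpse by candlelight shifts
# P(the cloth is red) from 0.5 to 0.8; beliefs about A are revised with it.
prior = {("A", "red"): 0.2, ("A", "green"): 0.3,
         ("notA", "red"): 0.3, ("notA", "green"): 0.2}
post = jeffrey_update(prior, {"red": 0.8, "green": 0.2})
```

Here P_new(A) = 0.4·0.8 + 0.6·0.2 = 0.44, and the conditional probabilities P(A | E_i) are left untouched, which is exactly Jeffrey's "rigidity" condition.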
  6. added 2019-05-13
    Predictive Statistical Mechanics and Macroscopic Time Evolution: Hydrodynamics and Entropy Production. Domagoj Kuić - 2016 - Foundations of Physics 46 (7):891-914.
    In the previous papers, it was demonstrated that applying the principle of maximum information entropy by maximizing the conditional information entropy, subject to the constraint given by the Liouville equation averaged over the phase space, leads to a definition of the rate of entropy change for closed Hamiltonian systems without any additional assumptions. Here, we generalize this basic model and, with the introduction of the additional constraints which are equivalent to the hydrodynamic continuity equations, show that the results obtained are (...)
  7. added 2019-05-13
    Entropy and the Unity of Knowledge. [REVIEW] J. H. B. - 1962 - Review of Metaphysics 15 (4):676-677.
    In this inaugural address, a professor of applied mathematics develops the theme that new concepts such as "entropy" introduced in the mathematical description of nature have an influence far beyond the mathematical sciences, extending to such diverse fields as biology, the social sciences, religion, philosophy, literary analysis, etc.--B. J. H.
  8. added 2019-05-13
    Justifying Objective Bayesianism on Predicate Languages. Jürgen Landes & Jon Williamson - unknown.
    Objective Bayesianism says that the strengths of one’s beliefs ought to be probabilities, calibrated to physical probabilities insofar as one has evidence of them, and otherwise sufficiently equivocal. These norms of belief are often explicated using the maximum entropy principle. In this paper we investigate the extent to which one can provide a unified justification of the objective Bayesian norms in the case in which the background language is a first-order predicate language, with a view to applying the resulting formalism (...)
    1 citation.
  9. added 2019-04-01
    Entropy and Uncertainty. Teddy Seidenfeld - 1986 - Philosophy of Science 53 (4):467-491.
    This essay is, primarily, a discussion of four results about the principle of maximizing entropy (MAXENT) and its connections with Bayesian theory. Result 1 provides a restricted equivalence between the two: where the Bayesian model for MAXENT inference uses an "a priori" probability that is uniform, and where all MAXENT constraints are limited to 0-1 expectations for simple indicator-variables. The other three results report on an inability to extend the equivalence beyond these specialized constraints. Result 2 established a sensitivity of (...)
    37 citations.
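Result 1's restricted equivalence can be checked numerically in a toy case: with a uniform prior and a single 0-1 constraint E[1_E] = 1 (i.e., the evidence E is certain), the maximum entropy distribution coincides with the prior conditioned on E. The code below is my own illustration; the brute-force grid search over the simplex is just a stand-in for a proper optimizer.

```python
import math

def entropy(p):
    """Shannon entropy, with the convention 0 * log 0 = 0."""
    return -sum(x * math.log(x) for x in p if x > 0)

# Six equiprobable states; the evidence E says the true state is in {0, 1, 2},
# imposed as the 0-1 indicator constraint E[1_E] = 1.
prior = [1 / 6] * 6
E = {0, 1, 2}

# Bayesian conditioning on E:
pE = sum(prior[i] for i in E)
conditioned = [prior[i] / pE if i in E else 0.0 for i in range(6)]

# Crude MAXENT search over distributions supported on E: a fine grid over
# the 2-simplex is enough to locate the entropy maximizer here.
best, best_h = None, -1.0
n = 200
for i in range(n + 1):
    for j in range(n + 1 - i):
        p = [i / n, j / n, (n - i - j) / n, 0.0, 0.0, 0.0]
        h = entropy(p)
        if h > best_h:
            best, best_h = p, h
```

The entropy maximizer found on the grid is (approximately) uniform on E, matching the conditioned distribution, which is the restricted equivalence Seidenfeld's Result 1 describes; his Results 2-4 show this breaks down for richer constraints.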
  10. added 2019-02-19
    Entropy and Insufficient Reason: A Note on the Judy Benjamin Problem. Anubav Vasudevan - forthcoming - British Journal for the Philosophy of Science:axy013.
    One well-known objection to the principle of maximum entropy is the so-called Judy Benjamin problem, first introduced by van Fraassen. The problem turns on the apparently puzzling fact that, on the basis of information relating an event’s conditional probability, the maximum entropy distribution will almost always assign to the event conditionalized on a probability strictly less than that assigned to it by the uniform distribution. In this paper, I present an analysis of the Judy Benjamin problem that can help to (...)
    3 citations.
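For orientation, the Judy Benjamin case can be worked numerically. In one standard presentation the prior puts 1/2 on Blue territory and 1/4 on each of the two Red regions, and the radio message fixes the conditional probability P(Red HQ | Red) = 3/4. Writing r = P(Red), the admissible distributions are (1 − r, 3r/4, r/4), and the maximum entropy update minimizes relative entropy to the prior over r. The grid scan below is my own sketch of that minimization, not Vasudevan's analysis.

```python
import math

# Prior over the three regions.
prior = {"Blue": 0.5, "RedHQ": 0.25, "Red2nd": 0.25}

def rel_entropy(q, p):
    """Kullback-Leibler divergence D(q || p), with 0 * log 0 = 0."""
    return sum(q[k] * math.log(q[k] / p[k]) for k in q if q[k] > 0)

# The constraint P(RedHQ | Red) = 3/4 leaves one free parameter r = P(Red).
best_r, best_d = None, float("inf")
for i in range(1, 10000):
    r = i / 10000
    q = {"Blue": 1 - r, "RedHQ": 0.75 * r, "Red2nd": 0.25 * r}
    d = rel_entropy(q, prior)
    if d < best_d:
        best_r, best_d = r, d

# The closed-form minimizer is r = 2 / (2 + 3**0.75) ≈ 0.467, so the update
# pushes P(Red) below its prior value of 1/2.
```

This is the puzzling feature the abstract describes: although the message concerns only the conditional structure within Red, the maximum entropy update assigns the conditioned-on event (Red) a probability strictly below the 1/2 it had under the uniform-style prior.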
  11. added 2019-02-19
    Maximum Entropy and Probability Kinematics Constrained by Conditionals. Stefan Lukits - 2015 - Entropy 17 (4):1690-1700.
    Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (pme) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey’s updating principle (jup) contradicts pme? Majerník shows that pme provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether pme also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates that in (...)
    1 citation.
  12. added 2019-02-19
    Maximum Entropy Applied to Inductive Logic and Reasoning. Jürgen Landes & Jon Williamson - unknown.
    This editorial explains the scope of the special issue and provides a thematic introduction to the contributed papers.
  13. added 2019-02-19
    The Principle of Maximum Entropy and a Problem in Probability Kinematics. Stefan Lukits - 2014 - Synthese 191 (7):1-23.
    Sometimes we receive evidence in a form that standard conditioning (or Jeffrey conditioning) cannot accommodate. The principle of maximum entropy (MAXENT) provides a unique solution for the posterior probability distribution based on the intuition that the information gain consistent with assumptions and evidence should be minimal. Opponents of objective methods to determine these probabilities prominently cite van Fraassen’s Judy Benjamin case to undermine the generality of maxent. This article shows that an intuitive approach to Judy Benjamin’s case supports maxent. This (...)
  14. added 2019-02-19
    Objective Bayesianism and the Maximum Entropy Principle. Jürgen Landes & Jon Williamson - unknown.
    Objective Bayesian epistemology invokes three norms: the strengths of our beliefs should be probabilities, they should be calibrated to our evidence of physical probabilities, and they should otherwise equivocate sufficiently between the basic propositions that we can express. The three norms are sometimes explicated by appealing to the maximum entropy principle, which says that a belief function should be a probability function, from all those that are calibrated to evidence, that has maximum entropy. However, the three norms of objective Bayesianism (...)
    6 citations.
  15. added 2019-02-19
    First-Order Probabilistic Conditional Logic and Maximum Entropy. J. Fisseler - 2012 - Logic Journal of the IGPL 20 (5):796-830.
  16. added 2019-02-19
    Maximum Power and Maximum Entropy Production: Finalities in Nature. Stanley Salthe - 2010 - Cosmos and History 6 (1):114-121.
    I begin with the definition of power, and find that it is finalistic inasmuch as work directs energy dissipation in the interests of some system. The maximum power principle of Lotka and Odum implies an optimal energy efficiency for any work; optima are also finalities. I advance a statement of the maximum entropy production principle, suggesting that most work of dissipative structures is carried out at rates entailing energy flows faster than those that would associate with maximum power. This is (...)
    2 citations.
  17. added 2019-02-19
    Maximum Entropy Inference with Quantified Knowledge. Owen Barnett & Jeff Paris - 2008 - Logic Journal of the IGPL 16 (1):85-98.
    We investigate uncertain reasoning with quantified sentences of the predicate calculus treated as the limiting case of maximum entropy inference applied to finite domains.
    3 citations.
  18. added 2019-02-19
    Explaining Default Intuitions Using Maximum Entropy. Rachel A. Bourne - 2003 - Journal of Applied Logic 1 (3-4):255-271.
  19. added 2019-02-19
    Maximum Shannon Entropy, Minimum Fisher Information, and an Elementary Game. Shunlong Luo - 2002 - Foundations of Physics 32 (11):1757-1772.
    We formulate an elementary statistical game which captures the essence of some fundamental quantum experiments such as photon polarization and spin measurement. We explore and compare the significance of the principle of maximum Shannon entropy and the principle of minimum Fisher information in solving such a game. The solution based on the principle of minimum Fisher information coincides with the solution based on an invariance principle, and provides an informational explanation of Malus' law for photon polarization. There is no solution (...)
    1 citation.
  20. added 2019-02-19
    Common Sense and Maximum Entropy. Jeff Paris - 1998 - Synthese 117 (1):75-93.
    This paper concerns the question of how to draw inferences common sensically from uncertain knowledge. Since the early work of Shore and Johnson (1980), Paris and Vencovská (1990), and Csiszár (1989), it has been known that the Maximum Entropy Inference Process is the only inference process which obeys certain common sense principles of uncertain reasoning. In this paper we consider the present status of this result and argue that within the rather narrow context in which we work this complete and (...)
    10 citations.
  21. added 2019-02-19
    The Constraint Rule of the Maximum Entropy Principle. Jos Uffink - 1996 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 27 (1):47-79.
    The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In usual formulations of this and related methods of inference one assumes that this partial information takes the form of a constraint on allowed probability distributions. In practical applications, however, the information consists of empirical data. A constraint rule is then employed to construct constraints on probability distributions out of these data. Usually one adopts the rule that equates the expectation (...)
    13 citations.
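Since the abstract breaks off mid-sentence, it may help to state the constraint rule Uffink examines in its usual form. This is a standard textbook formulation, not quoted from the paper: the rule equates the expectation of a quantity f with its observed sample average, and maximizing entropy under that constraint yields a distribution of exponential (Gibbs) form.

```latex
% Constraint rule: equate the expectation of f with its sample average.
\sum_i p_i\, f(x_i) \;=\; \bar f \;=\; \frac{1}{n}\sum_{k=1}^{n} f(\xi_k)
% Maximizing H(p) = -\sum_i p_i \log p_i under this constraint (plus
% normalization) gives the exponential form
p_i \;=\; \frac{e^{-\lambda f(x_i)}}{Z(\lambda)},
\qquad
Z(\lambda) \;=\; \sum_i e^{-\lambda f(x_i)}
```

Uffink's point is that treating empirical data as if they delivered an exact expectation constraint is a substantive assumption, not a triviality, and the paper argues that this step is where the usual justifications of the rule are weakest.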
  22. added 2019-02-19
    Can the Maximum Entropy Principle Be Explained as a Consistency Requirement? Jos Uffink - 1995 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 26 (3):223-261.
    The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, forms an extension of the classical principle of insufficient reason. It has been further generalized, both in mathematical formulation and in intended scope, into the principle of maximum relative entropy or of minimum information. It has been claimed that these principles are singled out as unique methods of statistical inference that agree with (...)
    21 citations.
  23. added 2019-02-19
    The W Systems: Between Maximum Entropy and Minimal Ranking…. Michael Freund - 1994 - Journal of Applied Non-Classical Logics 4 (1):79-90.
  24. added 2019-02-19
    Application of the Maximum Entropy Principle to Nonlinear Systems Far From Equilibrium. H. Haken - 1993 - In E. T. Jaynes, Walter T. Grandy & Peter W. Milonni (eds.), Physics and Probability: Essays in Honor of Edwin T. Jaynes. Cambridge University Press. p. 239.
  25. added 2019-02-19
    A Fuzzy Neuron Based Upon Maximum Entropy Ordered Weighted Averaging. Michael O'Hagan - 1991 - In B. Bouchon-Meunier, R. R. Yager & L. A. Zadeh (eds.), Uncertainty in Knowledge Bases. Springer. pp. 598-609.
  26. added 2019-02-19
    Maximum Entropy Inference as a Special Case of Conditionalization. Brian Skyrms - 1985 - Synthese 63 (1):55-74.
  27. added 2019-02-19
    The Status of the Principle of Maximum Entropy. Abner Shimony - 1985 - Synthese 63 (1):35-53.
  28. added 2019-02-19
    Axiomatic Derivation of the Principle of Maximum Entropy and the Principle of Minimum Cross-Entropy. J. E. Shore & R. W. Johnson - 1980 - IEEE Transactions on Information Theory:26-37.
    24 citations.
  29. added 2019-02-19
    Analysis of the Maximum Entropy Principle “Debate”. John F. Cyranski - 1978 - Foundations of Physics 8 (5-6):493-506.
    Jaynes's maximum entropy principle (MEP) is analyzed by considering in detail a recent controversy. Emphasis is placed on the inductive logical interpretation of “probability” and the concept of “total knowledge.” The relation of the MEP to relative frequencies is discussed, and a possible realm of its fruitful application is noted.
    3 citations.
  30. added 2019-02-19
    The Well-Posed Problem. Edwin T. Jaynes - 1973 - Foundations of Physics 3 (4):477-493.
    Many statistical problems, including some of the most important for physical applications, have long been regarded as underdetermined from the standpoint of a strict frequency definition of probability; yet they may appear well-posed or even overdetermined by the principles of maximum entropy and transformation groups. Furthermore, the distributions found by these methods turn out to have a definite frequency correspondence; the distribution obtained by invariance under a transformation group is by far the most likely to be observed experimentally, in the (...)
    44 citations.