36 found
  1. Aggregating Agents with Opinions About Different Propositions.Richard Pettigrew - manuscript
    There are many reasons we might want to take the opinions of various individuals and aggregate them to give the opinions of the group they constitute. If all the individuals in the group have probabilistic opinions about the same propositions, there is a host of aggregation functions we might deploy, such as linear or geometric pooling. However, there are also cases where different members of the group assign probabilities to different sets of propositions, which might overlap a lot, a little, (...)
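The abstract above names linear and geometric pooling. A minimal sketch of both rules, assuming two or more agents with credences over the same finite partition (function names are hypothetical):

```python
import math

def linear_pool(credences, weights):
    # Weighted arithmetic mean of the agents' probability assignments.
    return [sum(w * c[i] for w, c in zip(weights, credences))
            for i in range(len(credences[0]))]

def geometric_pool(credences, weights):
    # Weighted geometric mean, renormalized so the result sums to 1.
    raw = [math.prod(c[i] ** w for w, c in zip(weights, credences))
           for i in range(len(credences[0]))]
    total = sum(raw)
    return [r / total for r in raw]

# Two agents, equal weights, over a two-cell partition.
pooled = linear_pool([[0.2, 0.8], [0.6, 0.4]], [0.5, 0.5])  # ≈ [0.4, 0.6]
```

Note that geometric pooling, unlike linear pooling, requires renormalization, which is one source of their differing behavior on overlapping agendas.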
  2. Probabilistic Stability, AGM Revision Operators and Maximum Entropy.Krzysztof Mierzewski - forthcoming - Review of Symbolic Logic:1-34.
    Several authors have investigated the question of whether canonical logic-based accounts of belief revision, and especially the theory of AGM revision operators, are compatible with the dynamics of Bayesian conditioning. Here we show that Leitgeb's stability rule for acceptance, which has been offered as a possible solution to the Lottery paradox, allows one to bridge AGM revision and Bayesian update: using the stability rule, we prove that AGM revision operators emerge from Bayesian conditioning by an application of the principle of maximum (...)
  3. The Entropy-Limit (Conjecture) for Σ₂-Premisses.Jürgen Landes - 2021 - Studia Logica 109 (2):423-442.
    The application of the maximum entropy principle to determine probabilities on finite domains is well-understood. Its application to infinite domains still lacks a well-studied comprehensive approach. There are two different strategies for applying the maximum entropy principle on first-order predicate languages: applying it to finite sublanguages and taking a limit; comparing finite entropies of probability functions defined on the language as a whole. The entropy-limit conjecture roughly says that these two strategies result in the same probabilities. While the conjecture is (...)
  4. Towards the Entropy-Limit Conjecture.Jürgen Landes, Soroush Rafiee Rad & Jon Williamson - 2021 - Annals of Pure and Applied Logic 172 (2):102870.
    The maximum entropy principle is widely used to determine non-committal probabilities on a finite domain, subject to a set of constraints, but its application to continuous domains is notoriously problematic. This paper concerns an intermediate case, where the domain is a first-order predicate language. Two strategies have been put forward for applying the maximum entropy principle on such a domain: applying it to finite sublanguages and taking the pointwise limit of the resulting probabilities as the size n of the sublanguage (...)
  5. Bertrand's Paradox and the Maximum Entropy Principle.Nicholas Shackel & Darrell P. Rowbottom - 2020 - Philosophy and Phenomenological Research 101 (3):505-523.
    An important suggestion of objective Bayesians is that the maximum entropy principle can replace a principle which is known to get into paradoxical difficulties: the principle of indifference. No one has previously determined whether the maximum entropy principle is better able to solve Bertrand’s chord paradox than the principle of indifference. In this paper I show that it is not. Additionally, the course of the analysis brings to light a new paradox, a revenge paradox of the chords, that is unique (...)
  6. Entropy and Insufficient Reason: A Note on the Judy Benjamin Problem.Anubav Vasudevan - 2020 - British Journal for the Philosophy of Science 71 (3):1113-1141.
    One well-known objection to the principle of maximum entropy is the so-called Judy Benjamin problem, first introduced by van Fraassen. The problem turns on the apparently puzzling fact that, on the basis of information relating an event’s conditional probability, the maximum entropy distribution will almost always assign to the event conditionalized on a probability strictly less than that assigned to it by the uniform distribution. In this article, I present an analysis of the Judy Benjamin problem that can help to (...)
  7. Predictive Statistical Mechanics and Macroscopic Time Evolution: Hydrodynamics and Entropy Production.Domagoj Kuić - 2016 - Foundations of Physics 46 (7):891-914.
    In previous papers, it was demonstrated that applying the principle of maximum information entropy by maximizing the conditional information entropy, subject to the constraint given by the Liouville equation averaged over the phase space, leads to a definition of the rate of entropy change for closed Hamiltonian systems without any additional assumptions. Here, we generalize this basic model and, with the introduction of the additional constraints which are equivalent to the hydrodynamic continuity equations, show that the results obtained are (...)
  8. Maximum Entropy and Probability Kinematics Constrained by Conditionals.Stefan Lukits - 2015 - Entropy 17 (4):1690-1700.
    Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (pme) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey’s updating principle (jup) contradicts pme? Majerník shows that pme provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether pme also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates that in (...)
  9. The Principle of Maximum Entropy and a Problem in Probability Kinematics.Stefan Lukits - 2014 - Synthese 191 (7):1-23.
    Sometimes we receive evidence in a form that standard conditioning (or Jeffrey conditioning) cannot accommodate. The principle of maximum entropy (MAXENT) provides a unique solution for the posterior probability distribution based on the intuition that the information gain consistent with assumptions and evidence should be minimal. Opponents of objective methods to determine these probabilities prominently cite van Fraassen’s Judy Benjamin case to undermine the generality of MAXENT. This article shows that an intuitive approach to Judy Benjamin’s case supports MAXENT. This (...)
  10. How to Exploit Parametric Uniformity for Maximum Entropy Reasoning in a Relational Probabilistic Logic.Marc Finthammer & Christoph Beierle - 2012 - In Luis Farinas del Cerro, Andreas Herzig & Jerome Mengin (eds.), Logics in Artificial Intelligence. Springer. pp. 189--201.
  11. First-Order Probabilistic Conditional Logic and Maximum Entropy.J. Fisseler - 2012 - Logic Journal of the IGPL 20 (5):796-830.
  12. Symmetry, Invariance and Ontology in Physics and Statistics.Julio Michael Stern - 2011 - Symmetry 3 (3):611-635.
    This paper has three main objectives: (a) Discuss the formal analogy between some important symmetry-invariance arguments used in physics, probability and statistics. Specifically, we will focus on Noether’s theorem in physics, the maximum entropy principle in probability theory, and de Finetti-type theorems in Bayesian statistics; (b) Discuss the epistemological and ontological implications of these theorems, as they are interpreted in physics and statistics. Specifically, we will focus on the positivist (in physics) or subjective (in statistics) interpretations vs. objective interpretations that (...)
  13. Maximum Power and Maximum Entropy Production: Finalities in Nature.Stanley Salthe - 2010 - Cosmos and History 6 (1):114-121.
    I begin with the definition of power, and find that it is finalistic inasmuch as work directs energy dissipation in the interests of some system. The maximum power principle of Lotka and Odum implies an optimal energy efficiency for any work; optima are also finalities. I advance a statement of the maximum entropy production principle, suggesting that most work of dissipative structures is carried out at rates entailing energy flows faster than those that would associate with maximum power. This is (...)
  14. Maximum Entropy Inference with Quantified Knowledge.Owen Barnett & Jeff Paris - 2008 - Logic Journal of the IGPL 16 (1):85-98.
    We investigate uncertain reasoning with quantified sentences of the predicate calculus treated as the limiting case of maximum entropy inference applied to finite domains.
  15. Explaining Default Intuitions Using Maximum Entropy.Rachel A. Bourne - 2003 - Journal of Applied Logic 1 (3-4):255-271.
  16. Maximum Shannon Entropy, Minimum Fisher Information, and an Elementary Game.Shunlong Luo - 2002 - Foundations of Physics 32 (11):1757-1772.
    We formulate an elementary statistical game which captures the essence of some fundamental quantum experiments such as photon polarization and spin measurement. We explore and compare the significance of the principle of maximum Shannon entropy and the principle of minimum Fisher information in solving such a game. The solution based on the principle of minimum Fisher information coincides with the solution based on an invariance principle, and provides an informational explanation of Malus' law for photon polarization. There is no solution (...)
  17. Entropia a Modelovanie [Entropy and Modelling].Ján Paulov - 2002 - Organon F: Medzinárodný Časopis Pre Analytickú Filozofiu 9 (2):157-175.
    It is well known that mathematical modelling in the social sciences, particularly when concepts originally rooted in the natural sciences are used, is a touchy subject from a methodological point of view, since the problem of reductionism can arise in this context. This paper addresses that subject: its main objective is to discuss how the entropy concept, originally a physical one, can be used in modelling generally, especially in the domain of the social sciences. The way this topic is approached in this (...)
  18. Common Sense and Maximum Entropy.Jeff Paris - 1998 - Synthese 117 (1):75-93.
    This paper concerns the question of how to draw inferences common sensically from uncertain knowledge. Since the early work of Shore and Johnson (1980), Paris and Vencovská (1990), and Csiszár (1989), it has been known that the Maximum Entropy Inference Process is the only inference process which obeys certain common sense principles of uncertain reasoning. In this paper we consider the present status of this result and argue that within the rather narrow context in which we work this complete and (...)
  19. The Constraint Rule of the Maximum Entropy Principle.Jos Uffink - 1996 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 27 (1):47-79.
    The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In usual formulations of this and related methods of inference one assumes that this partial information takes the form of a constraint on allowed probability distributions. In practical applications, however, the information consists of empirical data. A constraint rule is then employed to construct constraints on probability distributions out of these data. Usually one adopts the rule that equates the expectation (...)
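The constraint rule discussed above turns empirical data (e.g. an observed sample mean) into an expectation constraint on the allowed distributions. A minimal sketch of the resulting maximization for the standard die example (maximize entropy on a six-sided die subject to E[X] = 4.5), solving for the Lagrange multiplier by bisection; the function name is hypothetical:

```python
import math

def maxent_die(target_mean, faces=6, tol=1e-12):
    # The maximum entropy distribution under an expectation constraint
    # has exponential form p_i ∝ exp(lam * i); bisect on lam so that
    # the resulting mean equals target_mean.
    def mean(lam):
        w = [math.exp(lam * i) for i in range(1, faces + 1)]
        z = sum(w)
        return sum(i * wi for i, wi in zip(range(1, faces + 1), w)) / z

    lo, hi = -50.0, 50.0  # mean(lam) is increasing in lam
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * i) for i in range(1, faces + 1)]
    z = sum(w)
    return [wi / z for wi in w]

probs = maxent_die(4.5)  # probabilities increase toward the high faces
```

Uffink's point is precisely that the step from "observed mean 4.5" to "constraint E[X] = 4.5" is a substantive rule, not a triviality; the code only illustrates what the rule commits one to.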
  20. Can the Maximum Entropy Principle Be Explained as a Consistency Requirement?Jos Uffink - 1995 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 26 (3):223-261.
    The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, forms an extension of the classical principle of insufficient reason. It has been further generalized, both in mathematical formulation and in intended scope, into the principle of maximum relative entropy or of minimum information. It has been claimed that these principles are singled out as unique methods of statistical inference that agree with (...)
  21. The W Systems: Between Maximum Entropy and Minimal Ranking….Michael Freund - 1994 - Journal of Applied Non-Classical Logics 4 (1):79-90.
  22. Application of the Maximum Entropy Principle to Nonlinear Systems Far From Equilibrium.H. Haken - 1993 - In E. T. Jaynes, Walter T. Grandy & Peter W. Milonni (eds.), Physics and Probability: Essays in Honor of Edwin T. Jaynes. Cambridge University Press. pp. 239.
  23. A Fuzzy Neuron Based Upon Maximum Entropy Ordered Weighted Averaging.Michael O'Hagan - 1991 - In B. Bouchon-Meunier, R. R. Yager & L. A. Zadeh (eds.), Uncertainty in Knowledge Bases. Springer. pp. 598--609.
  24. Entropy and Uncertainty.Teddy Seidenfeld - 1986 - Philosophy of Science 53 (4):467-491.
    This essay is, primarily, a discussion of four results about the principle of maximizing entropy (MAXENT) and its connections with Bayesian theory. Result 1 provides a restricted equivalence between the two: where the Bayesian model for MAXENT inference uses an "a priori" probability that is uniform, and where all MAXENT constraints are limited to 0-1 expectations for simple indicator-variables. The other three results report on an inability to extend the equivalence beyond these specialized constraints. Result 2 establishes a sensitivity of (...)
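The restricted equivalence in Result 1 can be illustrated directly: under a 0-1 indicator constraint P(E) = 1, the minimum-relative-entropy (equivalently, MAXENT-relative-to-the-prior) posterior is exactly Bayesian conditionalization on E. A sketch with a hypothetical function name:

```python
def i_projection_on_event(prior, event):
    # Minimum-relative-entropy posterior under the constraint P(event) = 1.
    # For this 0-1 indicator constraint the I-projection coincides with
    # Bayesian conditionalization: renormalize the prior inside the event.
    mass = sum(p for w, p in prior.items() if w in event)
    return {w: (p / mass if w in event else 0.0) for w, p in prior.items()}

# Uniform prior over die faces, conditionalized on "the roll is even".
prior = {i: 1 / 6 for i in range(1, 7)}
post = i_projection_on_event(prior, {2, 4, 6})  # each even face gets ≈ 1/3
```

Seidenfeld's remaining results concern exactly the cases this sketch excludes: constraints that are not 0-1 indicator expectations, where the equivalence breaks down.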
  25. The Status of the Principle of Maximum Entropy.Abner Shimony - 1985 - Synthese 63 (1):35 - 53.
  26. Maximum Entropy Inference as a Special Case of Conditionalization.Brian Skyrms - 1985 - Synthese 63 (1):55 - 74.
  27. A Problem for Relative Information Minimizers in Probability Kinematics.Bas C. van Fraassen - 1981 - British Journal for the Philosophy of Science 32 (4):375-379.
  28. Probability Kinematics.Zoltan Domotor, Mario Zanotti & Henson Graves - 1980 - Synthese 44 (3):421 - 442.
    Probability kinematics is studied in detail within the framework of elementary probability theory. The merits and demerits of Jeffrey's and Field's models are discussed. In particular, the principle of maximum relative entropy and other principles are used in an epistemic justification of generalized conditionals. A representation of conditionals in terms of Bayesian conditionals is worked out in the framework of external kinematics.
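Jeffrey's model, discussed above, reallocates probability across a partition while preserving the conditional probabilities within each cell: P'(w) = q_k · P(w) / P(cell_k) for w in cell_k. A minimal sketch (names hypothetical):

```python
def jeffrey_update(prior, partition, new_cell_probs):
    # Jeffrey conditioning: each cell of the partition gets its new
    # probability q, distributed over the cell's worlds in proportion
    # to their prior probabilities.
    posterior = {}
    for cell, q in zip(partition, new_cell_probs):
        cell_mass = sum(prior[w] for w in cell)
        for w in cell:
            posterior[w] = q * prior[w] / cell_mass
    return posterior

# Uniform prior over four worlds; experience shifts the first cell to 0.8.
prior = {w: 0.25 for w in "abcd"}
post = jeffrey_update(prior, [{"a", "b"}, {"c", "d"}], [0.8, 0.2])
```

When one cell gets probability 1, this reduces to ordinary conditionalization, which is why Jeffrey's rule is the natural comparison point for the relative-entropy principles the abstract mentions.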
  29. Axiomatic Derivation of the Principle of Maximum Entropy and the Principle of Minimum Cross-Entropy.J. E. Shore & R. W. Johnson - 1980 - IEEE Transactions on Information Theory:26-37.
  30. Analysis of the Maximum Entropy Principle “Debate”.John F. Cyranski - 1978 - Foundations of Physics 8 (5-6):493-506.
    Jaynes's maximum entropy principle (MEP) is analyzed by considering in detail a recent controversy. Emphasis is placed on the inductive logical interpretation of “probability” and the concept of “total knowledge.” The relation of the MEP to relative frequencies is discussed, and a possible realm of its fruitful application is noted.
  31. The Well-Posed Problem.Edwin T. Jaynes - 1973 - Foundations of Physics 3 (4):477-493.
    Many statistical problems, including some of the most important for physical applications, have long been regarded as underdetermined from the standpoint of a strict frequency definition of probability; yet they may appear well-posed or even overdetermined by the principles of maximum entropy and transformation groups. Furthermore, the distributions found by these methods turn out to have a definite frequency correspondence; the distribution obtained by invariance under a transformation group is by far the most likely to be observed experimentally, in the (...)
  32. Entropy and the Unity of Knowledge. [REVIEW]J. H. B. - 1962 - Review of Metaphysics 15 (4):676-677.
    In this inaugural address, a professor of applied mathematics develops the theme that new concepts such as "entropy" introduced in the mathematical description of nature have an influence far beyond the mathematical sciences, extending to such diverse fields as biology, the social sciences, religion, philosophy, literary analysis, etc.--B. J. H.
  33. Objective Bayesian Nets.Jon Williamson - manuscript
    I present a formalism that combines two methodologies: objective Bayesianism and Bayesian nets. According to objective Bayesianism, an agent’s degrees of belief (i) ought to satisfy the axioms of probability, (ii) ought to satisfy constraints imposed by background knowledge, and (iii) should otherwise be as non-committal as possible (i.e. have maximum entropy). Bayesian nets offer an efficient way of representing and updating probability functions. An objective Bayesian net is a Bayesian net representation of the maximum entropy probability function.
  34. Justifying Objective Bayesianism on Predicate Languages.Jürgen Landes & Jon Williamson - unknown
    Objective Bayesianism says that the strengths of one’s beliefs ought to be probabilities, calibrated to physical probabilities insofar as one has evidence of them, and otherwise sufficiently equivocal. These norms of belief are often explicated using the maximum entropy principle. In this paper we investigate the extent to which one can provide a unified justification of the objective Bayesian norms in the case in which the background language is a first-order predicate language, with a view to applying the resulting formalism (...)
  35. Maximum Entropy Applied to Inductive Logic and Reasoning.Jürgen Landes & Jon Williamson - unknown
    This editorial explains the scope of the special issue and provides a thematic introduction to the contributed papers.
  36. Objective Bayesianism and the Maximum Entropy Principle.Jürgen Landes & Jon Williamson - unknown
    Objective Bayesian epistemology invokes three norms: the strengths of our beliefs should be probabilities, they should be calibrated to our evidence of physical probabilities, and they should otherwise equivocate sufficiently between the basic propositions that we can express. The three norms are sometimes explicated by appealing to the maximum entropy principle, which says that a belief function should be a probability function, from all those that are calibrated to evidence, that has maximum entropy. However, the three norms of objective Bayesianism (...)