Contents (42 found)
  1. Bertrand’s Paradox and the Principle of Indifference. Nicholas Shackel - 2024 - Abingdon: Routledge.
    Events between which we have no epistemic reason to discriminate have equal epistemic probabilities. Bertrand’s chord paradox, however, appears to show this to be false, and thereby poses a general threat to probabilities for continuum-sized state spaces. Articulating the nature of such spaces involves some deep mathematics, and that is perhaps why the recent literature on Bertrand’s Paradox has been almost entirely from mathematicians and physicists, who have often deployed elegant mathematics of considerable sophistication. At the same time, the (...)
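A minimal simulation makes the paradox concrete: the three classical ways of picking a "random chord" of the unit circle assign different probabilities to the chord being longer than the side of the inscribed equilateral triangle. The sketch below is illustrative only and is not taken from the book.

```python
import random, math

# A chord of the unit circle is "long" if it exceeds the side of the
# inscribed equilateral triangle, i.e. its length exceeds sqrt(3).
LONG = math.sqrt(3)

def random_endpoints():
    # Method 1: two uniform points on the circumference.
    a, b = random.uniform(0, 2 * math.pi), random.uniform(0, 2 * math.pi)
    return 2 * math.sin(abs(a - b) / 2)     # chord length

def random_radial():
    # Method 2: uniform point on a radius; chord perpendicular to the radius.
    d = random.uniform(0, 1)                # distance of chord from the centre
    return 2 * math.sqrt(1 - d * d)

def random_midpoint():
    # Method 3: uniform chord midpoint inside the disc (rejection sampling).
    while True:
        x, y = random.uniform(-1, 1), random.uniform(-1, 1)
        d2 = x * x + y * y
        if d2 <= 1:
            return 2 * math.sqrt(1 - d2)

N = 100_000
for method in (random_endpoints, random_radial, random_midpoint):
    p = sum(method() > LONG for _ in range(N)) / N
    print(method.__name__, round(p, 3))     # ≈ 1/3, 1/2, 1/4 respectively
```

All three sampling rules look like innocuous readings of "choose a chord at random", yet the estimates disagree, which is exactly the trouble for the principle of indifference.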
  2. Bertrand’s Paradox Resolution and Its Implications for the Bing–Fisher Problem. Richard A. Chechile - 2023 - Mathematics 11 (15).
    Bertrand’s paradox is a problem in geometric probability that has resisted resolution for more than one hundred years. Bertrand provided three seemingly reasonable solutions to his problem — hence the paradox. Bertrand’s paradox has also been influential in philosophical debates about frequentist versus Bayesian approaches to statistical inference. In this paper, the paradox is resolved (1) by the clarification of the primary variate upon which the principle of maximum entropy is employed and (2) by imposing constraints, based on a mathematical (...)
  3. Representing preorders with injective monotones. Pedro Hack, Daniel A. Braun & Sebastian Gottwald - 2022 - Theory and Decision 93 (4):663-690.
    We introduce a new class of real-valued monotones in preordered spaces, injective monotones. We show that the class of preorders for which they exist lies in between the class of preorders with strict monotones and preorders with countable multi-utilities, improving upon the known classification of preordered spaces through real-valued monotones. We extend several well-known results for strict monotones (Richter–Peleg functions) to injective monotones, we provide a construction of injective monotones from countable multi-utilities, and relate injective monotones to classic results concerning (...)
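For orientation, the order-theoretic notions involved, stated as they are commonly defined; this is my paraphrase, and the paper's own definitions may differ in detail.

```latex
% Standard notions, paraphrased (not quoted from the paper).
% Let $(X,\preceq)$ be a preorder; write $x \prec y$ iff $x \preceq y$ and not $y \preceq x$.
\begin{align*}
&\text{monotone: } f\colon X\to\mathbb{R} \text{ such that } x \preceq y \Rightarrow f(x)\le f(y);\\
&\text{strict (Richter--Peleg) monotone: a monotone with } x \prec y \Rightarrow f(x) < f(y);\\
&\text{injective monotone: a monotone that is, in addition, injective.}
\end{align*}
```

On this reading, the abstract's "in between" claim is that the existence of a countable multi-utility implies the existence of an injective monotone, which in turn implies the existence of a strict monotone.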
  4. From Art to Information System. Miro Brada - 2021 - AGI Laboratory.
    This insight into art came from chess composition, which concentrates art in a very dense form. Identifying and mathematically assessing uniqueness is the key, and it is applicable to other areas, e.g. computer programming. Maximization of uniqueness is minimization of entropy, which coincides with, and also goes beyond, Information Theory (Shannon, 1948). The reuse of logic as a universal principle to minimize entropy requires simplified architecture and abstraction. Any structures (e.g. plugins) duplicating or dividing functionality increase entropy and thus unreliability (e.g. British (...)
  5. Causal versions of maximum entropy and principle of insufficient reason. Dominik Janzing - 2021 - Journal of Causal Inference 9 (1):285-301.
    The principle of insufficient reason (PIR) assigns equal probabilities to each alternative of a random experiment whenever there is no reason to prefer one over the other. The maximum entropy principle generalizes PIR to the case where statistical information like expectations is given. It is known that both principles result in paradoxical probability updates for joint distributions of cause and effect. This is because constraints on the conditional P(effect | cause) result in changes of P(cause) that assign higher probability to those (...)
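The paradoxical update is easy to reproduce numerically. Below is a sketch of my own construction, not the paper's: for a binary cause C and effect E, PIR alone would give P(C=1) = 1/2, but imposing only the conditional constraint P(E=1 | C=1) = 0.9 and maximizing the joint entropy shifts P(C=1) away from 1/2, because the C=1 branch now carries less conditional entropy.

```python
import numpy as np
from scipy.optimize import minimize

# Binary cause C and effect E; p = [p00, p01, p10, p11], p_ce = P(C=c, E=e).
def neg_entropy(p):
    p = np.clip(p, 1e-12, 1)
    return np.sum(p * np.log(p))

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1},
    # Constraint on the conditional only: P(E=1 | C=1) = 0.9.
    {"type": "eq", "fun": lambda p: p[3] - 0.9 * (p[2] + p[3])},
]
res = minimize(neg_entropy, x0=np.full(4, 0.25),
               bounds=[(0, 1)] * 4, constraints=constraints)
p = res.x
print("P(C=1) =", round(p[2] + p[3], 3))    # ≈ 0.41, no longer 1/2
```

The closed-form solution puts P(C=1) ≈ 0.409: learning something about how C produces E has changed the probability of C itself, which is the kind of update the abstract flags as paradoxical from a causal standpoint.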
  6. The Entropy-Limit (Conjecture) for Σ₂-Premisses. Jürgen Landes - 2020 - Studia Logica 109 (2):423-442.
    The application of the maximum entropy principle to determine probabilities on finite domains is well-understood. Its application to infinite domains still lacks a well-studied comprehensive approach. There are two different strategies for applying the maximum entropy principle on first-order predicate languages: applying it to finite sublanguages and taking a limit; comparing finite entropies of probability functions defined on the language as a whole. The entropy-limit conjecture roughly says that these two strategies result in the same probabilities. While the conjecture is (...)
  7. Towards the entropy-limit conjecture. Jürgen Landes, Soroush Rafiee Rad & Jon Williamson - 2020 - Annals of Pure and Applied Logic 172 (2):102870.
    The maximum entropy principle is widely used to determine non-committal probabilities on a finite domain, subject to a set of constraints, but its application to continuous domains is notoriously problematic. This paper concerns an intermediate case, where the domain is a first-order predicate language. Two strategies have been put forward for applying the maximum entropy principle on such a domain: applying it to finite sublanguages and taking the pointwise limit of the resulting probabilities as the size n of the sublanguage (...)
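A toy instance of the first strategy, with a premiss far simpler than those the conjecture actually concerns (a single quantifier-free constraint, assumed here purely for illustration): fix P(U(a1)) = 0.7 and compute the maximum entropy probability of U(a2) on sublanguages with n constants. The value is 1/2 at every n, so the limit exists trivially.

```python
import itertools
import numpy as np
from scipy.optimize import minimize

def maxent_p_Ua2(n, target=0.7):
    # State descriptions of the sublanguage with constants a1..an:
    # truth assignments to U(a1), ..., U(an).
    states = list(itertools.product([0, 1], repeat=n))
    k = len(states)
    def neg_entropy(p):
        p = np.clip(p, 1e-12, 1)
        return np.sum(p * np.log(p))
    cons = [
        {"type": "eq", "fun": lambda p: p.sum() - 1},
        # The premiss, imposed on every sublanguage: P(U(a1)) = 0.7.
        {"type": "eq",
         "fun": lambda p: sum(p[i] for i, s in enumerate(states) if s[0] == 1) - target},
    ]
    res = minimize(neg_entropy, x0=np.full(k, 1 / k),
                   bounds=[(0, 1)] * k, constraints=cons)
    return sum(res.x[i] for i, s in enumerate(states) if s[1] == 1)

for n in (2, 3, 4):
    print(n, round(maxent_p_Ua2(n), 3))   # 0.5 at every n, so the limit is 0.5
```

The interesting cases, and the substance of the conjecture, arise for genuinely quantified premisses, where the finite values need not stabilize so obviously.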
  8. Probabilistic stability, AGM revision operators and maximum entropy. Krzysztof Mierzewski - 2020 - Review of Symbolic Logic:1-38.
    Several authors have investigated the question of whether canonical logic-based accounts of belief revision, and especially the theory of AGM revision operators, are compatible with the dynamics of Bayesian conditioning. Here we show that Leitgeb's stability rule for acceptance, which has been offered as a possible solution to the Lottery paradox, allows one to bridge AGM revision and Bayesian update: using the stability rule, we prove that AGM revision operators emerge from Bayesian conditioning by an application of the principle of maximum (...)
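Leitgeb's stability rule is easy to state for a finite space: accept A iff P(A | B) > 1/2 for every B consistent with A that has positive probability. For finitely many worlds this reduces to requiring each positive-probability world in A to outweigh A's entire complement. A small sketch of that reduction, in my rendering rather than the paper's formalism:

```python
from itertools import combinations

def stable_sets(prob):
    """Return the P-stable sets of a finite probability mass function.
    A is stable iff every world in A with positive probability is more
    probable than A's complement taken together; this is equivalent to
    P(A | B) > 1/2 for all B consistent with A with P(B) > 0."""
    worlds = list(prob)
    out = []
    for r in range(1, len(worlds) + 1):
        for A in combinations(worlds, r):
            rest = sum(prob[w] for w in worlds if w not in A)
            if all(prob[w] > rest for w in A if prob[w] > 0):
                out.append(set(A))
    return out

print(stable_sets({"w1": 0.6, "w2": 0.3, "w3": 0.1}))
# [{'w1'}, {'w1', 'w2'}, {'w1', 'w2', 'w3'}] — a nested family of stable sets
```

The nesting of the stable sets is what lets acceptance behave like an AGM-style sphere system sitting on top of the probability function.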
  9. Entropy and Insufficient Reason: A Note on the Judy Benjamin Problem. Anubav Vasudevan - 2020 - British Journal for the Philosophy of Science 71 (3):1113-1141.
    One well-known objection to the principle of maximum entropy is the so-called Judy Benjamin problem, first introduced by van Fraassen. The problem turns on the apparently puzzling fact that, on the basis of information relating to an event’s conditional probability, the maximum entropy distribution will almost always assign to the event conditionalized upon a probability strictly less than that assigned to it by the uniform distribution. In this article, I present an analysis of the Judy Benjamin problem that can help to (...)
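The puzzling fact is easy to check numerically in the standard three-cell version of the problem (my sketch, with cells Blue, Red-HQ and Red-2nd and a uniform starting point): the message fixes P(Red-HQ | Red) = 3/4, and maximizing entropy under that constraint drives P(Red) below the 2/3 that the uniform distribution assigns it.

```python
import numpy as np
from scipy.optimize import minimize

# Cells: (Blue, Red-HQ, Red-2nd). Judy learns P(Red-HQ | Red) = 3/4.
def neg_entropy(p):
    p = np.clip(p, 1e-12, 1)
    return np.sum(p * np.log(p))

cons = [
    {"type": "eq", "fun": lambda p: p.sum() - 1},
    # The radio message, as a constraint on the conditional probability.
    {"type": "eq", "fun": lambda p: p[1] - 0.75 * (p[1] + p[2])},
]
res = minimize(neg_entropy, x0=np.array([1/3, 1/3, 1/3]),
               bounds=[(0, 1)] * 3, constraints=cons)
p_red = res.x[1] + res.x[2]
print(round(p_red, 3))    # ≈ 0.637 < 2/3: less than the uniform assignment
```

Nothing in the message seems to bear on whether Judy is in Red territory at all, yet the maximum entropy update lowers that probability; this is the intuition the paper sets out to analyse.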
  10. Bertrand's Paradox and the Maximum Entropy Principle. Nicholas Shackel & Darrell P. Rowbottom - 2019 - Philosophy and Phenomenological Research 101 (3):505-523.
    An important suggestion of objective Bayesians is that the maximum entropy principle can replace a principle which is known to get into paradoxical difficulties: the principle of indifference. No one has previously determined whether the maximum entropy principle is better able to solve Bertrand’s chord paradox than the principle of indifference. In this paper I show that it is not. Additionally, the course of the analysis brings to light a new paradox, a revenge paradox of the chords, that is unique (...)
  11. Predictive Statistical Mechanics and Macroscopic Time Evolution: Hydrodynamics and Entropy Production. Domagoj Kuić - 2016 - Foundations of Physics 46 (7):891-914.
    In the previous papers, it was demonstrated that applying the principle of maximum information entropy by maximizing the conditional information entropy, subject to the constraint given by the Liouville equation averaged over the phase space, leads to a definition of the rate of entropy change for closed Hamiltonian systems without any additional assumptions. Here, we generalize this basic model and, with the introduction of the additional constraints which are equivalent to the hydrodynamic continuity equations, show that the results obtained are (...)
  12. Justifying Objective Bayesianism on Predicate Languages. Jürgen Landes & Jon Williamson - 2015 - Entropy 17 (4):2459-2543.
    Objective Bayesianism says that the strengths of one’s beliefs ought to be probabilities, calibrated to physical probabilities insofar as one has evidence of them, and otherwise sufficiently equivocal. These norms of belief are often explicated using the maximum entropy principle. In this paper we investigate the extent to which one can provide a unified justification of the objective Bayesian norms in the case in which the background language is a first-order predicate language, with a view to applying the resulting formalism (...)
  13. Maximum Entropy Applied to Inductive Logic and Reasoning. Jürgen Landes & Jon Williamson (eds.) - 2015 - Ludwig-Maximilians-Universität München.
    This editorial explains the scope of the special issue and provides a thematic introduction to the contributed papers.
  14. Maximum Entropy and Probability Kinematics Constrained by Conditionals. Stefan Lukits - 2015 - Entropy 17 (4):1690-1700.
    Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (pme) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey’s updating principle (jup) contradicts pme? Majerník shows that pme provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether pme also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates that in (...)
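For reference, Jeffrey's updating principle itself is a one-liner: having learned new probabilities q_i for the cells E_i of a partition, set P'(A) = Σ_i P(A | E_i) q_i. A generic sketch follows (my construction, not the paper's); it is a standard result that minimizing relative entropy subject to exactly these new cell probabilities reproduces the same update.

```python
def jeffrey_update(prior, partition, new_marginals):
    """Jeffrey's rule: rescale each partition cell to its newly learned
    probability while preserving probabilities conditional on the cell."""
    posterior = {}
    for world, p in prior.items():
        i = partition[world]    # which cell this world belongs to
        cell_mass = sum(q for w, q in prior.items() if partition[w] == i)
        posterior[world] = p / cell_mass * new_marginals[i]
    return posterior

# Four worlds, two cells E0 = {a, b} and E1 = {c, d}; we learn P(E1) = 0.7.
prior = {"a": 0.4, "b": 0.2, "c": 0.3, "d": 0.1}
partition = {"a": 0, "b": 0, "c": 1, "d": 1}
print(jeffrey_update(prior, partition, {0: 0.3, 1: 0.7}))
# {'a': 0.2, 'b': 0.1, 'c': 0.525, 'd': 0.175}
```

The disputed cases, including Wagner's, involve evidence that arrives as conditionals rather than as cell probabilities, which is where the agreement between the two principles comes into question.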
  15. The principle of maximum entropy and a problem in probability kinematics. Stefan Lukits - 2014 - Synthese 191 (7):1-23.
    Sometimes we receive evidence in a form that standard conditioning (or Jeffrey conditioning) cannot accommodate. The principle of maximum entropy (maxent) provides a unique solution for the posterior probability distribution based on the intuition that the information gain consistent with assumptions and evidence should be minimal. Opponents of objective methods to determine these probabilities prominently cite van Fraassen’s Judy Benjamin case to undermine the generality of maxent. This article shows that an intuitive approach to Judy Benjamin’s case supports maxent. This (...)
  16. What You See Is What You Get. Jeff B. Paris - 2014 - Entropy 16 (11):6186-6194.
    This paper corrects three widely held misunderstandings about Maxent when used in common sense reasoning: that it is language dependent; that it produces objective facts; that it subsumes, and so is at least as untenable as, the paradox-ridden Principle of Insufficient Reason.
  17. Objective Bayesianism and the maximum entropy principle. Jürgen Landes & Jon Williamson - 2013 - Entropy 15 (9):3528-3591.
    Objective Bayesian epistemology invokes three norms: the strengths of our beliefs should be probabilities, they should be calibrated to our evidence of physical probabilities, and they should otherwise equivocate sufficiently between the basic propositions that we can express. The three norms are sometimes explicated by appealing to the maximum entropy principle, which says that a belief function should be a probability function, from all those that are calibrated to evidence, that has maximum entropy. However, the three norms of objective Bayesianism (...)
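A one-variable toy shows how the three norms interact (my example, not the paper's): the Probability norm makes belief in an atomic proposition a single number b, Calibration restricts b to the evidence interval, and Equivocation picks the b in that interval with maximal entropy, i.e. the point nearest 1/2.

```python
import numpy as np

# Evidence: the physical probability of proposition a lies in [0.6, 0.8].
# Equivocation selects the calibrated belief with maximal entropy H(b).
H = lambda b: -(b * np.log(b) + (1 - b) * np.log(1 - b))
bs = np.linspace(0.6, 0.8, 2001)
print(round(bs[np.argmax(H(bs))], 3))   # 0.6 — the endpoint nearest 1/2
```

Since H is maximized at 1/2 and decreases away from it, the maximum entropy choice over an interval of calibrated options is always the endpoint closest to the equivocator.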
  18. How to exploit parametric uniformity for maximum entropy reasoning in a relational probabilistic logic. Marc Finthammer & Christoph Beierle - 2012 - In Luis Farinas del Cerro, Andreas Herzig & Jerome Mengin (eds.), Logics in Artificial Intelligence. Springer. pp. 189-201.
  19. First-order probabilistic conditional logic and maximum entropy. J. Fisseler - 2012 - Logic Journal of the IGPL 20 (5):796-830.
  20. Symmetry, Invariance and Ontology in Physics and Statistics. Julio Michael Stern - 2011 - Symmetry 3 (3):611-635.
    This paper has three main objectives: (a) Discuss the formal analogy between some important symmetry-invariance arguments used in physics, probability and statistics. Specifically, we will focus on Noether’s theorem in physics, the maximum entropy principle in probability theory, and de Finetti-type theorems in Bayesian statistics; (b) Discuss the epistemological and ontological implications of these theorems, as they are interpreted in physics and statistics. Specifically, we will focus on the positivist (in physics) or subjective (in statistics) interpretations vs. objective interpretations that (...)
  21. Maximum power and maximum entropy production: finalities in nature. Stanley Salthe - 2010 - Cosmos and History 6 (1):114-121.
    I begin with the definition of power, and find that it is finalistic inasmuch as work directs energy dissipation in the interests of some system. The maximum power principle of Lotka and Odum implies an optimal energy efficiency for any work; optima are also finalities. I advance a statement of the maximum entropy production principle, suggesting that most work of dissipative structures is carried out at rates entailing energy flows faster than those that would be associated with maximum power. This is (...)
  22. Maximum Entropy Inference with Quantified Knowledge. Owen Barnett & Jeff Paris - 2008 - Logic Journal of the IGPL 16 (1):85-98.
    We investigate uncertain reasoning with quantified sentences of the predicate calculus treated as the limiting case of maximum entropy inference applied to finite domains.
  23. Explaining default intuitions using maximum entropy. Rachel A. Bourne - 2003 - Journal of Applied Logic 1 (3-4):255-271.
  24. Maximum Shannon Entropy, Minimum Fisher Information, and an Elementary Game. Shunlong Luo - 2002 - Foundations of Physics 32 (11):1757-1772.
    We formulate an elementary statistical game which captures the essence of some fundamental quantum experiments such as photon polarization and spin measurement. We explore and compare the significance of the principle of maximum Shannon entropy and the principle of minimum Fisher information in solving such a game. The solution based on the principle of minimum Fisher information coincides with the solution based on an invariance principle, and provides an informational explanation of Malus' law for photon polarization. There is no solution (...)
  25. Entropia a modelovanie [Entropy and Modelling]. Ján Paulov - 2002 - Organon F: Medzinárodný Časopis Pre Analytickú Filozofiu 9 (2):157-175.
    It is well known that mathematical modelling in the social sciences, particularly when concepts originally rooted in the natural sciences are used, is, from a methodological point of view, a touchy subject, since the problem of reductionism can appear in this context. This paper addresses such a subject, for its main objective is to discuss how the entropy concept, originally a physical one, can generally be used in modelling, especially in the domain of the social sciences. The way this topic is approached in this (...)
  26. Common sense and maximum entropy. Jeff Paris - 1998 - Synthese 117 (1):75-93.
    This paper concerns the question of how to draw inferences common sensically from uncertain knowledge. Since the early work of Shore and Johnson (1980), Paris and Vencovská (1990), and Csiszár (1989), it has been known that the Maximum Entropy Inference Process is the only inference process which obeys certain common sense principles of uncertain reasoning. In this paper we consider the present status of this result and argue that within the rather narrow context in which we work this complete and (...)
  27. In defense of the maximum entropy inference process. J. Paris & A. Vencovská - 1997 - International Journal of Approximate Reasoning 17 (1):77-103.
    This paper is a sequel to an earlier result of the authors that in making inferences from certain probabilistic knowledge bases the maximum entropy inference process, ME, is the only inference process respecting “common sense.” This result was criticized on the grounds that the probabilistic knowledge bases considered are unnatural and that ignorance of dependence should not be identified with statistical independence. We argue against these criticisms and also against the more general criticism that ME is representation dependent. In a (...)
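The representation-dependence charge the authors answer is usually driven by a refinement example of the following shape (my illustration, not an example taken from the paper):

```latex
% Coarse description, states $\{B, \neg B\}$, no constraints:
% maximum entropy is uniform over the states considered, so
P_{\mathrm{ME}}(B) = \tfrac{1}{2}.
% Refined description, $B \equiv B_1 \vee B_2$ with $B_1, B_2$ exclusive,
% states $\{B_1, B_2, \neg B\}$:
P_{\mathrm{ME}}(B) = P_{\mathrm{ME}}(B_1) + P_{\mathrm{ME}}(B_2)
                   = \tfrac{1}{3} + \tfrac{1}{3} = \tfrac{2}{3}.
```

The same event receives different probabilities under the two descriptions; the paper argues that this, properly understood, is not a defect of ME.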
  28. The constraint rule of the maximum entropy principle. Jos Uffink - 1996 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 27 (1):47-79.
    The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In usual formulations of this and related methods of inference one assumes that this partial information takes the form of a constraint on allowed probability distributions. In practical applications, however, the information consists of empirical data. A constraint rule is then employed to construct constraints on probability distributions out of these data. Usually one adopts the rule that equates the expectation (...)
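The constraint rule in action, on Jaynes's dice example (the numbers are standard; the code is my sketch): a reported average of 4.5 over many rolls is converted into the constraint E[i] = 4.5, and the maximum entropy distribution is then the exponential-family member p_i ∝ exp(−λi) with λ chosen to satisfy the constraint.

```python
import numpy as np
from scipy.optimize import brentq

# Brandeis dice: the reported sample average of many rolls is 4.5; the
# constraint rule turns this into the expectation constraint E[i] = 4.5.
faces = np.arange(1, 7)

def mean_for(lam):
    w = np.exp(-lam * faces)
    return (faces * w).sum() / w.sum()

lam = brentq(lambda l: mean_for(l) - 4.5, -5, 5)   # solve E[i] = 4.5 for lambda
p = np.exp(-lam * faces)
p /= p.sum()
print(np.round(p, 4))
# ≈ [0.0544 0.0788 0.1142 0.1654 0.2398 0.3475], tilted toward high faces
```

Uffink's question is precisely whether this identification of a sample average with an expectation constraint is justified, so the sketch shows the rule, not its defence.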
  29. Can the maximum entropy principle be explained as a consistency requirement? Jos Uffink - 1995 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 26 (3):223-261.
    The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, forms an extension of the classical principle of insufficient reason. It has been further generalized, both in mathematical formulation and in intended scope, into the principle of maximum relative entropy or of minimum information. It has been claimed that these principles are singled out as unique methods of statistical inference that agree with (...)
  30. The W systems: between maximum entropy and minimal ranking…. Michael Freund - 1994 - Journal of Applied Non-Classical Logics 4 (1):79-90.
  31. Application of the maximum entropy principle to nonlinear systems far from equilibrium. H. Haken - 1993 - In E. T. Jaynes, Walter T. Grandy & Peter W. Milonni (eds.), Physics and Probability: Essays in Honor of Edwin T. Jaynes. Cambridge University Press. p. 239.
  32. A fuzzy neuron based upon maximum entropy ordered weighted averaging. Michael O'Hagan - 1991 - In B. Bouchon-Meunier, R. R. Yager & L. A. Zadeh (eds.), Uncertainty in Knowledge Bases. Springer. pp. 598-609.
  33. Entropy and uncertainty. Teddy Seidenfeld - 1986 - Philosophy of Science 53 (4):467-491.
    This essay is, primarily, a discussion of four results about the principle of maximizing entropy (MAXENT) and its connections with Bayesian theory. Result 1 provides a restricted equivalence between the two: where the Bayesian model for MAXENT inference uses an "a priori" probability that is uniform, and where all MAXENT constraints are limited to 0-1 expectations for simple indicator-variables. The other three results report on an inability to extend the equivalence beyond these specialized constraints. Result 2 established a sensitivity of (...)
  34. The status of the principle of maximum entropy. Abner Shimony - 1985 - Synthese 63 (1):35-53.
  35. Maximum entropy inference as a special case of conditionalization. Brian Skyrms - 1985 - Synthese 63 (1):55-74.
  36. A problem for relative information minimizers in probability kinematics. Bas C. van Fraassen - 1981 - British Journal for the Philosophy of Science 32 (4):375-379.
  37. Probability kinematics. Zoltan Domotor, Mario Zanotti & Henson Graves - 1980 - Synthese 44 (3):421-442.
    Probability kinematics is studied in detail within the framework of elementary probability theory. The merits and demerits of Jeffrey's and Field's models are discussed. In particular, the principle of maximum relative entropy and other principles are used in an epistemic justification of generalized conditionals. A representation of conditionals in terms of Bayesian conditionals is worked out in the framework of external kinematics.
  38. Axiomatic Derivation of the Principle of Maximum Entropy and the Principle of Minimum Cross-Entropy. J. E. Shore & R. W. Johnson - 1980 - IEEE Transactions on Information Theory 26 (1):26-37.
  39. Analysis of the maximum entropy principle “debate”. John F. Cyranski - 1978 - Foundations of Physics 8 (5-6):493-506.
    Jaynes's maximum entropy principle (MEP) is analyzed by considering in detail a recent controversy. Emphasis is placed on the inductive logical interpretation of “probability” and the concept of “total knowledge.” The relation of the MEP to relative frequencies is discussed, and a possible realm of its fruitful application is noted.
  40. The Well-Posed Problem. Edwin T. Jaynes - 1973 - Foundations of Physics 3 (4):477-493.
    Many statistical problems, including some of the most important for physical applications, have long been regarded as underdetermined from the standpoint of a strict frequency definition of probability; yet they may appear well-posed or even overdetermined by the principles of maximum entropy and transformation groups. Furthermore, the distributions found by these methods turn out to have a definite frequency correspondence; the distribution obtained by invariance under a transformation group is by far the most likely to be observed experimentally, in the (...)
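Jaynes's resolution of Bertrand's problem picks out the answer 1/2 by requiring invariance under translation, rotation and scale, and he argued that this answer also matches physical experiment. A sketch of one such experiment, under my own modelling assumptions (straws tossed with uniform direction and a uniform offset over a region much larger than the circle):

```python
import random, math

# A "straw" is a long line tossed at a unit circle: uniform direction and a
# uniform signed perpendicular offset over a strip much wider than the circle.
# By rotational symmetry the direction does not affect the chord length,
# so only the offset needs to be sampled.
def straw_chord():
    while True:
        offset = random.uniform(-10, 10)   # signed distance from the centre
        if abs(offset) < 1:                # keep straws that hit the circle
            return 2 * math.sqrt(1 - offset * offset)

N = 100_000
freq = sum(straw_chord() > math.sqrt(3) for _ in range(N)) / N
print(freq)   # ≈ 0.5, the answer picked out by the invariance argument
```

Conditioning a uniform ensemble of lines in the plane on hitting the circle makes the chord's distance from the centre uniform, which is exactly the measure behind the 1/2 answer.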
  41. Entropy and the Unity of Knowledge. [REVIEW] J. H. B. - 1962 - Review of Metaphysics 15 (4):676-677.
    In this inaugural address, a professor of applied mathematics develops the theme that new concepts such as "entropy" introduced in the mathematical description of nature have an influence far beyond the mathematical sciences, extending to such diverse fields as biology, the social sciences, religion, philosophy, literary analysis, etc.--B. J. H.
  42. Objective Bayesian nets. Jon Williamson - manuscript
    I present a formalism that combines two methodologies: objective Bayesianism and Bayesian nets. According to objective Bayesianism, an agent’s degrees of belief (i) ought to satisfy the axioms of probability, (ii) ought to satisfy constraints imposed by background knowledge, and (iii) should otherwise be as non-committal as possible (i.e. have maximum entropy). Bayesian nets offer an efficient way of representing and updating probability functions. An objective Bayesian net is a Bayesian net representation of the maximum entropy probability function.
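A toy of the construction as the abstract describes it (my example, not Williamson's): when the background knowledge constrains only A and B, the maximum entropy joint over A, B, C leaves C uniform and independent, so a Bayesian net representing it needs only the edge A → B, with C isolated.

```python
import numpy as np
from itertools import product
from scipy.optimize import minimize

# Knowledge constrains only A and B, so the maximum entropy function should
# factorize with C uniform and independent: the sparse net  A -> B,  C.
states = list(product([0, 1], repeat=3))            # (a, b, c)

def neg_entropy(p):
    p = np.clip(p, 1e-12, 1)
    return np.sum(p * np.log(p))

def mass(p, pred):
    # Total probability of the states satisfying the predicate.
    return sum(pi for pi, s in zip(p, states) if pred(*s))

cons = [
    {"type": "eq", "fun": lambda p: p.sum() - 1},
    {"type": "eq", "fun": lambda p: mass(p, lambda a, b, c: a == 1) - 0.7},
    # P(B=1 | A=1) = 0.9, written multiplicatively:
    {"type": "eq", "fun": lambda p: mass(p, lambda a, b, c: a == 1 and b == 1)
                                    - 0.9 * mass(p, lambda a, b, c: a == 1)},
]
res = minimize(neg_entropy, x0=np.full(8, 1 / 8), bounds=[(0, 1)] * 8,
               constraints=cons)
p = res.x
print("P(C=1)           =", round(mass(p, lambda a, b, c: c == 1), 3))  # 0.5
print("P(C=1 | A=1,B=1) =", round(
    mass(p, lambda a, b, c: c == 1 and a == 1 and b == 1)
    / mass(p, lambda a, b, c: a == 1 and b == 1), 3))                   # 0.5
```

The independencies that maximum entropy induces are what let the net representation stay sparse, which is the efficiency gain the abstract points to.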