Jürgen Landes; Jon Williamson (2015)
Publisher: MDPI AG
Journal: Entropy
Languages: English
Types: Article
Subjects: objective Bayesianism, predicate language, g-entropy, scoring rule, minimax; Astrophysics, Physics, Science; B1, BC, Q, QA273, QB460-466, QC1-999
Objective Bayesianism says that the strengths of one’s beliefs ought to be probabilities, calibrated to physical probabilities insofar as one has evidence of them, and otherwise sufficiently equivocal. These norms of belief are often explicated using the maximum entropy principle. In this paper we investigate the extent to which one can provide a unified justification of the objective Bayesian norms in the case in which the background language is a first-order predicate language, with a view to applying the resulting formalism to inductive logic. We show that the maximum entropy principle can be motivated largely in terms of minimising worst-case expected loss.
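The minimax motivation can be illustrated with a small numeric check (a sketch of the general idea only, not the paper's formalism for predicate languages): under logarithmic loss, the worst-case expected loss of a belief function b over a finite partition is max_i(-log b_i), since the supremum over chance distributions is attained at a point mass. A grid search over the probability simplex then confirms that the equivocal (uniform) distribution, which also maximises entropy, minimises this worst case.

```python
import math

def worst_case_log_loss(b):
    # For log loss, the sup over chance distributions P of E_P[-log b]
    # is attained at a point mass, so it equals max_i(-log b_i).
    return max(-math.log(p) for p in b)

# Grid search over belief functions on a 3-element partition (step 1/30).
n = 30
best_b, best_loss = None, float("inf")
for i in range(1, n):
    for j in range(1, n - i):
        k = n - i - j
        b = (i / n, j / n, k / n)
        loss = worst_case_log_loss(b)
        if loss < best_loss:
            best_b, best_loss = b, loss

print(best_b)     # the uniform distribution (1/3, 1/3, 1/3)
print(best_loss)  # log 3, the entropy value of the uniform distribution
```

The minimiser is unique on this grid: worst-case log loss is -log(min_i b_i), which is smallest when the least probable cell is as probable as possible, i.e. at the equivocal distribution.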
    • 1. Williamson, J. In defence of objective Bayesianism; Oxford University Press: Oxford, UK, 2010.
    • 2. There are several alternatives to the objective Bayesian account of strength of belief, including subjective Bayesianism, imprecise probability, the theory of Dempster-Shafer belief functions and related theories. Here we only have the space to motivate objective Bayesianism, not to assess these other views.
    • 3. Taking the convex hull may mean that a calibrated belief function does not satisfy the known constraints on physical probability. For example, if a proposition is known to be a statement about the past, then it is known that its physical probability is 0 or 1; bel is not constrained to be 0 or 1, however, unless it is also known whether or not the proposition is true. Similarly, it may be known that two propositions are probabilistically independent with respect to physical probability; this need not imply that they are probabilistically independent with respect to epistemic probability. See Williamson [1] (pp. 44-45) for further discussion of this point.
    • 4. Landes, J.; Williamson, J. Objective Bayesianism and the Maximum Entropy Principle. Entropy 2013, 15, 3528-3591.
    • 5. Gaifman, H. Concerning Measures in First Order Calculi. Isr. J. Math. 1964, 2, 1-18.
    • 6. Williamson, J. Lectures on Inductive Logic; Oxford University Press: Oxford, UK, 2015.
    • 7. Paris, J.B. The Uncertain Reasoner's Companion; Cambridge University Press: Cambridge, UK, 1994.
    • 8. Williamson, J. Probability logic. In Handbook of the Logic of Argument and Inference: the Turn toward the Practical; Gabbay, D., Johnson, R., Ohlbach, H.J., Woods, J., Eds.; Elsevier: Amsterdam, The Netherlands, 2002; pp. 397-424.
    • 9. Haenni, R.; Romeijn, J.W.; Wheeler, G.; Williamson, J. Probabilistic Logics and Probabilistic Networks; Synthese Library, Springer: Dordrecht, The Netherlands, 2011.
    • 10. Cover, T.M.; Thomas, J.A. Elements of Information Theory; John Wiley and Sons: New York, NY, USA, 1991.
    • 11. Rudin, W. Principles of Mathematical Analysis, 3rd ed.; McGraw-Hill: New York, NY, USA, 1973.
    • 12. de Finetti, B. Theory of Probability; Wiley: London, UK, 1974.
    • 13. Joyce, J.M. A Nonpragmatic Vindication of Probabilism. Philos. Sci. 1998, 65, 575-603.
    • 14. Joyce, J.M. Accuracy and Coherence: Prospects for an Alethic Epistemology of Partial Belief. In Degrees of Belief; Huber, F., Schmidt-Petri, C., Eds.; Synthese Library 342; Springer: New York, NY, USA, 2009.
    • 15. Grünwald, P.; Dawid, A.P. Game Theory, Maximum Entropy, Minimum Discrepancy, and Robust Bayesian Decision Theory. Ann. Stat. 2004, 32, 1367-1433.
    • 16. Savage, L.J. Elicitation of Personal Probabilities and Expectations. J. Am. Stat. Assoc. 1971, 66, 783-801.
    • 17. Popper, K.R. The Logic of Scientific Discovery; Routledge: London, UK, 1999.
    • 18. Barnett, O.; Paris, J.B. Maximum Entropy Inference with Quantified Knowledge. Logic J. IGPL 2008, 16, 85-98.
    • 19. Williamson, J. Objective Bayesian Probabilistic Logic. J. Algorithm. 2008, 63, 167-183.
