
KI - Künstliche Intelligenz, Volume 33, Issue 1, pp 97–100

Concepts and Algorithms for Computing Maximum Entropy Distributions for Knowledge Bases with Relational Probabilistic Conditionals

  • Marc Finthammer
Dissertation and Habilitation Abstracts

Introduction

Many practical problems are concerned with incomplete and uncertain knowledge about domains where relations among different objects play an important role. Probabilistic conditionals provide an adequate way to express such uncertain, rule-like knowledge of the form “If A holds, then B holds with probability p”, where A and B may be not just propositional but relational formulas. For example, consider the following setting, which takes place in the movie business: An actor can win certain awards, e.g. the Oscar, the Palme d’Or, or the Golden Bear. Depending on that, some director might consider engaging that actor with a probability of 0.3. This scenario can be modeled by the probabilistic conditional \(r\!: \left( {engage}(X, Z) \,|\, {awarded}(X, Y)\right) [0.3 ]\).
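To illustrate the maximum entropy principle behind such conditionals, the following minimal sketch computes the maximum entropy distribution for a single propositional conditional \((e \,|\, a)[p]\) over the two atoms awarded (a) and engage (e). This is only a toy propositional instance, not the relational algorithms developed in the dissertation; the feature function used (value 1-p on verifying worlds, -p on falsifying worlds, 0 on non-applicable worlds) is the standard one for conditionals under maximum entropy semantics, and all names are illustrative.

```python
import math
from itertools import product

def max_ent_conditional(p, iters=200):
    """Maximum entropy distribution over the 4 worlds of atoms
    'awarded' (a) and 'engage' (e), subject to P(e | a) = p."""
    worlds = list(product([0, 1], repeat=2))  # pairs (a, e)

    # Feature function of the conditional (e | a)[p]:
    # 1-p on verifying worlds, -p on falsifying worlds, 0 otherwise.
    def sigma(a, e):
        if a and e:
            return 1.0 - p
        if a and not e:
            return -p
        return 0.0

    # The ME solution has the exponential form P(w) ~ exp(lam * sigma(w));
    # the constraint P(e | a) = p is equivalent to E_P[sigma] = 0.
    def moment(lam):
        ws = [math.exp(lam * sigma(a, e)) for a, e in worlds]
        z = sum(ws)
        return sum(w * sigma(a, e) for w, (a, e) in zip(ws, worlds)) / z

    # E_P[sigma] is increasing in lam, so solve moment(lam) = 0 by bisection.
    lo, hi = -50.0, 50.0
    for _ in range(iters):
        mid = (lo + hi) / 2
        if moment(mid) > 0:
            hi = mid
        else:
            lo = mid
    lam = (lo + hi) / 2
    ws = [math.exp(lam * sigma(a, e)) for a, e in worlds]
    z = sum(ws)
    return {w: wt / z for w, wt in zip(worlds, ws)}

P = max_ent_conditional(0.3)
p_cond = P[(1, 1)] / (P[(1, 1)] + P[(1, 0)])  # recovers P(e | a) = 0.3
```

Among all distributions satisfying the constraint, this one is maximally non-committal: worlds not affected by the conditional keep equal probability, which is exactly the inductive behavior the maximum entropy principle is chosen for.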

Keywords

Relational probabilistic knowledge bases · Uncertain knowledge · Probabilistic reasoning · Maximum entropy principle


Copyright information

© Gesellschaft für Informatik e.V. and Springer-Verlag GmbH Germany, part of Springer Nature 2018

Authors and Affiliations

  1. Faculty of Mathematics and Computer Science, FernUniversität in Hagen, Hagen, Germany
