Probabilistic Logic Programming under Maximum Entropy

  • Thomas Lukasiewicz
  • Gabriele Kern-Isberner
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1638)

Abstract

In this paper, we focus on the combination of probabilistic logic programming with the principle of maximum entropy. We start by defining probabilistic queries to probabilistic logic programs and their answer substitutions under maximum entropy. We then present an efficient linear programming characterization of the problem of deciding whether a probabilistic logic program is satisfiable. Finally, as a central contribution of this paper, we introduce an efficient technique for approximate probabilistic logic programming under maximum entropy. This technique reduces the original entropy maximization task to solving a modified and relatively small optimization problem.
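To make the idea concrete: among all probability distributions over possible worlds that satisfy a probabilistic logic program, the principle of maximum entropy selects the least biased one, and queries are answered under that distribution. The sketch below is illustrative only, not the paper's algorithm: it assumes a hypothetical two-atom program with the constraints (bird)[0.8] and (fly | bird)[0.9], and approximates the maximum-entropy model by iterative scaling over the four possible worlds.

```python
from itertools import product

# Possible worlds over two atoms: truth assignments (bird, fly).
worlds = list(product([False, True], repeat=2))
p = {w: 0.25 for w in worlds}  # start from the uniform distribution

def prob(event):
    """P(event) under the current distribution."""
    return sum(p[w] for w in worlds if event(w))

def scale(head, body, target, eps=1e-12):
    """One iterative-scaling step enforcing P(head | body) = target.
    Mass is only redistributed between the head-and-body and the
    not-head-and-body worlds, so P(body) itself is left unchanged."""
    pb = prob(body)
    cur = prob(lambda w: head(w) and body(w)) / pb
    for w in worlds:
        if body(w):
            if head(w):
                p[w] *= target / max(cur, eps)
            else:
                p[w] *= (1 - target) / max(1 - cur, eps)

bird = lambda w: w[0]
fly = lambda w: w[1]
true = lambda w: True

# Hypothetical program: (bird)[0.8] and (fly | bird)[0.9].
constraints = [(bird, true, 0.8), (fly, bird, 0.9)]
for _ in range(50):  # repeated sweeps over the constraints
    for head, body, target in constraints:
        scale(head, body, target)

print(round(prob(fly), 4))  # maximum-entropy answer to the query (fly)
```

On this toy program the procedure converges to the maximum-entropy model, which distributes the remaining probability mass uniformly over the worlds that the constraints leave unconstrained; real probabilistic logic programs induce far larger world spaces, which is exactly why the paper's reduction to a small optimization problem matters.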



Copyright information

© Springer-Verlag Berlin Heidelberg 1999

Authors and Affiliations

  • Thomas Lukasiewicz (1)
  • Gabriele Kern-Isberner (2)

  1. Institut für Informatik, Universität Gießen, Gießen, Germany
  2. Fachbereich Informatik, FernUniversität Hagen, Hagen, Germany
