Probabilistic knowledge representation and reasoning at maximum entropy by SPIRIT

  • Carl-Heinz Meyer
  • Wilhelm Rödder
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1137)


Current probabilistic expert systems assume complete knowledge of the joint distribution. To specify this distribution, one constructs a directed acyclic graph and attaches numerous tables of conditional probabilities. Often these probabilities are unknown, and their quantification is more or less arbitrary. SPIRIT is an expert system shell for probabilistic knowledge bases that uses the principle of maximum entropy to avoid these shortcomings. Knowledge acquisition is performed by specifying probabilistic facts and rules on discrete variables in an extended propositional logic syntax. The shell generates the unique probability distribution which respects all facts and rules and maximizes entropy. Once this distribution has been created, the shell is ready to answer simple and complex queries. The processes of knowledge acquisition, knowledge processing, and query answering are illustrated in detail on a nontrivial example.
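To make the maximum-entropy step concrete, the following is a minimal sketch (not from the paper, and not SPIRIT's own solver): two binary variables A and B, a hypothetical fact P(A) = 0.3 and rule P(B|A) = 0.8, and a generic numeric optimizer searching the probability simplex for the joint distribution of maximal entropy that satisfies both. A conditional rule P(B|A) = x is linearized as P(A and B) = x * P(A). All numbers and names here are illustrative assumptions.

```python
# Minimal maximum-entropy sketch (illustrative; not SPIRIT's algorithm).
import numpy as np
from scipy.optimize import minimize

# The four possible worlds over A and B, in the order
# (A=1,B=1), (A=1,B=0), (A=0,B=1), (A=0,B=0).
A = np.array([1, 1, 0, 0])
B = np.array([1, 0, 1, 0])

# Hypothetical knowledge base: fact P(A) = 0.3, rule P(B|A) = 0.8.
# The rule P(B|A) = x becomes the linear constraint P(A and B) - x*P(A) = 0.
constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},          # normalization
    {"type": "eq", "fun": lambda p: p[A == 1].sum() - 0.3},  # P(A) = 0.3
    {"type": "eq", "fun": lambda p: p[(A == 1) & (B == 1)].sum()
                                    - 0.8 * p[A == 1].sum()},  # P(B|A) = 0.8
]

def neg_entropy(p):
    """Negative Shannon entropy; minimizing it maximizes entropy."""
    p = np.clip(p, 1e-12, 1.0)  # guard the log at the simplex boundary
    return np.sum(p * np.log(p))

# Start from the uniform distribution and search on the probability simplex.
result = minimize(neg_entropy, x0=np.full(4, 0.25),
                  bounds=[(0.0, 1.0)] * 4, constraints=constraints,
                  method="SLSQP")
p = result.x
print("max-entropy joint:", np.round(p, 4))
# Answering a query is marginalization, e.g. P(B):
print("P(B) =", round(p[B == 1].sum(), 4))
```

Since the knowledge base says nothing about B when A is false, the maximum-entropy solution leaves B unbiased there, P(B | not A) = 0.5, which is exactly the "no unwarranted assumptions" behavior the principle is meant to deliver. A real shell cannot enumerate all possible worlds as done here and must work on a decomposed representation of the distribution.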


Keywords: Expert System · Uncertain Reasoning · Maximum Entropy





Copyright information

© Springer-Verlag Berlin Heidelberg 1996

Authors and Affiliations

  • Carl-Heinz Meyer, FernUniversität Hagen, Germany
  • Wilhelm Rödder, FernUniversität Hagen, Germany
