Upper and lower entropies of belief functions using compatible probability functions

  • C. W. R. Chau
  • P. Lingras
  • S. K. M. Wong
Approximate Reasoning
Part of the Lecture Notes in Computer Science book series (LNCS, volume 689)

Abstract

This paper uses compatible probability functions to define the notions of upper entropy and lower entropy of a belief function as generalizations of the Shannon entropy. The upper entropy measures the amount of information conveyed by the evidence currently available. The lower entropy measures the maximum possible amount of information that could be obtained if further evidence became available. The paper also analyzes the characteristics of these entropies and their computational aspects. The study demonstrates the usefulness of compatible probability functions in applying notions from probability theory to the theory of belief functions.
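To make the construction concrete, the sketch below computes both quantities for a small example under one natural reading of the definitions: a probability function p on the frame of discernment is compatible with a belief function Bel when P(A) ≥ Bel(A) for every subset A of the frame, and the upper and lower entropies are the maximum and minimum Shannon entropies attained over this compatible set. The frame, the mass function, and the use of SciPy's SLSQP solver are illustrative assumptions, not the authors' method.

```python
# Sketch: upper and lower entropies of a belief function via its
# compatible probability functions. Hypothetical example data; the
# optimization setup is an assumption, not the paper's algorithm.
from itertools import combinations

import numpy as np
from scipy.optimize import minimize

theta = ['a', 'b', 'c']                      # frame of discernment (hypothetical)
m = {frozenset({'a'}): 0.3,                  # basic probability assignment
     frozenset({'b', 'c'}): 0.5,
     frozenset(theta): 0.2}

def bel(A):
    """Belief of A: total mass of focal elements contained in A."""
    return sum(v for B, v in m.items() if B <= A)

def shannon(p):
    """Shannon entropy in bits, guarding against log(0)."""
    p = np.clip(p, 1e-12, 1.0)
    return -float(np.sum(p * np.log2(p)))

# p is compatible with Bel iff p >= 0, sum(p) = 1, and P(A) >= Bel(A)
# for every nonempty subset A of the frame.
subsets = [frozenset(s)
           for r in range(1, len(theta) + 1)
           for s in combinations(theta, r)]
cons = [{'type': 'eq', 'fun': lambda p: p.sum() - 1.0}]
for A in subsets:
    idx = [i for i, x in enumerate(theta) if x in A]
    cons.append({'type': 'ineq',
                 'fun': lambda p, idx=idx, b=bel(A): p[idx].sum() - b})

bounds = [(0.0, 1.0)] * len(theta)
p0 = np.full(len(theta), 1.0 / len(theta))   # uniform starting point

# Upper entropy: maximize H(p); entropy is concave, so this is reliable.
upper = minimize(lambda p: -shannon(p), p0, method='SLSQP',
                 bounds=bounds, constraints=cons)
# Lower entropy: minimize H(p); the minimum sits at a vertex of the
# compatible set, so a local solver gives only an approximation here.
lower = minimize(shannon, p0, method='SLSQP',
                 bounds=bounds, constraints=cons)

print('upper entropy ~', round(shannon(upper.x), 4))
print('lower entropy ~', round(shannon(lower.x), 4))
```

For this mass function the compatible set is a small polytope: maximizing entropy over it is a well-behaved concave problem, while minimizing entropy is not, which is one face of the computational aspects the paper analyzes.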

Copyright information

© Springer-Verlag 1993

Authors and Affiliations

  • C. W. R. Chau (1)
  • P. Lingras (1)
  • S. K. M. Wong (1)

  1. Department of Computer Science, University of Regina, Regina, Canada
