
Generalized Information Theory Based on the Theory of Hints

  • Marc Pouly
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6717)

Abstract

The aggregate uncertainty is the only known functional for Dempster-Shafer theory that generalizes the Shannon and Hartley measures and satisfies all classical requirements for uncertainty measures, including subadditivity. Although posed several times in the literature, it remains an open problem whether the aggregate uncertainty is unique under these properties. This paper derives an uncertainty measure based on the theory of hints and shows its equivalence to the pignistic entropy. The pignistic entropy does not satisfy subadditivity, but the viewpoint of hints uncovers a weaker version of subadditivity. On the other hand, the pignistic entropy has some crucial advantages over the aggregate uncertainty, namely explicitness of the formula and sensitivity to changes in evidence. We observe that neither of the two measures captures the full uncertainty of hints and propose an extension of the pignistic entropy, called hints entropy, that satisfies all axiomatic requirements, including subadditivity, while preserving the above advantages over the aggregate uncertainty.
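The pignistic entropy referred to above is the Shannon entropy of the pignistic probability transform from the transferable belief model of Smets and Kennes. As a rough illustration only (this sketch is not taken from the paper, and the encoding of a mass function as a dictionary keyed by frozensets as well as the function name pignistic_entropy are illustrative assumptions), the following Python fragment computes it for a normalized mass function with no mass on the empty set:

    from collections import defaultdict
    from math import log2

    def pignistic_entropy(masses):
        """Shannon entropy (in bits) of the pignistic transform of a mass function.

        `masses` maps focal sets (frozensets) to their basic probability
        assignment; the masses are assumed to sum to 1 with m(emptyset) = 0.
        """
        betp = defaultdict(float)
        for focal, m in masses.items():
            for x in focal:                 # spread each mass evenly over its focal set
                betp[x] += m / len(focal)
        return -sum(p * log2(p) for p in betp.values() if p > 0)

    # Example: mass 0.6 on {a} and 0.4 on {a, b}
    m = {frozenset({'a'}): 0.6, frozenset({'a', 'b'}): 0.4}
    print(pignistic_entropy(m))             # BetP = (0.8, 0.2), entropy ~ 0.722 bits

The explicit closed-form character of this computation is the "explicitness of the formula" contrasted in the abstract with the aggregate uncertainty, which requires an optimization over compatible probability distributions. The hints entropy proposed in the paper extends this measure and is not reproduced here.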

Keywords

Generalized Information Theory · Theory of Hints · Dempster-Shafer Theory · Pignistic Entropy · Hints Entropy



Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Marc Pouly
    Interdisciplinary Centre for Security, Reliability and Trust, University of Luxembourg, Luxembourg
