Computing Entropy under Interval Uncertainty. I

  • Hung T. Nguyen
  • Vladik Kreinovich
  • Berlin Wu
  • Gang Xiang
Part of the Studies in Computational Intelligence book series (SCI, volume 393)

Abstract

Measurement results (and, more generally, estimates) are never absolutely accurate: there is always some uncertainty, so the actual value x is, in general, different from the estimate \(\tilde{x}\). Sometimes we know the probability of different values of the estimation error \(\Delta x \stackrel{\text{def}}{=} \tilde{x} - x\); sometimes we only know the interval of possible values of Δx; and sometimes we have interval bounds on the cumulative distribution function (cdf) of Δx. To compare different measuring instruments, it is desirable to know which of them brings more information, i.e., to gauge the amount of information. For probabilistic uncertainty, this amount is described by Shannon's entropy; similar measures can be developed for interval and other types of uncertainty. In this chapter, we begin analyzing the problem of estimating the amount of information under different types of uncertainty.
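To make the two notions of "amount of information" mentioned in the abstract concrete, here is a minimal sketch (not from the chapter itself; the function names are ours): Shannon's entropy for a discrete probability distribution, and, for interval uncertainty, the standard count of binary digits needed to localize a value within an interval of a given width to a given accuracy.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)) of a discrete
    probability distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def interval_information(width, accuracy):
    """Bits needed to localize a value known only to lie in an interval
    of the given width, once we settle for cells of size `accuracy`:
    log2(width / accuracy), i.e. the entropy of a uniform choice among
    width / accuracy equally likely cells."""
    return math.log2(width / accuracy)

# A fair coin carries exactly 1 bit:
print(shannon_entropy([0.5, 0.5]))        # 1.0
# Narrowing x from an interval of width 1.0 to cells of width 0.125
# takes log2(8) = 3 bits:
print(interval_information(1.0, 0.125))   # 3.0
```

The interval measure coincides with Shannon's entropy of the uniform distribution over the corresponding cells, which is why the two notions can be compared on a common (bit) scale.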

Keywords

Fuzzy Number · Partial Information · Interval Uncertainty · Binary Digit · Discrete Probability Distribution


Copyright information

© Springer-Verlag Berlin Heidelberg 2012
