Computing Entropy under Interval Uncertainty. II

  • Hung T. Nguyen
  • Vladik Kreinovich
  • Berlin Wu
  • Gang Xiang
Part of the Studies in Computational Intelligence book series (SCI, volume 393)

Formulation and Analysis of the Problem, and the Corresponding Results and Algorithms

Formulation of the problem. In most practical situations, our knowledge is incomplete: there are several (n) different states which are consistent with our knowledge. How can we gauge this uncertainty? A natural measure of uncertainty is the average number of binary (“yes”-“no”) questions that we need to ask to find the exact state. This idea is behind Shannon’s information theory: according to this theory, when we know the probabilities \(p_1, \ldots, p_n\) of the different states (for which \(\sum p_i = 1\)), then this average number of questions is equal to \(S = -\sum\limits_{i=1}^{n} p_i \cdot \log_2(p_i)\). In information theory, this average number of questions is called the amount of information.
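The entropy formula above can be illustrated with a minimal sketch (the function name `shannon_entropy` is our own choice, not from the chapter):

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy S = -sum p_i * log2(p_i), in bits.

    Terms with p_i = 0 contribute 0, following the standard
    convention that 0 * log2(0) = 0.
    """
    assert abs(sum(probabilities) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Four equally likely states: on average, two binary questions
# suffice to pin down the exact state.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0
```

The value 2.0 matches the “average number of questions” interpretation: with four equally likely states, a binary search over the states always takes exactly two yes/no questions.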

Keywords

Fuzzy Number · Optimal Vector · Auxiliary Problem · Interval Uncertainty · Fuzzy Uncertainty

Copyright information

© Springer-Verlag Berlin Heidelberg 2012
