Formulation and Analysis of the Problem, and the Corresponding Results and Algorithms
Formulation of the problem. In most practical situations, our knowledge is incomplete: there are several (\(n\)) different states consistent with our knowledge. How can we gauge this uncertainty? A natural measure of uncertainty is the average number of binary ("yes"-"no") questions that we need to ask to find the exact state. This idea underlies Shannon's information theory: according to this theory, when we know the probabilities \(p_1, \ldots, p_n\) of the different states (for which \(\sum\limits^{n}_{i=1} p_i = 1\)), this average number of questions is equal to \(S = -\sum\limits^{n}_{i=1} p_i \cdot \log_2(p_i)\). In information theory, this average number of questions is called the amount of information.
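The entropy formula above can be sketched directly in code. This is a minimal illustration (the function name and the convention that zero-probability terms contribute nothing are our own choices, not taken from the chapter):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy S = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i == 0 are skipped, following the usual
    convention that p * log2(p) -> 0 as p -> 0.
    """
    # The p_i must form a probability distribution: sum to 1.
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four equally likely states: on average, two yes/no questions
# suffice to pin down the exact state.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # -> 2.0
```

With unequal probabilities the entropy drops below \(\log_2 n\), reflecting that a skewed distribution is easier to guess.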
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
About this chapter
Cite this chapter
Nguyen, H.T., Kreinovich, V., Wu, B., Xiang, G. (2012). Computing Entropy under Interval Uncertainty. II. In: Computing Statistics under Interval and Fuzzy Uncertainty. Studies in Computational Intelligence, vol 393. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-24905-1_25
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-24904-4
Online ISBN: 978-3-642-24905-1
eBook Packages: Engineering (R0)