Abstract
This chapter develops the basic notion of Shannon entropy and its interpretation in terms of distinctions, i.e., as the minimum average number of yes-or-no questions that must be answered to distinguish all the “messages.” Shannon entropy is thus also a quantitative indicator of information-as-distinctions, and, accordingly, a “dit-bit transform” is defined that turns any simple, joint, conditional, or mutual logical entropy into the corresponding notion of Shannon entropy. One delicate point is that while logical entropy is always a non-negative measure in the sense of measure theory (indeed, a probability measure), we will later see that for three or more random variables the Shannon mutual information can be negative. This means that Shannon entropy can in general be characterized only as a signed measure, i.e., a measure that can take on negative values.
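As an illustration (not drawn from the chapter itself), a short Python sketch can make the two entropy formulas and the negativity claim concrete. It uses the standard definitions — logical entropy h(p) = Σᵢ pᵢ(1 − pᵢ) and Shannon entropy H(p) = Σᵢ pᵢ log₂(1/pᵢ) — and the well-known XOR example for negative triple mutual information; the function names and the choice of example distribution are our own.

```python
from itertools import product
from math import log2

def shannon_entropy(p):
    """Shannon entropy H(p) = sum_i p_i * log2(1/p_i), in bits."""
    return sum(pi * log2(1 / pi) for pi in p if pi > 0)

def logical_entropy(p):
    """Logical entropy h(p) = sum_i p_i * (1 - p_i)."""
    return sum(pi * (1 - pi) for pi in p)

# The dit-bit transform replaces each (1 - p_i) term in h(p) with
# log2(1/p_i) to obtain H(p): both are probability averages of a
# per-outcome "information" term.
p = [0.5, 0.25, 0.25]
print(logical_entropy(p))   # 0.625
print(shannon_entropy(p))   # 1.5

# Negative triple mutual information: X, Y independent fair bits,
# Z = X XOR Y.  By inclusion-exclusion,
# I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(X,Y)-H(X,Z)-H(Y,Z) + H(X,Y,Z).
joint = {(x, y, x ^ y): 0.25 for x, y in product([0, 1], repeat=2)}

def marginal(joint, idxs):
    """Marginal probabilities over the coordinates listed in idxs."""
    m = {}
    for outcome, prob in joint.items():
        key = tuple(outcome[i] for i in idxs)
        m[key] = m.get(key, 0.0) + prob
    return list(m.values())

H = shannon_entropy
I_xyz = (H(marginal(joint, [0])) + H(marginal(joint, [1])) + H(marginal(joint, [2]))
         - H(marginal(joint, [0, 1])) - H(marginal(joint, [0, 2])) - H(marginal(joint, [1, 2]))
         + H(list(joint.values())))
print(I_xyz)   # -1.0
```

The XOR triple gives I(X;Y;Z) = 3 − 6 + 2 = −1 bit, which is exactly the phenomenon that forces Shannon entropy to be treated as a signed measure.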
Copyright information
© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this chapter
Ellerman, D. (2021). The Relationship Between Logical Entropy and Shannon Entropy. In: New Foundations for Information Theory. SpringerBriefs in Philosophy. Springer, Cham. https://doi.org/10.1007/978-3-030-86552-8_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-86551-1
Online ISBN: 978-3-030-86552-8
eBook Packages: Mathematics and Statistics (R0)