The Relationship Between Logical Entropy and Shannon Entropy

Chapter in: New Foundations for Information Theory

Part of the book series: SpringerBriefs in Philosophy (BRIEFSPHILOSOPH)

Abstract

This chapter develops the basic notion of Shannon entropy and its interpretation in terms of distinctions, i.e., as the minimum average number of yes-or-no questions that must be answered to distinguish all the “messages.” Shannon entropy is thus also a quantitative indicator of information-as-distinctions, and accordingly a “dit-bit transform” is defined that turns any simple, joint, conditional, or mutual logical entropy into the corresponding notion of Shannon entropy. One delicate point is that while logical entropy is always a non-negative measure in the sense of measure theory (indeed, a probability measure), we will later see that for three or more random variables the Shannon mutual information can be negative. This means that Shannon entropy can in general be characterized only as a signed measure, i.e., a measure that can take on negative values.
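
To make the abstract's quantitative claims concrete, here is a minimal Python sketch; it is not taken from the chapter, and the function names, the uniform-distribution check, and the XOR construction are illustrative assumptions. It computes logical entropy h(p) = Σ_i p_i(1 − p_i) and Shannon entropy H(p) = Σ_i p_i log2(1/p_i), whose summands differ by the dit-bit substitution (1 − p_i) → log2(1/p_i), and then exhibits a negative Shannon mutual information for three random variables.

    import math
    from itertools import product

    def logical_entropy(p):
        # h(p) = sum_i p_i*(1 - p_i): the probability that two independent
        # draws from p yield distinct outcomes, i.e., make a distinction ("dit").
        return sum(pi * (1 - pi) for pi in p)

    def shannon_entropy(p):
        # H(p) = sum_i p_i*log2(1/p_i): obtained from h(p) by the dit-bit
        # transform, replacing each summand (1 - p_i) with log2(1/p_i).
        return sum(pi * math.log2(1 / pi) for pi in p if pi > 0)

    # Uniform distribution on 4 outcomes: h = 1 - 1/4, H = log2(4).
    p = [0.25] * 4
    print(logical_entropy(p))  # 0.75
    print(shannon_entropy(p))  # 2.0

    # Negative Shannon mutual information for three random variables:
    # X, Y independent fair bits and Z = X XOR Y (an illustrative example,
    # not taken from the chapter).
    joint = {(x, y, x ^ y): 0.25 for x, y in product([0, 1], repeat=2)}

    def H(idx):
        # Shannon entropy of the marginal on the coordinates listed in idx.
        marg = {}
        for outcome, pr in joint.items():
            key = tuple(outcome[i] for i in idx)
            marg[key] = marg.get(key, 0.0) + pr
        return shannon_entropy(list(marg.values()))

    # Inclusion-exclusion form of the triple mutual information:
    # I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(X,Y) - H(X,Z) - H(Y,Z) + H(X,Y,Z)
    I_xyz = (H([0]) + H([1]) + H([2])
             - H([0, 1]) - H([0, 2]) - H([1, 2])
             + H([0, 1, 2]))
    print(I_xyz)  # -1.0

The final value is −1 bit, so the three-way Shannon mutual information is indeed not non-negative; by the abstract's contrast, the corresponding logical-entropy quantities always arise from a genuine non-negative probability measure.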

Copyright information

© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter

Cite this chapter

Ellerman, D. (2021). The Relationship Between Logical Entropy and Shannon Entropy. In: New Foundations for Information Theory. SpringerBriefs in Philosophy. Springer, Cham. https://doi.org/10.1007/978-3-030-86552-8_2
