
Information Entropy

Chapter in: Handbook of Dynamics and Probability

Abstract

The concepts of information content and entropy grew out of attempts to quantify randomness. Early ideas go back to Boltzmann, but the definitive step was taken by Claude Shannon (1948) in his treatise “A Mathematical Theory of Communication,” which laid the groundwork for what is known today as information theory. More recent references are MacKay (2003) and Cover and Thomas (2005). Information theory studies sequences of letters or symbols from a finite alphabet. It presumes that a source produces these letters with a given probability distribution and then studies how information is degraded when storing, processing, or transmitting data.
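For a source emitting symbols from a finite alphabet with probabilities p_1, …, p_n, Shannon's entropy H = -Σ p_i log2 p_i measures the randomness of the source in bits per symbol. A minimal sketch (the function name and the example distributions are illustrative, not from the chapter):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits per symbol.

    Terms with p == 0 contribute nothing, by the usual convention
    that 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally random for a two-letter alphabet: 1 bit/symbol.
print(shannon_entropy([0.5, 0.5]))

# A uniform distribution over 4 symbols carries log2(4) = 2 bits/symbol.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))

# A biased source is more predictable, hence lower entropy.
print(shannon_entropy([0.9, 0.1]))
```

A deterministic source (one symbol with probability 1) has entropy 0: it produces no information at all.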


References

  • Cover, T.M., Thomas, J.A.: Elements of Information Theory. Wiley-Interscience (2005)
  • MacKay, D.J.C.: Information Theory, Inference, and Learning Algorithms. Cambridge University Press (2003)
  • Shannon, C.E.: A mathematical theory of communication. Bell Syst. Tech. J. 27, 379–423, 623–656 (1948)


Author information

Correspondence to Peter Müller.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter

Cite this chapter

Müller, P. (2022). Information Entropy. In: Handbook of Dynamics and Probability. Springer, Cham. https://doi.org/10.1007/978-3-030-88486-4_10
