Abstract
The concepts of information content and entropy grew out of attempts to quantify randomness. Early ideas go back to Boltzmann, but the definitive step was taken by Claude Shannon (1948) in his paper “A Mathematical Theory of Communication”, which laid the groundwork for what is known today as information theory. More recent references are MacKay (2003) and Cover and Thomas (2005). Information theory studies sequences of letters or symbols from a finite alphabet. It presumes that a source produces these letters with a given probability distribution and then studies how information is degraded when data are stored, processed, or transmitted.
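The chapter develops these quantities formally; as a minimal illustration (not taken from the chapter), the Shannon entropy of a source emitting symbols with probabilities p_i is H = −Σ_i p_i log₂ p_i, measured in bits. A short sketch, assuming a simple discrete distribution given as a list of probabilities:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (in bits) of a discrete distribution.

    `probs` is a sequence of symbol probabilities summing to 1;
    zero-probability symbols contribute nothing to the sum.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per symbol; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```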
References
Cover, T.M., Thomas, J.A.: Elements of Information Theory. Wiley-Interscience (2005)
MacKay, D.J.C.: Information Theory, Inference, and Learning Algorithms. Cambridge University Press (2003)
Shannon, C.E.: A mathematical theory of communication. Bell Syst. Tech. J. 27, 379–423, 623–656 (1948)