Abstract
In 1948, Claude Shannon published a paper entitled “A mathematical theory of communication” [157], which marks the beginning of information theory. In this paper, Shannon defined information measures such as entropy and mutual information, and introduced the fundamental laws of data compression and transmission. Information theory deals with the transmission, storage, and processing of information, and is used in fields such as physics, computer science, mathematics, statistics, economics, biology, linguistics, neurology, learning, computer graphics, and image processing.
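To make the two measures mentioned above concrete, here is a minimal sketch of Shannon entropy and mutual information for discrete distributions. The function names and the representation of the joint distribution (a 2D list of probabilities) are illustrative choices, not notation from the chapter.

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits.

    `p` is a sequence of probabilities summing to 1; zero-probability
    outcomes contribute nothing to the sum.
    """
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """Mutual information via I(X;Y) = H(X) + H(Y) - H(X,Y).

    `joint` is a 2D list where joint[i][j] = P(X=i, Y=j).
    """
    px = [sum(row) for row in joint]            # marginal of X
    py = [sum(col) for col in zip(*joint)]      # marginal of Y
    pxy = [p for row in joint for p in row]     # flattened joint
    return entropy(px) + entropy(py) - entropy(pxy)

# A fair coin carries exactly 1 bit of entropy.
print(entropy([0.5, 0.5]))                      # → 1.0
# Independent variables share no information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # → 0.0
# Perfectly correlated bits share 1 bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # → 1.0
```

The identity I(X;Y) = H(X) + H(Y) − H(X,Y) is one of several equivalent definitions of mutual information; it is convenient here because it reuses the entropy function directly.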
Copyright information
© 2014 Springer Nature Switzerland AG
Cite this chapter
Feixas, M., Bardera, A., Rigan, J., Sbert, M., Xu, Q. (2014). Information Theory Basics. In: Information Theory Tools for Image Processing. Synthesis Lectures on Computer Graphics and Animation. Springer, Cham. https://doi.org/10.1007/978-3-031-79555-8_1
DOI: https://doi.org/10.1007/978-3-031-79555-8_1
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-79554-1
Online ISBN: 978-3-031-79555-8