Abstract
Figure 7.1 presents a generic communication system. The message is composed at the source and then encoded to make it suitable for transmission through the channel. Most channels are noisy, which the figure signifies by the noise block acting on the channel from above. At the receiving end of the channel, a decoder must convert the encoded, noisy message back into one that the recipient will understand. This model is the basis for the development of the topic of Information Theory, which started with the advent of the telegraph and telephone systems. Fisher [7], Nyquist [14, 15], Hartley [9], Shannon [18], Wiener [22], and Kullback [12] were among the early developers of Information Theory. Much of this work was developed in response to the encryption and decryption needs of sensitive communications during the Second World War.
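The source–encoder–channel–decoder pipeline described above can be sketched in a few lines. The following is a minimal illustration, not the chapter's own material: it assumes a binary message, a simple repetition code as the channel code, and a binary symmetric channel as the noise model, with all names (`encode`, `channel`, `decode`) chosen here for illustration.

```python
import random

def encode(bits, r=3):
    # Repetition encoder: repeat each source bit r times
    # so the decoder can outvote isolated channel errors.
    return [b for b in bits for _ in range(r)]

def channel(bits, p=0.1, rng=None):
    # Binary symmetric channel: flip each bit independently
    # with probability p (a fixed seed keeps the run repeatable).
    rng = rng or random.Random(0)
    return [b ^ (rng.random() < p) for b in bits]

def decode(bits, r=3):
    # Majority-vote decoder: recover each source bit
    # from its r (possibly corrupted) copies.
    return [int(sum(bits[i:i + r]) > r // 2) for i in range(0, len(bits), r)]

message = [1, 0, 1, 1, 0]
received = channel(encode(message))
print(decode(received))  # with this seed and noise level, the message is recovered intact
```

The encoder trades rate for reliability, exactly the tension that Shannon's channel-coding results make precise.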
Keywords
- Probability Density Function
- Relative Entropy
- Fisher Information Matrix
- Probability Mass Function
- Discrete Random Variable
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.
Copyright information
© 2011 Springer Science+Business Media, LLC
About this chapter
Cite this chapter
Beigi, H. (2011). Information Theory. In: Fundamentals of Speaker Recognition. Springer, Boston, MA. https://doi.org/10.1007/978-0-387-77592-0_7
DOI: https://doi.org/10.1007/978-0-387-77592-0_7
Publisher Name: Springer, Boston, MA
Print ISBN: 978-0-387-77591-3
Online ISBN: 978-0-387-77592-0
eBook Packages: Computer Science (R0)