Information in the presence of noise. Shannon’s amount of information

  • Roman V. Belavkin
  • Panos M. Pardalos
  • Jose C. Principe

Abstract

In this chapter, we consider Shannon's concept of the amount of mutual information between two random variables or between two groups of random variables. This concept is central to information theory, whose development as an independent discipline was initiated by Shannon in 1948. The amount of mutual information is defined as the difference between the a priori entropy and the a posteriori (conditional) entropy.
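In standard notation (the symbols I and H below are conventional, not quoted from the chapter), this definition reads

\[ I(X;Y) \;=\; H(X) - H(X \mid Y) \;=\; H(Y) - H(Y \mid X), \]

where H(X) is the a priori entropy of X and H(X | Y) is its a posteriori entropy once Y is known; the equality of the two forms reflects the symmetry that makes the information mutual.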

References

  1. Pinsker, M.S.: The quantity of information about a Gaussian random stationary process, contained in a second process connected with it in a stationary manner. Dokl. Akad. Nauk SSSR 99, 213–216 (1954, in Russian)
  2. Shannon, C.E.: A mathematical theory of communication. Bell Syst. Tech. J. 27 (1948)
  3. Shannon, C.E.: A mathematical theory of communication (Russian translation). In: Dobrushin, R.L., Lupanov, O.B. (eds.) Works on Information Theory and Cybernetics. Inostrannaya Literatura, Moscow (1963)

Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Roman V. Belavkin (1)
  • Panos M. Pardalos (2)
  • Jose C. Principe (3)

  1. Faculty of Science and Technology, Middlesex University, London, UK
  2. Industrial and Systems Engineering, University of Florida, Gainesville, USA
  3. Electrical & Computer Engineering, University of Florida, Gainesville, USA