Information in the Presence of Noise. Shannon's Amount of Information
In this chapter, we consider the concept, introduced by Shannon, of the amount of mutual information between two random variables or two groups of random variables. This concept is central to information theory, a field whose development Shannon initiated in 1948. The amount of mutual information is defined as the difference between the a priori and the a posteriori (conditional) entropies.
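The definition above can be illustrated with a small numerical sketch. The code below (an illustration, not taken from the chapter) computes the mutual information I(X;Y) = H(X) − H(X|Y) for a discrete joint distribution, i.e. the a priori entropy of X minus its a posteriori entropy given Y; the binary symmetric channel used as an example is a standard textbook choice, not one specified in the source.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) - H(X|Y) for a joint pmf joint[x][y]."""
    px = [sum(row) for row in joint]            # marginal distribution of X
    py = [sum(col) for col in zip(*joint)]      # marginal distribution of Y
    h_x = entropy(px)                           # a priori entropy H(X)
    # a posteriori entropy H(X|Y) = sum_y p(y) * H(X | Y = y)
    h_x_given_y = sum(
        py[j] * entropy([joint[i][j] / py[j] for i in range(len(joint))])
        for j in range(len(py)) if py[j] > 0
    )
    return h_x - h_x_given_y

# Example: binary symmetric channel, crossover probability 0.11, uniform input.
eps = 0.11
joint = [[0.5 * (1 - eps), 0.5 * eps],
         [0.5 * eps,       0.5 * (1 - eps)]]
print(mutual_information(joint))  # equals 1 - H(eps), about half a bit
```

Observing Y thus removes part of the initial uncertainty about X; the amount removed is exactly the mutual information.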