Message transmission in the presence of noise. Second asymptotic theorem and its various formulations
In this chapter, we present the most significant asymptotic results concerning the existence of optimal codes for noisy channels. It is proven that Shannon's amount of information bounds Hartley's amount of information transmitted with asymptotically zero probability of error. This is the meaning of the second asymptotic theorem. Further, we provide formulae showing how quickly the probability of decoding error decreases as the block length increases. Contrary to the conventional approach, we state these results not in terms of channel capacity (i.e., we do not maximize the limiting amount of information with respect to the probability density of the input variable), but in terms of Shannon's amount of information.
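As a minimal numerical illustration of the quantity the theorem is stated in terms of (a sketch, not part of the chapter's derivation): for a binary symmetric channel with a given input distribution, Shannon's amount of information I(X;Y) can be computed directly, without any maximization over input distributions; the second asymptotic theorem asserts that transmission rates below this value are achievable with error probability vanishing as the block length grows. The function names here are illustrative choices.

```python
import math

def h(x):
    # Binary entropy in bits; h(0) = h(1) = 0 by convention.
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def mutual_information_bsc(q, p):
    # Shannon's amount of information I(X;Y) = H(Y) - H(Y|X)
    # for a binary symmetric channel with input P(X=1) = q
    # and crossover (error) probability p.
    y1 = q * (1 - p) + (1 - q) * p   # P(Y=1)
    return h(y1) - h(p)

# With the uniform input q = 0.5 this reduces to 1 - h(p);
# maximizing over q would give the channel capacity, a step the
# chapter deliberately avoids by working with I(X;Y) itself.
```

Note that a completely noisy channel (p = 0.5) yields I(X;Y) = 0, so no information can be transmitted reliably, while p = 0 recovers Hartley's one bit per binary symbol.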