How to Transmit Information Reliably with Unreliable Elements (Shannon’s Theorem)
The goal of our rather technical excursion into the field of stationary processes was to formulate and prove Shannon's theorem. That is done in this final chapter of Part III.
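Shannon's theorem asserts that information can be transmitted with arbitrarily high fidelity at any rate below the channel capacity. As a small illustration of the capacity notion (an assumption for this sketch, not an example taken from the chapter), the capacity of a binary symmetric channel with crossover probability p is C = 1 − H(p), where H is the binary entropy function:

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0  # by convention, 0*log2(0) = 0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability p:
    C = 1 - H(p) bits per channel use (maximal mutual information,
    achieved by the uniform input distribution)."""
    return 1.0 - binary_entropy(p)
```

For example, a noiseless channel (p = 0) has capacity 1 bit per use, while a channel that flips each bit with probability 1/2 has capacity 0: its output is independent of its input, so no information gets through.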
Keywords: Channel Capacity · High Fidelity · Information Rate · Simple Type · Channel Output
© Springer-Verlag Berlin Heidelberg 2012