Abstract
After C.E. Shannon presented his mathematical theory of communication [54], its ideas had a very strong impact on several scientific communities around the world.
Notes
1. Strong emphasis was placed on deriving sharp error bounds for specified rates.
References
R. Ahlswede, Beiträge zur Shannonschen Informationstheorie im Fall nichtstationärer Kanäle. Z. Wahrscheinlichkeitstheorie Verw. Geb. 10, 1–42 (1968)
R. Ahlswede, Channel capacities for list codes. J. Appl. Probab. 10, 824–836 (1973)
R. Ahlswede, A constructive proof of the coding theorem for discrete memoryless channels in case of complete feedback, in 6th Prague Conference on Information Theory, Statistical Decision Functions and Random Processes, September 1971, Publishing House of the Czechoslovak Academy of Sciences, pp. 1–22 (1973)
P. Billingsley, Ergodic Theory and Information (Wiley, New York, 1965)
L. Breiman, The individual ergodic theorems of information theory. Ann. Math. Stat. 28, 809–811 (1957)
R.L. Dobrushin, Arbeiten zur Informationstheorie IV, Allgemeine Formulierung des Shannonschen Hauptsatzes der Informationstheorie, Mathematische Forschungsberichte XVII, ed. by H. Grell (VEB Deutscher Verlag der Wissenschaften, Berlin, 1963)
J.L. Doob, Stochastic Processes (Wiley, New York, 1953)
R.M. Fano, Statistical Theory of Communication, Notes on a course given at the Massachusetts Institute of Technology, 1952, 1954
A. Feinstein, A new basic theorem of information theory. Trans. IRE Sect. Inf. Theory PGIT-4, 2–22 (1954)
A. Feinstein, Foundations of Information Theory (McGraw-Hill Book Company, Inc, New York, 1958)
A. Feinstein, On the coding theorem and its converse for finite-memory channels. Inf. Control 2(1), 25–44 (1959)
R.G. Gallager, Information Theory and Reliable Communication (Wiley, New York, 1968)
I.M. Gelfand, A.M. Jaglom, Arbeiten zur Informationstheorie II, Über die Berechnung der Menge an Information über eine zufällige Funktion, die in einer anderen zufälligen Funktion enthalten ist (VEB Deutscher Verlag der Wissenschaften (translated from the Russian), Berlin, 1958)
I.M. Gelfand, A.M. Jaglom, A.N. Kolmogoroff, Arbeiten zur Informationstheorie II, Zur allgemeinen Definition der Information (VEB Deutscher Verlag der Wissenschaften (translated from the Russian), Berlin, 1958)
R.M. Gray, D.S. Ornstein, Block coding for discrete stationary \(\overline{d}\)-continuous noisy channels. IEEE Trans. Inf. Theory 25, 292–306 (1979)
R.M. Gray, D.S. Ornstein, R.L. Dobrushin, Block synchronization, sliding-block coding, invulnerable sources and zero-error codes for discrete noiseless channels. Ann. Probab. 8, 639–674 (1980)
P.R. Halmos, Measure Theory (Springer, New York, 1958)
H.K. Ting, On the information stability of a sequence of channels (A necessary and sufficient condition for the validity of Feinstein’s Lemma and Shannon’s Theorem) (translated by R. Silverman), Theory Probab. Appl, VII(3) (1962)
K. Jacobs, Die Übertragung diskreter Information durch periodische und fastperiodische Kanäle. Math. Ann. 137, 125–135 (1959)
K. Jacobs, Informationstheorie (Seminarbericht, Göttingen, 1960)
K. Jacobs, Neuere Methoden und Ergebnisse der Ergodentheorie (Berlin-Göttingen-Heidelberg, 1960)
K. Jacobs, Über die Struktur der mittleren Entropie. Math. Z. 78, 33–43 (1962)
K. Jacobs, Measure and Integral (Academic Press, New York, 1978)
A. del Junco, M. Rahe, Finitary codings and weak Bernoulli partitions. Proc. Amer. Math. Soc. 75, 259–264 (1979)
M. Keane, M. Smorodinsky, A class of finitary codes. Israel J. Math. 26, 352–371 (1977)
A.Y. Khinchin, On the fundamental theorems of information theory, Uspehi Mat. Nauk, 11,1(67), 17–75 (1956)
A.Y. Khinchin, Arbeiten zur Informationstheorie I, 2nd edn., Der Begriff der Entropie in der Wahrscheinlichkeitsrechnung (VEB Deutscher Verlag der Wissenschaften (translated from the Russian), Berlin, 1961)
A.Y. Khinchin, Arbeiten zur Informationstheorie I, 2nd edn., Über grundlegende Sätze der Informationstheorie (VEB Deutscher Verlag der Wissenschaften (translated from the Russian), Berlin, 1961)
J.C. Kieffer, A general formula for the capacity of stationary nonanticipatory channels. Inf. Control 26, 381–391 (1974)
J.C. Kieffer, On sliding block coding for transmission of a source over a stationary non-anticipatory channel. Inf. Control 35, 1–19 (1977)
J.C. Kieffer, On the transmission of Bernoulli sources over stationary channels. Ann. Probab. 8(5), 942–961 (1980)
J.C. Kieffer, Some universal noiseless multiterminal source coding theorems. Inf. Control 46(2), 93–107 (1980)
J.C. Kieffer, Noiseless stationary coding over stationary channels. Z. Wahrscheinlichkeitstheorie Verw. Geb. (1980)
J.C. Kieffer, Perfect transmission over a discrete memoryless channel requires infinite expected coding time. J. Comb. Inf. Syst. Sci. 5(4), 317–322 (1980)
J.C. Kieffer, Block coding for weakly continuous channels. IEEE Trans. Inf. Theory IT-27(6) (1981)
J.C. Kieffer, Zero-error stationary coding over stationary channels. Z. Wahrscheinlichkeitstheorie Verw. Geb. 56, 113–126 (1981)
A.N. Kolmogorov, Arbeiten zur Informationstheorie I, 2nd edn., Theorie der Nachrichtenübermittlung (VEB Deutscher Verlag der Wissenschaften (translated from the Russian), Berlin, 1961); English edition: A.N. Kolmogorov, To the Shannon theory of information transmission in the continuous case. Trans. IRE Sect. Inf. Theory 2(4), 102–108 (1956)
M. Loève, Probability Theory (Wiley, New York, 1959)
B. McMillan, The basic theorems of information theory. Ann. Math. Stat. 24(2), 196–219 (1953)
J. Nedoma, The capacity of a discrete channel, in Transactions of the First Prague Conference on Information Theory, Statistical Decision Functions and Random Processes, pp. 143–181 (1957)
D.L. Neuhoff, P.C. Shields, Channels with almost finite memory. IEEE Trans. Inf. Theory, IT-25, 440–447 (1979)
D. Ornstein, Ergodic Theory, Randomness, and Dynamical Systems (Yale University Press, New Haven, 1974)
D. Ornstein, B. Weiss, The Shannon-McMillan-Breiman theorem for a class of amenable groups. Israel J. Math. 44, 53–60 (1983)
J.C. Oxtoby, Ergodic sets. Bull. Am. Math. Soc. 58, 116–136 (1952)
K.R. Parthasarathy, On the integral representation of the rate of transmission of a stationary channel. Illinois J. Math. 5, 299–305 (1961)
A. Perez, Notions généralisées d’incertitude d’entropie et d’information du point de vue de la théorie des martingales, in Transactions of the First Prague Conference on Information Theory, Statistical Decision Functions and Random Processes, Publishing House of the Czechoslovak Academy of Sciences, pp. 183–208 (1957)
A. Perez, Sur la théorie de l’information dans le cas d’un alphabet abstrait, in Transactions of the First Prague Conference on Information Theory, Statistical Decision Functions and Random Processes, Publishing House of the Czechoslovak Academy of Sciences, pp. 209–244 (1957)
M. Rosenblatt-Roth, Die Entropie stochastischer Prozesse (in Russian). Doklady Akad. Nauk SSSR 112, 16–19 (1957)
M. Rosenblatt-Roth, Theorie der Übertragung von Information durch stochastische Kanäle (in Russian). Doklady Akad. Nauk SSSR 112, 202–205 (1957)
C.E. Shannon, A mathematical theory of communication. Bell Syst. Tech. J. 27(379–423), 623–656 (1948)
C.E. Shannon, The zero error capacity of a noisy channel. IRE Trans. Inf. Theory IT-2, 8–19 (1956)
C.E. Shannon, Certain results in coding theory for noisy channels. Inf. Control 1(1), 6–25 (1957)
P.C. Shields, The Theory of Bernoulli Shifts (University of Chicago Press, Chicago, 1973)
P.C. Shields, Almost block independence. Z. Wahrscheinlichkeitstheorie Verw. Geb. 49, 119–123 (1979)
K. Takano, On the basic theorems of information theory. Ann. Inst. Stat. Math. Tokyo 9, 53–77 (1958)
I.P. Tsaregradskii, A note on the capacity of a stationary channel with finite memory, Teor. Veroyatnost. i Primenen., vol. 3, pp. 84–96, 1958 (in Russian), Theory Probab. Appl. III(1), pp. 79–91 (1958)
J. Wolfowitz, The maximum achievable length of an error correcting code. Illinois J. Math. 2, 454–458 (1958)
J. Wolfowitz, Coding Theorems of Information Theory, 1st edn. 1961, 2nd edn. 1964, 3rd edn. (Springer, Berlin, 1978)
Further Reading
R.L. Adler, D. Coppersmith, M. Hassner, Algorithms for sliding block codes - an application of symbolic dynamics to information theory. IEEE Trans. Inf. Theory IT-29, 5–22 (1983)
R. Ahlswede, D. Dueck, Bad codes are good ciphers. Probl. Control Inf. Theory 1(5), 337–351 (1982)
R. Ahlswede, P. Gacs, Two contributions to information theory, in Topics in Information Theory (Keszthely, Hungary, 1975), pp. 17–40
R. Ahlswede, J. Wolfowitz, Channels without synchronization. Adv. Appl. Probab. 3, 383–403 (1971)
P. Algoet, T.M. Cover, A sandwich proof of the Shannon-McMillan-Breiman theorem. Ann. Prob. 16(2), 899–909 (1988)
A.R. Barron, The strong ergodic theorem for densities: generalized Shannon-McMillan-Breiman theorem. Ann. Prob. 13, 1292–1303 (1985)
D. Blackwell, Exponential error bounds for finite state channels, in Proceedings of 4th Berkeley Symposium on Mathematical Statistics and Probability, vol. 1 (University of California Press, Berkeley, 1961) pp. 57–63
D. Blackwell, L. Breiman, A.J. Thomasian, Proof of Shannon’s transmission theorem for finite state indecomposable channels. Ann. Math. Stat. 29(4), 1209–1228 (1958)
N. Bourbaki, Espaces vectoriels topologiques, Éléments de Mathématique, Livre V, Actualités Scientifiques et Industrielles, No. 1189, Paris, Chapter II, p. 84 (1953)
L. Breiman, On achieving channel capacity in finite-memory channels. Illinois J. Math. 4, 246–252 (1960)
J. Bucklew, A large deviation theory proof of the abstract alphabet source code theorem. IEEE Trans. Inf. Theory IT-34, 1081–1083 (1988)
K.L. Chung, A note on the ergodic theorem of information theory. Ann. Math. Stat. 32, 612–614 (1961)
T.M. Cover, P. Gacs, R.M. Gray, Kolmogorov’s contribution to information theory and algorithmic complexity. Ann. Prob. 17, 840–865 (1989)
M. Denker, C. Grillenberger, K. Sigmund, Ergodic Theory on Compact Spaces. Lecture Notes in Mathematics, vol. 58 (Springer, New York, 1976)
J.D. Deuschel, D.W. Stroock, Large Deviations, Pure and Applied Mathematics, vol. 137 (Academic Press, Boston, 1989)
R.L. Dobrushin, Allgemeine Formulierung des Shannonschen Hauptsatzes der Informationstheorie (in Russian). Doklady Akad. Nauk SSSR 126, 474 (1959)
R.L. Dobrushin, Shannon’s theorem for channels with synchronization errors, Problemy Peredaci Informatsii, vol. 3, pp. 18–36, 1967; Trans. Probl. Inf. Transm. 3, pp. 31–36 (1967)
M.D. Donsker, S.R.S. Varadhan, Asymptotic evaluation of certain Markov process expectations for large time. Commun. Pure Appl. Math. 28, 1–47 (1975)
P. Elias, Coding for Noisy Channels (IRE Convention Record, 1955), pp. 37–44
P. Elias, Two famous papers. IRE Trans. Inf. Theory 4, 99 (1958)
P. Elias, Coding for Two Noisy Channels, in Proceedings of the London Symposium on Information Theory, London (1959)
A. Feinstein, Math. Reviews MR0118574 (22 #9347) 94.00. Breiman, Leo, On achieving channel capacity in finite-memory channels. Illinois J. Math. 4, pp. 246–252 (1960)
N.A. Friedman, Introduction to Ergodic Theory (Van Nostrand Reinhold Company, New York, 1970)
R.M. Gray, Entropy and Information Theory (Springer, New York, 1990) (revised 2000, 2007, 2008)
B.W. Gnedenko, A.N. Kolmogorov, Grenzverteilungen von Summen unabhängiger Zufallsgrößen, 2nd edn. (Akademie-Verlag (translated from the Russian), Berlin, 1960)
R.V.L. Hartley, The transmission of information. Bell Syst. Tech. J. 7, 535–564 (1928)
E. Hopf, Ergodentheorie (Springer, Berlin, 1937)
K. Jacobs, The ergodic decomposition of the Kolmogorov-Sinai invariant, in Ergodic Theory, ed. by F.B. Wright (Academic Press, New York, 1963)
A.Y. Khinchin, The concept of entropy in probability theory, (in Russian), Uspekhi Mat. Nauk, vol. 8, No. 3 (55), pp. 3–20, translation in Mathematical Foundations of Information Theory (Dover Publications Inc, New York, 1953) (1958)
J.C. Kieffer, A simple proof of the Moy-Perez generalization of the Shannon-McMillan theorem. Pacific J. Math. 51, 203–206 (1974)
J.C. Kieffer, Sliding-block coding for weakly continuous channels. IEEE Trans. Inf. Theory IT-28(1), 2–16 (1982)
A.N. Kolmogorov, A new metric invariant of transitive dynamic systems and automorphisms in Lebesgue spaces. Dokl. Akad. Nauk SSSR 119, 861–864 (1958) (in Russian)
A.N. Kolmogorov, On the entropy per unit time as a metric invariant of automorphisms. Dokl. Akad. Nauk SSSR 124, 768–771 (1959) (in Russian)
A.N. Kolmogoroff, Über einige asymptotische Charakteristika total beschränkter metrischer Räume, Doklady A.N. d. UdSSR
U. Krengel, Ergodic Theorems (de Gruyter & Co, Berlin, 1985)
W. Krieger, On entropy and generators of measure-preserving transformations. Trans. Am. Math. Soc. 149, 453–464 (1970)
B. Marcus, Sofic systems and encoding data. IEEE Trans. Inf. Theory IT-31, 366–377 (1985)
L.D. Meshalkin, A case of isomorphism of Bernoulli schemes. Dokl. Akad. Nauk SSSR 128, 41–44 (1959) (in Russian)
J. Moser, E. Phillips, S. Varadhan, Ergodic Theory, A Seminar (Courant Institute of Mathematical Sciences, New York University, 1975)
S.C. Moy, Generalization of the Shannon-McMillan theorem. Pacific J. Math. 11, 705–714 (1961)
S. Muroga, On the capacity of a noisy continuous channel. Trans. IRE, Sect. Inf. Theory 3(1), 44–51 (1957)
J. von Neumann, Zur Operatorenmethode in der klassischen Mechanik. Ann. Math. 33, 587–642 (1932)
H. Nyquist, Certain factors affecting telegraph speed. Bell Syst. Tech. J. 3, 324 (1924)
S. Orey, On the Shannon-Perez-Moy theorem. Contemp. Math. 41, 319–327 (1985)
D. Ornstein, Bernoulli shifts with the same entropy are isomorphic. Adv. Math. 4, 337–352 (1970)
D. Ornstein, An application of ergodic theory to probability theory. Ann. Probab. 1, 43–58 (1973)
A. Perez, Extensions of Shannon-McMillan’s limit theorem to more general stochastic processes, in Transactions of the 3rd Prague Conference on Information Theory, Statistical Decision Functions and Random Processes, Prague, pp. 545–574 (1964)
K. Petersen, Ergodic Theory (Cambridge University Press, Cambridge, 1983)
M.S. Pinsker, Dynamical systems with completely positive or negative entropy. Soviet Math. Dokl. 1, 937–938 (1960)
V.A. Rohlin, Y.G. Sinai, Construction and properties of invariant measurable partitions. Soviet Math. Dokl. 2, 1611–1614 (1962)
C.E. Shannon, Communication in the presence of noise. Proc. IRE 32, 10–21 (1949)
C.E. Shannon, Coding Theorems for a Discrete Source with a Fidelity Criterion (IRE National Convention Record, Part 4, 1959), pp. 142–163
C.E. Shannon, W. Weaver, The Mathematical Theory of Communication (University of Illinois Press, Urbana, 1949)
Y.G. Sinai, Weak isomorphism of transformations with an invariant measure. Soviet Math. Dokl. 3, 1725–1729 (1962)
Y.G. Sinai, Introduction to Ergodic Theory Mathematical Notes (Princeton University Press, Princeton, 1976)
A.J. Thomasian, An elementary proof of the AEP of information theory. Ann. Math. Stat. 31(2), 452–456 (1960)
S.R.S. Varadhan, Large Deviations and Applications (Society for Industrial and Applied Mathematics, Philadelphia, 1984)
S. Verdú, Teaching it, XXVIII Shannon Lecture, Nice, France, 28 June 2007
P. Walters, Ergodic Theory—Introductory Lectures, vol. 458, Lecture Notes in Mathematics (Springer, New York, 1975)
N. Wiener, Extrapolation, Interpolation, and Smoothing of Stationary Time Series (Wiley, New York, 1949)
N. Wiener, The ergodic theorem. Duke Math. J. 5, 1–18 (1939)
K. Winkelbauer, Communication channels with finite past history, in Transactions of the 2nd Prague Conference on Information Theory, Statistical Decision Functions and Random Processes, pp. 685–831 (1960)
J. Wolfowitz, The coding of messages subject to chance errors. Illinois J. Math. 1(4), 591–606 (1957)
© 2015 Springer International Publishing Switzerland
Cite this chapter
Ahlswede, R., Ahlswede, A., Althöfer, I., Deppe, C., Tamm, U. (2015). Shannon’s Model for Continuous Transmission. In: Ahlswede, A., Althöfer, I., Deppe, C., Tamm, U. (eds) Transmitting and Gaining Data. Foundations in Signal Processing, Communications and Networking, vol 11. Springer, Cham. https://doi.org/10.1007/978-3-319-12523-7_3
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-12522-0
Online ISBN: 978-3-319-12523-7