
Shannon’s Model for Continuous Transmission

Chapter in: Transmitting and Gaining Data

Abstract

After C.E. Shannon presented his mathematical theory of communication [54], its ideas had a strong impact on several scientific communities around the world.


Notes

  1.

    Strong emphasis was put into deriving sharp error bounds for specified rates.

References

  1. R. Ahlswede, Beiträge zur Shannonschen Informationstheorie im Fall nichtstationärer Kanäle. Z. Wahrscheinlichkeitstheorie Verw. Geb. 10, 1–42 (1968)

  2. R. Ahlswede, Channel capacities for list codes. J. Appl. Probab. 10, 824–836 (1973)

  3. R. Ahlswede, A constructive proof of the coding theorem for discrete memoryless channels in case of complete feedback, in 6th Prague Conference on Information Theory, Statistical Decision Functions and Random Processes, September 1971 (Publishing House of the Czechoslovak Academy of Sciences, 1973), pp. 1–22

  4. P. Billingsley, Ergodic Theory and Information (Wiley, New York, 1965)

  5. L. Breiman, The individual ergodic theorems of information theory. Ann. Math. Stat. 28, 809–811 (1957)

  6. R.L. Dobrushin, Arbeiten zur Informationstheorie IV, Allgemeine Formulierung des Shannonschen Hauptsatzes der Informationstheorie, Mathematische Forschungsberichte XVII, ed. by H. Grell (VEB Deutscher Verlag der Wissenschaften, Berlin, 1963)

  7. J.L. Doob, Stochastic Processes (Wiley, New York, 1953)

  8. R.M. Fano, Statistical Theory of Communication, notes on a course given at the Massachusetts Institute of Technology, 1952, 1954

  9. A. Feinstein, A new basic theorem of information theory. Trans. IRE Sect. Inf. Theory PGIT-4, 2–22 (1954)

  10. A. Feinstein, Foundations of Information Theory (McGraw-Hill, New York, 1958)

  11. A. Feinstein, On the coding theorem and its converse for finite-memory channels. Inf. Control 2(1), 25–44 (1959)

  12. R.G. Gallager, Information Theory and Reliable Communication (Wiley, New York, 1968)

  13. I.M. Gelfand, A.M. Jaglom, Arbeiten zur Informationstheorie II, Über die Berechnung der Menge an Information über eine zufällige Funktion, die in einer anderen zufälligen Funktion enthalten ist (VEB Deutscher Verlag der Wissenschaften (translated from the Russian), Berlin, 1958)

  14. I.M. Gelfand, A.M. Jaglom, A.N. Kolmogoroff, Arbeiten zur Informationstheorie II, Zur allgemeinen Definition der Information (VEB Deutscher Verlag der Wissenschaften (translated from the Russian), Berlin, 1958)

  15. R.M. Gray, D.S. Ornstein, Block coding for discrete stationary \(\overline{d}\)-continuous noisy channels. IEEE Trans. Inf. Theory 25, 292–306 (1979)

  16. R.M. Gray, D.S. Ornstein, R.L. Dobrushin, Block synchronization, sliding-block coding, invulnerable sources and zero-error codes for discrete noiseless channels. Ann. Probab. 8, 639–674 (1980)

  17. P.R. Halmos, Measure Theory (Springer, New York, 1958)

  18. H.K. Ting, On the information stability of a sequence of channels (a necessary and sufficient condition for the validity of Feinstein’s Lemma and Shannon’s Theorem) (translated by R. Silverman). Theory Probab. Appl. VII(3) (1962)

  19. K. Jacobs, Die Übertragung diskreter Information durch periodische und fastperiodische Kanäle. Math. Ann. 137, 125–135 (1959)

  20. K. Jacobs, Informationstheorie (Seminarbericht, Göttingen, 1960)

  21. K. Jacobs, Neuere Methoden und Ergebnisse der Ergodentheorie (Berlin-Göttingen-Heidelberg, 1960)

  22. K. Jacobs, Über die Struktur der mittleren Entropie. Math. Z. 78, 33–43 (1962)

  23. K. Jacobs, Measure and Integral (Academic Press, New York, 1978)

  24. A. del Junco, M. Rahe, Finitary codings and weak Bernoulli partitions. Proc. Am. Math. Soc. 75, 259–264 (1979)

  25. M. Keane, M. Smorodinsky, A class of finitary codes. Israel J. Math. 26, 352–371 (1977)

  26. A.Y. Khinchin, On the fundamental theorems of information theory. Uspehi Mat. Nauk 11, 1(67), 17–75 (1956)

  27. A.Y. Khinchin, Arbeiten zur Informationstheorie I, 2nd edn., Der Begriff der Entropie in der Wahrscheinlichkeitsrechnung (VEB Deutscher Verlag der Wissenschaften (translated from the Russian), Berlin, 1961)

  28. A.Y. Khinchin, Arbeiten zur Informationstheorie I, 2nd edn., Über grundlegende Sätze der Informationstheorie (VEB Deutscher Verlag der Wissenschaften (translated from the Russian), Berlin, 1961)

  29. J.C. Kieffer, A general formula for the capacity of stationary nonanticipatory channels. Inf. Control 26, 381–391 (1974)

  30. J.C. Kieffer, On sliding block coding for transmission of a source over a stationary non-anticipatory channel. Inf. Control 35, 1–19 (1977)

  31. J.C. Kieffer, On the transmission of Bernoulli sources over stationary channels. Ann. Probab. 8(5), 942–961 (1980)

  32. J.C. Kieffer, Some universal noiseless multiterminal source coding theorems. Inf. Control 46(2), 93–107 (1980)

  33. J.C. Kieffer, Noiseless stationary coding over stationary channels. Z. Wahrscheinlichkeitstheorie Verw. Geb. (1980)

  34. J.C. Kieffer, Perfect transmission over a discrete memoryless channel requires infinite expected coding time. J. Comb. Inf. Syst. Sci. 5(4), 317–322 (1980)

  35. J.C. Kieffer, Block coding for weakly continuous channels. IEEE Trans. Inf. Theory IT-27(6) (1981)

  36. J.C. Kieffer, Zero-error stationary coding over stationary channels. Z. Wahrscheinlichkeitstheorie Verw. Geb. 56, 113–126 (1981)

  37. A.N. Kolmogorov, Arbeiten zur Informationstheorie I, 2nd edn., Theorie der Nachrichtenübermittlung (VEB Deutscher Verlag der Wissenschaften (translated from the Russian), Berlin, 1961); English edition: A.N. Kolmogorov, To the Shannon theory of information transmission in the continuous case. Trans. IRE Sect. Inf. Theory 2(4), 102–108 (1956)

  38. M. Loève, Probability Theory (Wiley, New York, 1959)

  39. B. McMillan, The basic theorems of information theory. Ann. Math. Stat. 24(2), 196–219 (1953)

  40. J. Nedoma, The capacity of a discrete channel, in Transactions of the First Prague Conference on Information Theory, Statistical Decision Functions and Random Processes (1957), pp. 143–181

  41. D.L. Neuhoff, P.C. Shields, Channels with almost finite memory. IEEE Trans. Inf. Theory IT-25, 440–447 (1979)

  42. D. Ornstein, Ergodic Theory, Randomness, and Dynamical Systems (Yale University Press, New Haven, 1974)

  43. D. Ornstein, B. Weiss, The Shannon-McMillan-Breiman theorem for a class of amenable groups. Israel J. Math. 44, 53–60 (1983)

  44. J.C. Oxtoby, Ergodic sets. Bull. Am. Math. Soc. 58, 116–136 (1952)

  45. K.R. Parthasarathy, On the integral representation of the rate of transmission of a stationary channel. Illinois J. Math. 5, 299–305 (1961)

  46. A. Perez, Notions généralisées d’incertitude, d’entropie et d’information du point de vue de la théorie des martingales, in Transactions of the First Prague Conference on Information Theory, Statistical Decision Functions and Random Processes (Publishing House of the Czechoslovak Academy of Sciences, 1957), pp. 183–208

  47. A. Perez, Sur la théorie de l’information dans le cas d’un alphabet abstrait, in Transactions of the First Prague Conference on Information Theory, Statistical Decision Functions and Random Processes (Publishing House of the Czechoslovak Academy of Sciences, 1957), pp. 209–244

  48. M. Rosenblatt-Roth, Die Entropie stochastischer Prozesse (in Russian). Doklady Akad. Nauk SSSR 112, 16–19 (1957)

  49. M. Rosenblatt-Roth, Theorie der Übertragung von Information durch stochastische Kanäle (in Russian). Doklady Akad. Nauk SSSR 112, 202–205 (1957)

  50. C.E. Shannon, A mathematical theory of communication. Bell Syst. Tech. J. 27, 379–423, 623–656 (1948)

  51. C.E. Shannon, The zero error capacity of a noisy channel. IRE Trans. Inf. Theory IT-2, 8–19 (1956)

  52. C.E. Shannon, Certain results in coding theory for noisy channels. Inf. Control 1(1), 6–25 (1957)

  53. P.C. Shields, The Theory of Bernoulli Shifts (University of Chicago Press, Chicago, 1973)

  54. P.C. Shields, Almost block independence. Z. Wahrscheinlichkeitstheorie Verw. Geb. 49, 119–123 (1979)

  55. K. Takano, On the basic theorems of information theory. Ann. Inst. Stat. Math. Tokyo 9, 53–77 (1958)

  56. I.P. Tsaregradskii, A note on the capacity of a stationary channel with finite memory. Teor. Veroyatnost. i Primenen. 3, 84–96 (1958) (in Russian); Theory Probab. Appl. III(1), 79–91 (1958)

  57. J. Wolfowitz, The maximum achievable length of an error correcting code. Illinois J. Math. 2, 454–458 (1958)

  58. J. Wolfowitz, Coding Theorems of Information Theory, 1st edn. 1961, 2nd edn. 1964, 3rd edn. (Springer, Berlin, 1978)

Further Reading

  1. R.L. Adler, D. Coppersmith, M. Hassner, Algorithms for sliding block codes: an application of symbolic dynamics to information theory. IEEE Trans. Inf. Theory IT-29, 5–22 (1983)

  2. R. Ahlswede, G. Dueck, Bad codes are good ciphers. Probl. Control Inf. Theory 11(5), 337–351 (1982)

  3. R. Ahlswede, P. Gacs, Two contributions to information theory, in Topics in Information Theory (Keszthely, Hungary, 1975), pp. 17–40

  4. R. Ahlswede, J. Wolfowitz, Channels without synchronization. Adv. Appl. Probab. 3, 383–403 (1971)

  5. P. Algoet, T.M. Cover, A sandwich proof of the Shannon-McMillan-Breiman theorem. Ann. Probab. 16(2), 899–909 (1988)

  6. A.R. Barron, The strong ergodic theorem for densities: generalized Shannon-McMillan-Breiman theorem. Ann. Probab. 13, 1292–1303 (1985)

  7. D. Blackwell, Exponential error bounds for finite state channels, in Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, vol. 1 (University of California Press, Berkeley, 1961), pp. 57–63

  8. D. Blackwell, L. Breiman, A.J. Thomasian, Proof of Shannon’s transmission theorem for finite state indecomposable channels. Ann. Math. Stat. 29(4), 1209–1228 (1958)

  9. N. Bourbaki, Espaces vectoriels topologiques, Éléments de Mathématique, Livre V, Actualités Scientifiques et Industrielles, No. 1189, Paris, Chapter II, p. 84 (1953)

  10. L. Breiman, On achieving channel capacity in finite-memory channels. Illinois J. Math. 4, 246–252 (1960)

  11. J. Bucklew, A large deviation theory proof of the abstract alphabet source coding theorem. IEEE Trans. Inf. Theory IT-34, 1081–1083 (1988)

  12. K.L. Chung, A note on the ergodic theorem of information theory. Ann. Math. Stat. 32, 612–614 (1961)

  13. T.M. Cover, P. Gacs, R.M. Gray, Kolmogorov’s contributions to information theory and algorithmic complexity. Ann. Probab. 17, 840–865 (1989)

  14. M. Denker, C. Grillenberger, K. Sigmund, Ergodic Theory on Compact Spaces, Lecture Notes in Mathematics, vol. 527 (Springer, New York, 1976)

  15. J.D. Deuschel, D.W. Stroock, Large Deviations, Pure and Applied Mathematics, vol. 137 (Academic Press, Boston, 1989)

  16. R.L. Dobrushin, Allgemeine Formulierung des Shannonschen Hauptsatzes der Informationstheorie (in Russian). Doklady Akad. Nauk SSSR 126, 474 (1959)

  17. R.L. Dobrushin, Shannon’s theorem for channels with synchronization errors. Problemy Peredachi Informatsii 3, 18–36 (1967); Probl. Inf. Transm. 3, 31–36 (1967)

  18. M.D. Donsker, S.R.S. Varadhan, Asymptotic evaluation of certain Markov process expectations for large time. Commun. Pure Appl. Math. 28, 1–47 (1975)

  19. P. Elias, Coding for noisy channels, in IRE Convention Record (1955), pp. 37–44

  20. P. Elias, Two famous papers. IRE Trans. Inf. Theory 4, 99 (1958)

  21. P. Elias, Coding for two noisy channels, in Proceedings of the London Symposium on Information Theory, London (1959)

  22. A. Feinstein, Math. Reviews MR0118574 (22 #9347), review of: L. Breiman, On achieving channel capacity in finite-memory channels. Illinois J. Math. 4, 246–252 (1960)

  23. N.A. Friedman, Introduction to Ergodic Theory (Van Nostrand Reinhold, New York, 1970)

  24. R.M. Gray, Entropy and Information Theory (Springer, New York, 1990) (revised 2000, 2007, 2008)

  25. B.W. Gnedenko, A.N. Kolmogorov, Grenzverteilungen von Summen unabhängiger Zufallsgrößen, 2nd edn. (Akademie-Verlag (translated from the Russian), Berlin, 1960)

  26. R.V.L. Hartley, The transmission of information. Bell Syst. Tech. J. 7, 535–564 (1928)

  27. E. Hopf, Ergodentheorie (Springer, Berlin, 1937)

  28. K. Jacobs, The ergodic decomposition of the Kolmogorov-Sinai invariant, in Ergodic Theory, ed. by F.B. Wright (Academic Press, New York, 1963)

  29. A.Y. Khinchin, The concept of entropy in probability theory (in Russian). Uspekhi Mat. Nauk 8(3(55)), 3–20 (1953); English translation in Mathematical Foundations of Information Theory (Dover, New York, 1958)

  30. J.C. Kieffer, A simple proof of the Moy-Perez generalization of the Shannon-McMillan theorem. Pacific J. Math. 51, 203–206 (1974)

  31. J.C. Kieffer, Sliding-block coding for weakly continuous channels. IEEE Trans. Inf. Theory IT-28(1), 2–16 (1982)

  32. A.N. Kolmogorov, A new metric invariant of transitive dynamical systems and automorphisms in Lebesgue spaces. Dokl. Akad. Nauk SSSR 119, 861–864 (1958) (in Russian)

  33. A.N. Kolmogorov, On the entropy per unit time as a metric invariant of automorphisms. Dokl. Akad. Nauk SSSR 124, 768–771 (1959) (in Russian)

  34. A.N. Kolmogoroff, Über einige asymptotische Charakteristika total beschränkter metrischer Räume. Doklady Akad. Nauk SSSR

  35. U. Krengel, Ergodic Theorems (de Gruyter, Berlin, 1985)

  36. W. Krieger, On entropy and generators of measure-preserving transformations. Trans. Am. Math. Soc. 149, 453–464 (1970)

  37. B. Marcus, Sofic systems and encoding data. IEEE Trans. Inf. Theory IT-31, 366–377 (1985)

  38. L.D. Meshalkin, A case of isomorphism of Bernoulli schemes. Dokl. Akad. Nauk SSSR 128, 41–44 (1959) (in Russian)

  39. J. Moser, E. Phillips, S. Varadhan, Ergodic Theory: A Seminar (Courant Institute of Mathematical Sciences, New York University, 1975)

  40. S.C. Moy, Generalizations of the Shannon-McMillan theorem. Pacific J. Math. 11, 705–714 (1961)

  41. S. Muroga, On the capacity of a noisy continuous channel. Trans. IRE Sect. Inf. Theory 3(1), 44–51 (1957)

  42. J. von Neumann, Zur Operatorenmethode in der klassischen Mechanik. Ann. Math. 33, 587–642 (1932)

  43. H. Nyquist, Certain factors affecting telegraph speed. Bell Syst. Tech. J. 3, 324 (1924)

  44. S. Orey, On the Shannon-Perez-Moy theorem. Contemp. Math. 41, 319–327 (1985)

  45. D. Ornstein, Bernoulli shifts with the same entropy are isomorphic. Adv. Math. 4, 337–352 (1970)

  46. D. Ornstein, An application of ergodic theory to probability theory. Ann. Probab. 1, 43–58 (1973)

  47. A. Perez, Extensions of Shannon-McMillan’s limit theorem to more general stochastic processes, in Transactions of the Third Prague Conference on Information Theory, Statistical Decision Functions and Random Processes, Prague (1964), pp. 545–574

  48. K. Petersen, Ergodic Theory (Cambridge University Press, Cambridge, 1983)

  49. M.S. Pinsker, Dynamical systems with completely positive or zero entropy. Soviet Math. Dokl. 1, 937–938 (1960)

  50. V.A. Rohlin, Y.G. Sinai, Construction and properties of invariant measurable partitions. Soviet Math. Dokl. 2, 1611–1614 (1962)

  51. C.E. Shannon, Communication in the presence of noise. Proc. IRE 37, 10–21 (1949)

  52. C.E. Shannon, Coding theorems for a discrete source with a fidelity criterion, in IRE National Convention Record, Part 4 (1959), pp. 142–163

  53. C.E. Shannon, W. Weaver, The Mathematical Theory of Communication (University of Illinois Press, Urbana, 1949)

  54. Y.G. Sinai, Weak isomorphism of transformations with an invariant measure. Soviet Math. Dokl. 3, 1725–1729 (1962)

  55. Y.G. Sinai, Introduction to Ergodic Theory, Mathematical Notes (Princeton University Press, Princeton, 1976)

  56. A.J. Thomasian, An elementary proof of the AEP of information theory. Ann. Math. Stat. 31(2), 452–456 (1960)

  57. S.R.S. Varadhan, Large Deviations and Applications (Society for Industrial and Applied Mathematics, Philadelphia, 1984)

  58. S. Verdú, Teaching it, XXVIII Shannon Lecture, Nice, France, 28 June 2007

  59. P. Walters, Ergodic Theory: Introductory Lectures, Lecture Notes in Mathematics, vol. 458 (Springer, New York, 1975)

  60. N. Wiener, Extrapolation, Interpolation, and Smoothing of Stationary Time Series (Wiley, New York, 1949)

  61. N. Wiener, The ergodic theorem. Duke Math. J. 5, 1–18 (1939)

  62. K. Winkelbauer, Communication channels with finite past history, in Transactions of the Second Prague Conference on Information Theory, Statistical Decision Functions and Random Processes (1960), pp. 685–831

  63. J. Wolfowitz, The coding of messages subject to chance errors. Illinois J. Math. 1(4), 591–606 (1957)


Author information

Correspondence to Christian Deppe.


Copyright information

© 2015 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Ahlswede, R., Ahlswede, A., Althöfer, I., Deppe, C., Tamm, U. (2015). Shannon’s Model for Continuous Transmission. In: Ahlswede, A., Althöfer, I., Deppe, C., Tamm, U. (eds) Transmitting and Gaining Data. Foundations in Signal Processing, Communications and Networking, vol 11. Springer, Cham. https://doi.org/10.1007/978-3-319-12523-7_3


  • Print ISBN: 978-3-319-12522-0

  • Online ISBN: 978-3-319-12523-7

