
Part of the book series: Bolyai Society Mathematical Studies ((BSMS,volume 14))

Abstract

Information theory was created by Claude Shannon as a mathematical theory of communication. His fundamental paper {19} appeared in 1948. It was one of the major discoveries of the 20th century, establishing the theoretical foundations of communication engineering and information technology. The key ingredients of Shannon’s work were (i) a stochastic model of communication, (ii) the view of information as a commodity whose amount can be measured without regard to meaning, and (iii) the emphasis on coding as a means to enhance information storage and transmission, in particular, to achieve reliable transmission over unreliable channels.
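Point (ii) above can be made concrete with Shannon's entropy formula H(p) = −Σ pᵢ log₂ pᵢ from the cited 1948 paper, which assigns an amount of information (in bits) to a probability distribution irrespective of what the outcomes mean. A minimal sketch, with the example distributions chosen here purely for illustration:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), measured in bits.

    Terms with p_i = 0 contribute nothing (0 * log 0 is taken as 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin toss carries exactly one bit of information:
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin is more predictable, hence less informative:
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

The second value being smaller reflects the operational meaning Shannon gave entropy: more predictable sources can be compressed into fewer bits per symbol.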


References

Aczél, János and Daróczy, Zoltán, On Measures of Information and their Characterizations, Academic Press (New York, 1975).

Kullback, Solomon, Information Theory and Statistics, Wiley (New York, 1959); Dover (New York, 1978).

Liggett, Thomas M., Interacting Particle Systems, Die Grundlehren der Mathematischen Wissenschaften in Einzeldarstellungen, Band 276, Springer-Verlag (New York, 1985).

Rényi, Alfréd, Selected Papers, ed. Pál Turán, Akadémiai Kiadó (Budapest, 1976).

Rényi, Alfréd, Probability Theory, translated by László Vekerdi, North-Holland Series in Applied Mathematics and Mechanics, Vol. 10, North-Holland (Amsterdam–London); American Elsevier (New York, 1970).

Reza, Fazlollah M., An Introduction to Information Theory, McGraw-Hill (New York, 1961).

Vajda, Igor, Theory of Statistical Inference and Information, Kluwer Academic (Boston, 1989).

Wald, Abraham, Sequential Analysis, John Wiley and Sons (New York); Chapman and Hall (London, 1947).

{1} A. Barron, Entropy and the central limit theorem, Annals of Probability, 14 (1986), 336–342.

{2} L. Campbell, A coding theorem and Rényi’s entropy, Information and Control, 8 (1965), 423–429.

{3} I. Csiszár, Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten, Publ. Math. Inst. Hungar. Acad. Sci., 8 (1963), 85–108.

{4} I. Csiszár, Generalized entropy and quantization problems, Trans. Sixth Prague Conference on Inform. Theory, etc., 1971, Academia (Praha, 1973), 299–318.

{5} I. Csiszár, Generalized cutoff rates and Rényi’s information measures, IEEE Trans. Inform. Theory, 41 (1995), 26–34.

{6} I. Csiszár and J. Fischer, Informationsentfernungen im Raum der Wahrscheinlichkeitsverteilungen, Publ. Math. Inst. Hungar. Acad. Sci., 7 (1962), 159–182.

{7} Z. Daróczy, Über Mittelwerte und Entropien vollständiger Wahrscheinlichkeitsverteilungen, Acta Math. Acad. Sci. Hungar., 15 (1964), 203–210.

{8} Z. Daróczy and I. Kátai, Additive zahlentheoretische Funktionen und das Mass der Information, Ann. Univ. Sci. Budapest, Sect. Math., 13 (1970), 83–88.

{9} P. Erdős, On the distribution function of additive functions, Annals of Math., 47 (1946), 1–20.

{10} J. Fritz, An information-theoretical proof of limit theorems for reversible Markov processes, Trans. Sixth Prague Conference on Inform. Theory, etc., 1971, Academia (Praha, 1973), 183–197.

{11} J. Fritz, An approach to the entropy of point processes, Periodica Math. Hungar., 3 (1973), 73–83.

{12} P. Gács, Hausdorff-dimension and probability distribution, Periodica Math. Hungar., 3 (1973), 59–71.

{13} P. Kafka, F. Österreicher and I. Vincze, On powers of f-divergences defining a distance, Studia Sci. Math. Hungar., 26 (1991), 415–422.

{14} D. Kendall, Information theory and the limit-theorem for Markov chains and processes with a countable infinity of states, Annals Inst. Statist. Math., 15 (1963), 137–143.

{15} T. Nemetz, Equivalence-orthogonality dichotomies of probability measures, Limit Theorems of Probability Theory, Colloquia Math. Soc. J. Bolyai, Vol. 11, North-Holland (1975), 183–191.

{16} F. Österreicher, The construction of least favourable distributions is traceable to a minimal perimeter problem, Studia Sci. Math. Hungar., 17 (1982), 341–351.

{17} M. Puri and I. Vincze, Measure of information and contiguity, Statistics and Probability Letters, 9 (1990), 223–228.

{18} M. Rudemo, Dimension and entropy for a class of stochastic processes, Publ. Math. Inst. Hungar. Acad. Sci., 9 (1964), 73–87.

{19} C. Shannon, A mathematical theory of communication, Bell System Technical Journal, 27 (1948), 379–423 and 623–656.

{20} C. Shannon, Communication in the presence of noise, Proc. IRE, 37 (1949), 10–21.


Copyright information

© 2006 János Bolyai Mathematical Society and Springer-Verlag


Cite this chapter

Csiszár, I. (2006). Stochastics: Information Theory. In: Horváth, J. (eds) A Panorama of Hungarian Mathematics in the Twentieth Century I. Bolyai Society Mathematical Studies, vol 14. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-30721-1_17
