On \(\lambda \)-Capacities and Information Stability

Chapter in: Transmitting and Gaining Data

Abstract

In Chap. 1, we introduced \(\lambda \)-capacities with several specifications and concentrated mainly on CC.
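
The preview does not restate the underlying definitions. As orientation only, the following is a sketch of the standard formulation in the spirit of Wolfowitz's capacity functions, not a quotation from the chapter itself: for an error-probability bound \(\lambda \in (0,1)\), let \(N(n,\lambda)\) denote the maximal number of codewords of block length \(n\) for which a decoder with error probability at most \(\lambda\) exists. The pessimistic and optimistic \(\lambda\)-capacities are then

\[
\underline{C}(\lambda) \;=\; \liminf_{n\to\infty}\frac{1}{n}\log N(n,\lambda),
\qquad
\overline{C}(\lambda) \;=\; \limsup_{n\to\infty}\frac{1}{n}\log N(n,\lambda),
\]

and a channel is said to have a capacity in the strong sense when these two functions coincide and do not depend on \(\lambda\).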

Author information

Correspondence to Christian Deppe.

Copyright information

© 2015 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Ahlswede, R., Ahlswede, A., Althöfer, I., Deppe, C., Tamm, U. (2015). On \(\lambda \)-Capacities and Information Stability. In: Ahlswede, A., Althöfer, I., Deppe, C., Tamm, U. (eds) Transmitting and Gaining Data. Foundations in Signal Processing, Communications and Networking, vol 11. Springer, Cham. https://doi.org/10.1007/978-3-319-12523-7_5

  • DOI: https://doi.org/10.1007/978-3-319-12523-7_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-12522-0

  • Online ISBN: 978-3-319-12523-7
