
On the capacity of the arbitrarily varying channel for maximum probability of error

  • I. Csiszár
  • J. Körner
Article

References

  1. Ahlswede, R.: A note on the existence of the weak capacity for channels with arbitrarily varying channel probability functions and its relation to Shannon's zero error capacity. Annals Math. Statist. 41, 1027–1033 (1970)
  2. Ahlswede, R.: Elimination of correlation in random codes for arbitrarily varying channels. Z. Wahrscheinlichkeitstheorie verw. Gebiete 44, 159–175 (1978)
  3. Ahlswede, R.: A method of coding and an application to arbitrarily varying channels. Journal of Combinatorics, Information and System Sciences 5, 10–35 (1980)
  4. Ahlswede, R., Wolfowitz, J.: The capacity of a channel with arbitrarily varying cpf. and binary output alphabet. Z. Wahrscheinlichkeitstheorie verw. Gebiete 15, 186–194 (1970)
  5. Csiszár, I.: Joint source-channel error exponent. Problems of Control and Information Theory 9, 315–328 (1980)
  6. Csiszár, I., Körner, J.: Graph decomposition: a new key to coding theorems. IEEE Trans. Information Theory 27, 5–12 (1981)
  7. Csiszár, I., Körner, J.: Information Theory: Coding Theorems for Discrete Memoryless Systems. New York: Academic Press, 1981
  8. Csiszár, I., Körner, J., Marton, K.: A new look at the error exponent of discrete memoryless channels. Preprint. Presented at the International Symposium on Information Theory, Cornell Univ., Ithaca, N.Y., 1977
  9. Dueck, G., Körner, J.: Reliability function of a discrete memoryless channel at rates above capacity. IEEE Trans. Information Theory 25, 82–85 (1979)
  10. Goppa, V.D.: Nonprobabilistic mutual information without memory (in Russian). Problems of Control and Information Theory 4, 97–102 (1975)
  11. Kiefer, J., Wolfowitz, J.: Channels with arbitrarily varying channel probability functions. Information and Control 5, 44–54 (1962)
  12. Körner, J., Sgarro, A.: Universally attainable error exponents for broadcast channels with degraded message sets. IEEE Trans. Information Theory 26, 670–679 (1980)
  13. Lovász, L.: On the Shannon capacity of a graph. IEEE Trans. Information Theory 25, 1–7 (1979)
  14. Pinsker, M.S.: Information and Information Stability of Random Variables and Processes (in Russian). Problemy Peredači Informacii Vol. 7, AN SSSR, Moscow, 1960. English translation: San Francisco: Holden-Day, 1964
  15. Shannon, C.E.: The zero error capacity of a noisy channel. IRE Trans. Information Theory 2, 8–19 (1956)
  16. Stambler, S.Z.: Shannon theorem for a full class of channels with state known at the output (in Russian). Problemy Peredači Informacii 14, no. 4, 3–12 (1975)
  17. Wolfowitz, J.: Coding Theorems of Information Theory, 3rd edition. Berlin-Heidelberg-New York: Springer 1978

Copyright information

© Springer-Verlag 1981

Authors and Affiliations

  • I. Csiszár
  • J. Körner

  1. Mathematical Institute of the Hungarian Academy of Sciences, Budapest, Hungary