
On Concepts of Performance Parameters for Channels

  • R. Ahlswede
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4123)

Abstract

Among the most investigated parameters for noisy channels are code size, error probability in decoding, and block length; rate, capacity, and the reliability function; delay and complexity of coding. There are several statements about connections between these quantities. They carry names like “coding theorem”, “converse theorem” (weak, strong, ...), “direct theorem”, “capacity theorem”, “lower bound”, “upper bound”, etc. There are analogous notions for source coding.
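
For orientation, these quantities are linked by standard definitions. In common textbook notation (our choice here, not necessarily the paper's), an (n, M, \lambda)-code has block length n, code size M, and maximal decoding error probability \lambda, and one sets

  R = \frac{\log M}{n} \quad \text{(rate)},
  C = \sup\{\, R : (n, 2^{nR}, \lambda_n)\text{-codes exist with } \lambda_n \to 0 \,\} \quad \text{(capacity)},
  E(R) = \limsup_{n \to \infty} -\tfrac{1}{n} \log \lambda^*(n, 2^{nR}) \quad \text{(reliability function)},

where \lambda^*(n, M) denotes the least achievable maximal error probability among codes of block length n and size M. In this vocabulary the weak converse asserts that \lambda_n \to 0 forces R \le C, while the strong converse (in Wolfowitz's sense) asserts that for every fixed \lambda \in (0, 1) the maximal achievable rate still tends to C.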

This note has become necessary after the author noticed that Information Theory suffers from a lack of precision in terminology. Its purpose is to open a discussion of this situation with the goal of gaining more clarity.

There is also some confusion concerning the scopes of analytical and combinatorial methods in probabilistic coding theory, particularly in the theory of identification. We present a covering (or approximation) lemma for hypergraphs, which in particular makes strong converse proofs in this area transparent and dramatically simplifies them.
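
As a rough orientation only (our paraphrase of a typical covering/approximation argument, not the paper's exact formulation): for a hypergraph \Gamma = (\mathcal{V}, \mathcal{E}) and an edge E \in \mathcal{E}, write P_E for the uniform distribution on E. For every \epsilon > 0 there exist vertices v_1, \dots, v_k \in E with

  k = O\left( \epsilon^{-2} \log |\mathcal{E}| \right)

such that the empirical distribution \bar{P} = \frac{1}{k} \sum_{i=1}^{k} \delta_{v_i} satisfies |\bar{P}(E') - P_E(E')| \le \epsilon simultaneously for all E' \in \mathcal{E}. One proves this by picking v_1, \dots, v_k i.i.d. according to P_E and combining a Chernoff–Hoeffding bound for each fixed E' with a union bound over the |\mathcal{E}| edges; replacing the large edge E by the small multiset \{v_1, \dots, v_k\} is what shortens strong converse proofs for identification.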

Keywords

Channel Capacity · Block Length · Product Channel · Code Size · Capacity Function

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • R. Ahlswede, Fakultät für Mathematik, Universität Bielefeld, Bielefeld, Germany
