Among the directions of research that could be established in GTIT-C, we present in Chapters I–IX the contributions of participants that took shape in written form. The papers were thoroughly refereed. For ease of reference they are numbered and additionally labelled with the letter B, which refers to this book.


Keywords: quantum channel, network code, broadcast channel, noisy channel, quantum information theory




  1. [C1] Althöfer, I., Cai, N., Dueck, G., Khachatrian, L.H., Pinsker, M., Sárközy, A., Wegener, I., Zhang, Z. (eds.): Numbers, Information and Complexity, special volume in honour of R. Ahlswede on occasion of his 60th birthday. Kluwer Academic Publishers, Boston (2000)
  2. [C2] Armstrong, E.H.: A method of reducing disturbances in radio signaling by a system of frequency modulation. Proc. Inst. Radio Eng. 24, 689–740 (1936)
  3. [C3] Beneš, V.E.: Optimal rearrangeable multistage connecting networks. Bell System Tech. J. 43, 1641–1656 (1964)
  4. [C4] Bennett, C.H., Bernstein, E., Brassard, G., Vazirani, U.V.: Strengths and weaknesses of quantum computing. SIAM J. on Computing 26(5), 1510–1523 (1997)
  5. [C5] Berlekamp, E.R.: Block coding with noiseless feedback. PhD thesis, MIT, Cambridge, MA (1964)
  6. [C6] Bergmans, P.P.: Random coding theorem for broadcast channels with degraded components. IEEE Trans. Inform. Theory 19(2), 197–207 (1973)
  7. [C7] Berger, T.: Rate Distortion Theory: A Mathematical Basis for Data Compression. Prentice-Hall, Englewood Cliffs (1971)
  8. [C8] Blackwell, D., Breiman, L., Thomasian, A.J.: The capacities of certain channel classes under random coding. Ann. Math. Statist. 31, 558–567 (1960)
  9. [C9] Blume, F.: Possible rates of entropy convergence. Ergodic Theory and Dynam. Systems 17(1), 45–70 (1997)
  10. [C10] Bradbury, J.W., Vehrencamp, S.L.: Principles of Animal Communication. Sinauer Associates, Sunderland, MA (1998)
  11. [C11] Carnap, R., Bar-Hillel, Y.: An Outline of a Theory of Semantic Information. In: Language and Information, Massachusetts (1964)
  12. [C12] Cherry, E.C.: On Human Communication: a Review, a Survey and a Criticism. MIT Press, Cambridge, 1957 (1966)
  13. [C13] Cherry, E.C.: A history of the theory of information. Proc. Inst. Elec. Eng. (London) 98(pt. 3), 383–393 (1951)
  14. [C14] Childs, A.M., Eisenberg, J.M.: Quantum algorithms for subset finding. E-print, quant-ph/0311038 (2003)
  15. [C15] Clos, C.: A study of non-blocking switching networks. Bell System Tech. J. 32, 406–424 (1953)
  16. [C16] Cover, T.M.: Broadcast channels. IEEE Trans. Inform. Theory 18, 2–14 (1972)
  17. [C17] Cover, T.M., Thomas, J.: Elements of Information Theory. Wiley, Chichester (1991)
  18. [C18] Csibi, S., van der Meulen, E.C.: Error probabilities for identification coding and least length single sequence hopping. In: Althöfer, I., Cai, N., Dueck, G., Khachatrian, L., Pinsker, M.S., Sárközy, A., Wegener, I., Zhang, Z. (eds.) Numbers, Information and Complexity, special volume in honour of R. Ahlswede on occasion of his 60th birthday, pp. 221–238. Kluwer Academic Publishers, Boston (2000)
  19. [C19] Csiszár, I., Körner, J.: Broadcast channels with confidential messages. IEEE Trans. Inform. Theory 24, 339–348 (1978)
  20. [C20] Csiszár, I., Körner, J.: Information Theory: Coding Theorems for Discrete Memoryless Systems. Probability and Mathematical Statistics. Academic Press, New York (1981)
  21. [C21] Davisson, L.D.: Universal noiseless coding. IEEE Trans. Inform. Theory IT-19(6) (1973)
  22. [C22] Diels, H., Plamböck, G.: Die Fragmente der Vorsokratiker. Rowohlt (1957)
  23. [C23] Dodunekov, S.M.: Optimization problems in coding theory. Survey presented at a Workshop on Combinatorial Search, Budapest, April 23–26 (2005)
  24. [C24] Doob, J.L.: Review of "A mathematical theory of communication". Mathematical Reviews 10, 133 (1949)
  25. [C25] Dueck, G.: Omnisophie: über richtige, wahre und natürliche Menschen. Springer, Heidelberg (2003)
  26. [C26] Dudley, H.: The vocoder. Bell Lab. Rec. 18, 122–126 (1939)
  27. [C27] Eigen, M.: Self-organization of matter and the evolution of biological macromolecules. Naturwissenschaften 58, 465 (1971)
  28. [C28] Eigen, M.: Wie entsteht Information? Ber. Bunsenges. Phys. Chem. 80, 1060 (1976)
  29. [C29] Eigen, M.: Macromolecular evolution: dynamical ordering in sequence space. Ber. Bunsenges. Phys. Chem. 89, 658 (1985)
  30. [C30] Eigen, M.: Sprache und Lernen auf molekularer Ebene. In: Peise, A., Mohler, K. (eds.) Der Mensch und seine Sprache, Berlin (1979)
  31. [C31] Einstein, A., Podolsky, B., Rosen, N.: Can quantum-mechanical description of physical reality be considered complete? Phys. Rev. 47, 777–780 (1935)
  32. [C32] Elias, P., Feinstein, A., Shannon, C.E.: A note on the maximum flow through a network. IRE Trans. Inform. Theory 2(4), 117–119 (1956)
  33. [C33] Estoup, J.B.: Gammes sténographiques, Paris (1916)
  34. [C34] Feder, M., Merhav, N., Gutman, M.: Universal prediction of individual sequences. IEEE Trans. Inform. Theory 38, 1258–1270 (1992)
  35. [C35] Ford, L.R., Fulkerson, D.R.: Flows in Networks. Princeton University Press, Princeton (1962)
  36. [C36] Gács, P., Körner, J.: Common information is far less than mutual information. Problems of Control and Information Theory/Problemy Upravlenija i Teorii Informacii 2(2), 149–162 (1973)
  37. [C37] Gallager, R.G.: Traffic capacity and coding for certain broadcast channels (in Russian). Problemy Peredači Informacii 10(3), 3–14 (1974)
  38. [C38] Gallager, R.G.: A perspective on multiaccess channels. IEEE Trans. Inform. Theory 31(2) (1985)
  39. [C39] Gilbert, E.N.: How good is Morse code? Inform. Contr. 14, 559–565 (1969)
  40. [C40] Grover, L.K.: A fast quantum mechanical algorithm for database search. In: Proceedings of the 28th Annual ACM Symposium on Theory of Computing (STOC), pp. 212–219 (1996)
  41. [C41] Grover, L.K.: Quantum mechanics helps in searching for a needle in a haystack. Phys. Rev. Letters 78(2), 325–328 (1997)
  42. [C42] Guerrerio, G.B.: Kurt Gödel, Biographie. Spektrum der Wissenschaft (Deutsche Ausgabe von Scientific American) (January 2002)
  43. [C43] Haemers, W.: On some problems of Lovász concerning the Shannon capacity of a graph. IEEE Trans. Inform. Theory 25(2), 231–232 (1979)
  44. [C44] Han, T.S., Amari, S.I.: Statistical inference under multiterminal data compression (Information Theory: 1948–1998). IEEE Trans. Inform. Theory 44(6), 2300–2324 (1998)
  45. [C45] Han, T.S., Verdú, S.: New results in the theory of identification via channels. IEEE Trans. Inform. Theory 38(1), 14–25 (1992)
  46. [C46] Haroutunian, E.A., Harutyunyan, A.N.: Successive refinement of information with reliability criterion. In: Proc. IEEE Int. Symp. Inform. Theory, Sorrento, Italy, June 2000, p. 205 (2000)
  47. [C47] Haroutunian, E.A.: Upper estimate of transmission rate for memoryless channel with countable number of output signals under given error probability exponent (in Russian). In: 3rd All-Union Conf. on Theory of Information Transmission and Coding, Uzhgorod; Publishing House of the Uzbek Academy of Sciences, Tashkent, pp. 83–86 (1967)
  48. [C48] Hartley, R.V.L.: Transmission of information. Bell Syst. Tech. J. 7, 535–563 (1928)
  49. [C49] Hauser, M.D.: The Evolution of Communication. MIT Press, Cambridge (1997)
  50. [C50] Holevo, A.S.: Problems in the mathematical theory of quantum communication channels. Rep. Math. Phys. 12(2), 273–278 (1977)
  51. [C51] Hollmann, H.D.L., van Lint, J.H., Lennartz, J.P., Tolhuizen, L.M.G.M.: On codes with the identifiable parent property. J. Combin. Theory Ser. A 82(2), 121–133 (1998)
  52. [C52] Horodecki, M.: Is the classical broadcast channel additive? Oral communication, Cambridge, England (December 2004)
  53. [C53] Jaggi, S., Sanders, P., Chou, P.A., Effros, M., Egner, S., Jain, K., Tolhuizen, L.: Polynomial time algorithms for multicast network code construction. IEEE Trans. Inform. Theory 51(6), 1973–1982 (2005)
  54. [C54] Katona, G., Tichler, K.: When the lie depends on the target. In: Workshop on Combinatorial Search, Budapest, April 23–26 (2005)
  55. [C55] Kautz, W., Singleton, R.: Nonrandom binary superimposed codes. IEEE Trans. Inform. Theory 10, 363–377 (1964)
  56. [C56] Körner, J., Marton, K.: General broadcast channels with degraded message sets. IEEE Trans. Inform. Theory 23, 60–64 (1977)
  57. [C57] Körner, J.: Some methods in multi-user communication: a tutorial survey. In: Longo, G. (ed.) Information Theory, New Trends and Open Problems. CISM Courses and Lectures, vol. 219, pp. 173–224. Springer, Wien (1975)
  58. [C58] Koetter, R., Médard, M.: An algebraic approach to network coding. IEEE/ACM Transactions on Networking 11(5), 782–795 (2003)
  59. [C59] Kolmogorov, A.N.: Logical basis for information theory and probability theory. IEEE Trans. Inform. Theory 14, 663 (1968)
  60. [C60] Kolmogorov, A.N.: Three approaches to the quantitative definition of information. Internat. J. Comput. Math. 2, 157–168 (1968)
  61. [C61] Koutsoupias, E., Papadimitriou, C.: Worst-case equilibria. In: Proceedings of the 16th Symposium on Theoretical Aspects of Computer Science (STACS), pp. 404–413 (1999)
  62. [C62] Küpfmüller, K.: Über Einschwingvorgänge in Wellenfiltern. Elek. Nachrichtentech. 1, 141–152 (1924)
  63. [C63] Küppers, B.O.: Der semantische Aspekt von Information und seine evolutionsbiologische Bedeutung. Nova Acta Leopoldina 72(294), 195–219 (1996)
  64. [C64] Carson, J.R.: Notes on the theory of modulation. Proc. Inst. Radio Eng. 10, 57–69 (1922)
  65. [C65] Li, M., Vitányi, P.: An Introduction to Kolmogorov Complexity and Its Applications, 2nd edn. Springer, Heidelberg (1997)
  66. [C66] Löber, P.: Quantum channels and simultaneous ID coding. Dissertation, Universität Bielefeld, Germany (1999)
  67. [C67] Lüders, G.: Über die Zustandsänderung durch den Meßprozeß. Ann. d. Physik 8, 322–328 (1951)
  68. [C68] Massey, J.L.: Guessing and entropy. In: Proc. IEEE Int. Symp. Inform. Theory, p. 204 (1994)
  69. [C69] McEliece, R., Posner, E.C.: Hiding and covering in a compact metric space. Ann. Statist. 1, 729–739 (1973)
  70. [C70] Monod, J.: Zufall und Notwendigkeit, München (1971)
  71. [C71] Nowak, M.A., Krakauer, D.C.: The evolution of language. PNAS 96(14), 8028–8033 (1999)
  72. [C72] Nyquist, H.: Certain factors affecting telegraph speed. Bell Syst. Tech. J. 3, 324–352 (1924)
  73. [C73] Briefwechsel: Carl Friedrich Gauss und Friedrich Wilhelm Bessel. Georg Olms, Hildesheim (1975)
  74. [C74] Pierce, J.R.: The early days of information theory. IEEE Trans. Inform. Theory IT-19(1), 3–8 (1973)
  75. [C75] Polanyi, M.: Life's irreducible structure. Science 160, 1308 (1968)
  76. [C76] Ratner, V.A.: Molekulargenetische Steuerungssysteme, Stuttgart (1977)
  77. [C77] Rényi, A.: Probability Theory, Amsterdam (1970)
  78. [C78] Sanders, P., Egner, S., Tolhuizen, L.: Polynomial time algorithms for network information flow. In: Proceedings of the 15th Annual ACM Symposium on Parallel Algorithms and Architectures, San Diego, California, USA, pp. 286–294 (2003)
  79. [C79] Schiller, F.: Die Verschwörung des Fiesco zu Genua (1783)
  80. [C80] Schroeder, M.: Fractals, Chaos, Power Laws. W.H. Freeman, New York (1991)
  81. [C81] Schrödinger, E.: What is Life?, Cambridge (1944)
  82. [C82] Schumacher, B.: Quantum coding. Phys. Rev. A 51, 2738–2747 (1995)
  83. [C83] Shannon, C.E.: A mathematical theory of communication. Bell Syst. Tech. J. 27, 379–423, 623–656 (1948)
  84. [C84] Shannon, C.E.: Prediction and entropy of printed English. Bell Syst. Tech. J. 30, 50–64 (1951)
  85. [C85] Shannon, C.E.: The bandwagon. IRE Transactions on Information Theory 2 (1956)
  86. [C86] Shannon, C.E.: Certain results in coding theory for noisy channels. Inform. and Control 1, 6–25 (1957)
  87. [C87] Shannon, C.E.: Memory requirements in a telephone exchange. Bell System Tech. J. 29, 343–349 (1950)
  88. [C88] Slepian, D.: Two theorems on a particular crossbar switching network. Unpublished manuscript (1952)
  89. [C89] Slepian, D., Wolf, J.: Noiseless coding of correlated information sources. IEEE Trans. Inform. Theory 19, 471–480 (1973)
  90. [C90] Steinberg, Y.: New converses in the theory of identification via channels. IEEE Trans. Inform. Theory 44(3), 984–998 (1998)
  91. [C91] Steinberg, Y., Merhav, N.: Identification in the presence of side information with application to watermarking. IEEE Trans. Inform. Theory 47, 1410–1422 (2001)
  92. [C92] Takens, F., Verbitski, E.: Generalized entropies: Rényi and correlation integral approach. Nonlinearity 11(4), 771–782 (1998)
  93. [C93] Tuncel, E., Rose, K.: Error exponents in scalable source coding. IEEE Trans. Inform. Theory 49, 289–296 (2003)
  94. [C94] Venkatesan, S., Anantharam, V.: The common randomness capacity of a pair of independent discrete memoryless channels. IEEE Trans. Inform. Theory 44(1), 215–224 (1998)
  95. [C95] Venkatesan, S., Anantharam, V.: The common randomness capacity of a network of discrete memoryless channels. IEEE Trans. Inform. Theory 46(2), 367–387 (2000)
  96. [C96] Verdú, S., Wei, V.: Explicit construction of constant-weight codes for identification via channels. IEEE Trans. Inform. Theory 39, 30–36 (1993)
  97. [C97] van der Meulen, E.C.: Random coding theorems for the general discrete memoryless broadcast channel. IEEE Trans. Inform. Theory 21, 180–190 (1975)
  98. [C98] von Neumann, J.: The Computer and the Brain. Yale University Press (1958)
  99. [C99] von Neumann, J., Morgenstern, O.: Theory of Games and Economic Behavior. Princeton University Press, Princeton (1944)
  100. [C100] von Weizsäcker, C.F.: Die Einheit der Natur, München (1971)
  101. [C101] Wiener, N.: The theory of communication. Phys. Today 3, 31–32 (1950)
  102. [C102] Wiener, N.: Extrapolation, Interpolation, and Smoothing of Stationary Time Series. Technology Press of the Massachusetts Institute of Technology, Cambridge; Wiley, New York; Chapman & Hall, London (1949)
  103. [C103] Wilmink, R.: Quantum broadcast channels and cryptographic applications for separable states. Dissertation, Universität Bielefeld, Germany, 69 pages (2003)
  104. [C104] Winter, A.: Coding theorems of quantum information theory. Dissertation, Universität Bielefeld, Germany (1999)
  105. [C105] Winter, A.: The capacity region of the quantum multiple access channel. E-print, quant-ph/9807019 (1998); IEEE Trans. Inform. Theory 47(7), 3059–3065 (2001)
  106. [C106] Winter, A.: Quantum and classical message identification via quantum channels. E-print, quant-ph/0401060 (2004)
  107. [C107] Wolfowitz, J.: Coding Theorems of Information Theory. Ergebnisse der Mathematik und ihrer Grenzgebiete, Heft 31. Springer-Verlag, Berlin-Göttingen-Heidelberg; Prentice-Hall, Englewood Cliffs (1961); 3rd edn. (1978)
  108. [C108] Wyner, A.D.: The wire-tap channel. Bell Syst. Tech. J. 54, 1355–1387 (1975)
  109. [C109] Wyner, A.D., Ziv, J.: A theorem on the entropy of certain binary sequences and applications II. IEEE Trans. Inform. Theory 19, 772–777 (1973)
  110. [C110] Yeung, R.W.: A First Course in Information Theory. Information Technology: Transmission, Processing and Storage. Kluwer Academic/Plenum Publishers, New York (2002)
  111. [C111] Zipf, G.K.: Human Behavior and the Principle of Least Effort. Addison-Wesley, Cambridge (1949)
  112. [C112] Ziv, J.: Back from infinity: a constrained resources approach to information theory. IEEE Information Theory Society Newsletter 48(1) (1998)

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Rudolf Ahlswede, Fakultät für Mathematik, Universität Bielefeld, Bielefeld, Germany
