
The Slepian-Wolf Theorem for Individual Sequences

  • Rudolf Ahlswede
Chapter
Part of the Foundations in Signal Processing, Communications and Networking book series (SIGNAL, volume 15)

Abstract

After our work on arbitrarily varying channels (AVC) with the elimination technique as a first major breakthrough [1], it became clear that the method should also work for multi-way AVC and for systems of correlated arbitrarily varying sources (AVS); the case of a single AVS is comparatively easy and was already known (see chapter). Three of our Ph.D. students addressed these problems: Klemisch-Ahlert addressed the multiple access channel (MAC), while Jahn started with discrete memoryless correlated sources (DMCS, the Slepian-Wolf model) and extended most known coding theorems for multi-way channels to the arbitrarily varying case. His work culminated in a simpler proof of Marton’s lower bound for the broadcast channel capacity region, together with its improvement to the arbitrarily varying case. All these results were modulo the “positivity problem”, which was settled much later, even for one-way channels.
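For orientation, standard background not stated in the abstract itself: for a discrete memoryless correlated source (X, Y), the Slepian-Wolf theorem says that two separate encoders with a joint decoder achieve asymptotically vanishing error probability exactly for rate pairs satisfying

  R_X ≥ H(X | Y),   R_Y ≥ H(Y | X),   R_X + R_Y ≥ H(X, Y),

so the achievable sum rate can be as low as the joint entropy H(X, Y), the same as with joint encoding.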

References

  1. R. Ahlswede, Elimination of correlation in random codes for arbitrarily varying channels. Z. Wahrscheinlichkeitstheorie u. verw. Geb. 44, 159–197 (1978)
  2. R. Ahlswede, Coloring hypergraphs: a new approach to multi-user source coding I. J. Comb. Inf. Syst. Sci. 4(1), 76–115 (1979)
  3. R. Ahlswede, Coloring hypergraphs: a new approach to multi-user source coding II. J. Comb. Inf. Syst. Sci. 5(3), 220–268 (1980)
  4. G. Dueck, L. Wolters, Ergodic theory and encoding of individual sequences. Probl. Control. Inf. Theory 14(5), 329–345 (1985)
  5. J. Ziv, Coding theorems for individual sequences. IEEE Trans. Inf. Theory IT-24, 405–412 (1978)
  6. D. Slepian, J.K. Wolf, Noiseless coding of correlated information sources. IEEE Trans. Inf. Theory IT-19, 471–480 (1973)
  7. J. Ziv, Fixed-rate encoding of individual sequences with side information. IEEE Trans. Inf. Theory IT-30, 348–352 (1984)
  8. M. Denker, C. Grillenberger, K. Sigmund, Ergodic Theory on Compact Spaces, Lecture Notes in Mathematics (Springer, Berlin, 1976)

Further Readings for Part II

  9. R. Ahlswede, A method of coding and an application to arbitrarily varying channels. J. Comb. Inf. Syst. Sci. 5(1), 10–35 (1980)
  10. R. Ahlswede, Storing and Transmitting Data, Rudolf Ahlswede’s Lectures on Information Theory 1, ed. by A. Ahlswede, I. Althöfer, C. Deppe, U. Tamm, Foundations in Signal Processing, Communications and Networking, vol. 10 (Springer, Berlin, 2014)
  11. P. Billingsley, Ergodic Theory and Information (Wiley, New York, 1965)
  12. D. Blackwell, L. Breiman, A.J. Thomasian, The capacity of a class of channels. Ann. Math. Stat. 30, 1229–1241 (1959)
  13. T.M. Cover, A proof of the data compression theorem of Slepian-Wolf for ergodic sources. IEEE Trans. Inf. Theory IT-21, 226–228 (1975)
  14. I. Csiszár, J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems (Akadémiai Kiadó, Budapest, 1981)
  15. L.D. Davisson, Universal noiseless coding. IEEE Trans. Inf. Theory IT-19, 783–795 (1973)
  16. G. Dueck, Die topologische Entropie von Mengen generischer Punkte [The topological entropy of sets of generic points], Diploma thesis, Göttingen, 1975
  17. G. Dueck, L. Wolters, The Slepian-Wolf theorem for individual sequences. Probl. Control. Inf. Theory 14(6), 437–450 (1985)
  18. R.G. Gallager, Information Theory and Reliable Communication (Wiley, New York, 1968)
  19. R.G. Gallager, Source coding with side information and universal coding, unpublished (1976)
  20. R.M. Gray, L.D. Davisson, Source coding theorems without the ergodic assumption. IEEE Trans. Inf. Theory IT-20, 502–516 (1974)
  21. R.M. Gray, L.D. Davisson, The ergodic decomposition of stationary discrete random processes. IEEE Trans. Inf. Theory IT-20, 625–636 (1974)
  22. J.H. Jahn, Kodierung beliebig variierender korrelierter Quellen [Coding of arbitrarily varying correlated sources], Ph.D. thesis, Bielefeld, 1978
  23. A. Lempel, J. Ziv, On the complexity of finite sequences. IEEE Trans. Inf. Theory IT-22, 75–81 (1976)
  24. V.A. Rohlin, On the fundamental ideas of measure theory. Am. Math. Soc. Transl. 71 (1952)
  25. V.A. Rohlin, Lectures on the entropy theory of measure-preserving transformations. Russ. Math. Surv. 22(5), 1–52 (1967)
  26. J. Ziv, A. Lempel, A universal algorithm for sequential data compression. IEEE Trans. Inf. Theory IT-23, 337–343 (1977)

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Bielefeld, Germany
