Appendix: On Common Information and Related Characteristics of Correlated Information Sources

  • R. Ahlswede
  • J. Körner
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4123)

Abstract

This is a literal copy of a manuscript from 1974; only the references have been updated. It contains a critical discussion of the then-recent concepts of "common information" and also suggests alternative definitions. (Compare pages 402–405 in the book by I. Csiszár and J. Körner, "Information Theory: Coding Theorems for Discrete Memoryless Systems", Akadémiai Kiadó, Budapest 1981.) One of our definitions gave rise to the now well-known source coding problem for two helpers (formulated under 2.) on page 7).
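For orientation, the two notions of "common information" under discussion are, in their standard formulations (following [1] and [4] in the reference list below; the notation is supplied here and is not quoted from the manuscript):

    K(X;Y) = \max_{f,g:\; f(X)=g(Y) \text{ a.s.}} H\bigl(f(X)\bigr)    (Gács–Körner common information, [1])

    C(X;Y) = \min_{W:\; X - W - Y \text{ Markov}} I(X,Y;W)             (Wyner common information, [4])

These satisfy K(X;Y) \le I(X;Y) \le C(X;Y), and [1] shows that the first inequality is typically strict, i.e., common information in the first sense is far less than mutual information; the gap between the two notions is what motivates the alternative definitions discussed here.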

More importantly, an extension of one concept to "common information with list knowledge" has recently turned out to play a key role in analyzing the contribution of a correlated source to the identification capacity of a channel (R. Ahlswede and V. Balakirsky, "Identification under Random Processes", invited paper in honor of Mark Pinsker, Sept. 1995).

Thus the old ideas have now led to concepts of operational significance and are therefore made accessible here.

Keywords

Mutual Information · Related Characteristic · Side Information · Dependent Random Variable · Common Information

References

  1. Gács, P., Körner, J.: Common information is far less than mutual information. Problems of Control and Information Theory 2, 149–162 (1973)
  2. Witsenhausen, H.S.: On sequences of pairs of dependent random variables. SIAM J. Appl. Math. 28, 100–113 (1975)
  3. Gray, R.M., Wyner, A.D.: Source coding for a simple network. Bell System Techn. J. (December 1974)
  4. Wyner, A.D.: The common information of two dependent random variables. IEEE Trans. Inform. Theory IT-21, 163–179 (1975)
  5. Ahlswede, R., Körner, J.: Source coding with side information and a converse for degraded broadcast channels. IEEE Trans. Inform. Theory IT-21(6), 629–637 (1975)
  6. Slepian, D., Wolf, J.K.: Noiseless coding of correlated information sources. IEEE Trans. Inform. Theory IT-19, 471–480 (1973)
  7. Ahlswede, R., Körner, J.: On the connection between the entropies of input and output distributions of discrete memoryless channels. In: Proceedings of the 5th Conference on Probability Theory, Brasov (1974), Editura Academiei Rep. Soc. Romania, Bucharest, pp. 13–23 (1977)
  8. Ahlswede, R.: Multi-way communication channels. In: Proc. 2nd Int. Symp. Inform. Theory, Tsahkadsor, Armenian S.S.R., pp. 23–52 (1971); Publishing House of the Hungarian Academy of Sciences (1973)
  9. Gallager, R.G.: Information Theory and Reliable Communication. Wiley and Sons, New York (1968)
  10. Wyner, A.D.: On source coding with side information at the decoder. IEEE Trans. Inform. Theory IT-21(3) (May 1975)
  11. Ahlswede, R., Gács, P., Körner, J.: Bounds on conditional probabilities with applications in multi-user communication. Zeitschr. für Wahrscheinlichkeitstheorie und verw. Gebiete 34, 157–177 (1976)
  12. Ahlswede, R., Gács, P.: Spreading of sets in product spaces and hypercontraction of the Markov operator. Ann. of Probability 4(6), 925–939 (1976)

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • R. Ahlswede (1)
  • J. Körner (2)
  1. Fakultät für Mathematik, Universität Bielefeld, Bielefeld, Germany
  2.
