Appendix: On Common Information and Related Characteristics of Correlated Information Sources
This is a literal copy of a manuscript from 1974; the references have been updated. It contains a critical discussion of then-recent concepts of “common information” and also suggests alternative definitions. (Compare pages 402–405 in the book by I. Csiszár and J. Körner, “Information Theory: Coding Theorems for Discrete Memoryless Systems”, Akadémiai Kiadó, Budapest, 1981.) One of our definitions gave rise to the now well-known source coding problem for two helpers (formulated in 2.) on page 7).
More importantly, an extension of one concept to “common information with list knowledge” has recently (R. Ahlswede and V. Balakirsky, “Identification under Random Processes”, invited paper in honor of Mark Pinsker, Sept. 1995) turned out to play a key role in analyzing the contribution of a correlated source to the identification capacity of a channel.
Thus the old ideas have now led to concepts of operational significance and are therefore made accessible here.
Keywords: Mutual Information · Related Characteristics · Side Information · Dependent Random Variables · Common Information