Multisource Algorithmic Information Theory

  • Alexander Shen
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3959)


Multisource information theory in the Shannon setting is well known. In this article we try to develop its algorithmic information theory counterpart and use it as a general framework for many interesting questions about Kolmogorov complexity.
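To recall the parallel the abstract alludes to, here is a standard illustration (a well-known fact, not a result specific to this paper): the Kolmogorov–Levin symmetry of information is the algorithmic counterpart of the chain rule for Shannon entropy.

    H(X, Y) = H(X) + H(Y | X)                       (Shannon entropy, chain rule)
    K(x, y) = K(x) + K(y | x) + O(log K(x, y))      (Kolmogorov complexity analogue)

Many Shannon-theoretic identities and inequalities have algorithmic analogues of this kind that hold up to logarithmic precision, which is what makes a multisource framework for Kolmogorov complexity plausible.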


Keywords (machine-generated): Output Node, Kolmogorov Complexity, Input String, Common Information, Short Program





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Alexander Shen (1)
  1. Laboratoire Poncelet (CNRS, Institute for Information Transmission Problems), Moscow; LIF CNRS, Marseille
