Shannon Entropy vs. Kolmogorov Complexity

  • An. Muchnik
  • N. Vereshchagin
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3967)


Most assertions involving Shannon entropy have Kolmogorov complexity counterparts. A general theorem of Romashchenko [4] states that every information inequality that is valid in Shannon’s theory is also valid in Kolmogorov’s theory, and vice versa. In this paper we prove that this correspondence no longer holds for ∀∃-assertions, exhibiting the first example where the formal analogy between Shannon entropy and Kolmogorov complexity fails.
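The information inequalities the abstract refers to can be illustrated on the Shannon side with a small sketch (not from the paper; the joint distribution below is an arbitrary example): subadditivity, H(X) + H(Y) ≥ H(X,Y), is one such inequality, and by Romashchenko's theorem its Kolmogorov counterpart K(x) + K(y) ≥ K(x,y) − O(log n) holds as well.

```python
# Illustrative sketch: verify the information inequality
# H(X) + H(Y) >= H(X, Y) on a toy joint distribution of two
# correlated bits. The gap is the mutual information I(X:Y).
from math import log2

# Joint distribution {(x, y): probability} -- an arbitrary example.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(dist):
    """Shannon entropy (in bits) of a distribution {outcome: prob}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Marginal distributions of X and Y.
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

# Subadditivity: H(X) + H(Y) >= H(X, Y).
assert H(px) + H(py) >= H(joint) - 1e-12
print(round(H(px) + H(py) - H(joint), 4))  # mutual information, ~0.2781
```

The same check can be run for any joint distribution; the inequality always holds, and this is exactly the kind of universally valid linear inequality covered by Romashchenko's transfer theorem.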






References

  1. Sipser, M.: Expanders, randomness, or time versus space. J. Comput. System Sci. 36, 379–383 (1988)
  2. Bennett, C.H., Gács, P., Li, M., Vitányi, P., Zurek, W.: Information distance. IEEE Transactions on Information Theory 44(4), 1407–1423 (1998)
  3. Chernov, A., Muchnik, A., Romashchenko, A., Shen, A., Vereshchagin, N.: Upper semi-lattice of binary strings with the relation ‘x is simple conditional to y’. Theoretical Computer Science 271, 69–95 (2002); preliminary version in: 14th Annual IEEE Conference on Computational Complexity, Atlanta, May 4–6, pp. 114–122 (1999)
  4. Hammer, D., Romashchenko, A., Shen, A., Vereshchagin, N.: Inequalities for Shannon entropy and Kolmogorov complexity. Journal of Computer and System Sciences 60, 442–464 (2000)
  5. Li, M., Vitányi, P.M.B.: An Introduction to Kolmogorov Complexity and Its Applications, 2nd edn. Springer, New York (1997)
  6. Kolmogorov, A.N.: Three approaches to the quantitative definition of information. Problems Inform. Transmission 1(1), 1–7 (1965)
  7. Slepian, D., Wolf, J.K.: Noiseless coding of correlated information sources. IEEE Trans. Inform. Theory IT-19, 471–480 (1973)
  8. Muchnik, A.A.: Conditional complexity and codes. Theoretical Computer Science 271, 97–109 (2002)
  9. Shannon, C.E.: A mathematical theory of communication. Bell Sys. Tech. J. 27, 379–423 and 623–656 (1948)
  10. Solomonoff, R.J.: A formal theory of inductive inference, Part 1 and Part 2. Information and Control 7, 1–22 and 224–254 (1964)
  11. Uspensky, V.A., Shen, A.: Relations between varieties of Kolmogorov complexities. Mathematical Systems Theory 29(3), 271–292 (1996)

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • An. Muchnik (1)
  • N. Vereshchagin (2)
  1. Institute of New Technologies, Moscow, Russia
  2. Department of Mathematical Logic and Theory of Algorithms, Faculty of Mechanics and Mathematics, Moscow State University, Moscow, Russia
