Monotone Conditional Complexity Bounds on Future Prediction Errors

  • Alexey Chernov
  • Marcus Hutter
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3734)


We bound the future loss when predicting any (computably) stochastic sequence online. Solomonoff finitely bounded the total deviation of his universal predictor M from the true distribution μ by the algorithmic complexity of μ. Here we assume that we are at a time t > 1 and have already observed x = x_1...x_t. We bound the future prediction performance on x_{t+1} x_{t+2}... by a new variant of the algorithmic complexity of μ given x, plus the complexity of the randomness deficiency of x. The new complexity is monotone in its condition, in the sense that it can only decrease if the condition is prolonged. We also briefly discuss potential generalizations to Bayesian model classes and to classification problems.
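For orientation, Solomonoff's total bound that the abstract takes as its starting point is commonly stated roughly as follows; this is a sketch for the binary alphabet, and the exact constant and complexity variant differ between formulations, so it should not be read as the paper's precise statement:

```latex
% Sketch of Solomonoff's bound on the total expected squared
% prediction error of the universal predictor M relative to a
% computable true measure mu (binary alphabet). K(mu) denotes a
% prefix complexity of mu; the constant depends on the variant used.
\sum_{t=1}^{\infty} \mathbf{E}_{\mu}\!\left[
    \bigl( M(0 \mid x_{<t}) - \mu(0 \mid x_{<t}) \bigr)^{2}
\right]
\;\le\; \frac{\ln 2}{2}\, K(\mu)
```

The paper's contribution is a time-t analogue: the remaining (future) loss after observing x_1...x_t is bounded by a monotone conditional complexity of μ given the observed prefix, plus a term for the randomness deficiency of that prefix.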


Keywords: Turing Machine, Kolmogorov Complexity, Computable Measure, Input Tape, Future Loss




  1. [CV05]
    Cilibrasi, R., Vitányi, P.M.B.: Clustering by compression. IEEE Trans. Information Theory 51(4), 1523–1545 (2005)
  2. [HM04]
    Hutter, M., Muchnik, A.: Universal convergence of semimeasures on individual random sequences. In: Ben-David, S., Case, J., Maruoka, A. (eds.) ALT 2004. LNCS (LNAI), vol. 3244, pp. 234–248. Springer, Heidelberg (2004)
  3. [Hut01]
    Hutter, M.: Convergence and error bounds for universal prediction of nonbinary sequences. In: Proc. 12th European Conference on Machine Learning (ECML-2001), pp. 239–250 (December 2001)
  4. [Hut03a]
    Hutter, M.: Convergence and loss bounds for Bayesian sequence prediction. IEEE Trans. on Information Theory 49(8), 2061–2067 (2003)
  5. [Hut03b]
    Hutter, M.: Optimality of universal Bayesian prediction for general loss and alphabet. Journal of Machine Learning Research 4, 971–1000 (2003)
  6. [Hut03c]
    Hutter, M.: Sequence prediction based on monotone complexity. In: Schölkopf, B., Warmuth, M.K. (eds.) COLT/Kernel 2003. LNCS (LNAI), vol. 2777, pp. 506–521. Springer, Heidelberg (2003)
  7. [Hut04]
    Hutter, M.: Universal Artificial Intelligence: Sequential Decisions based on Algorithmic Probability, 300 pages. Springer, Berlin (2004)
  8. [LV97]
    Li, M., Vitányi, P.M.B.: An Introduction to Kolmogorov Complexity and Its Applications, 2nd edn. Springer, Heidelberg (1997)
  9. [PH04]
    Poland, J., Hutter, M.: Convergence of discrete MDL for sequential prediction. In: Shawe-Taylor, J., Singer, Y. (eds.) COLT 2004. LNCS (LNAI), vol. 3120, pp. 300–314. Springer, Heidelberg (2004)
  10. [Sch00]
    Schmidhuber, J.: Algorithmic theories of everything. Report IDSIA-20-00, quant-ph/0011122, IDSIA, Manno (Lugano), Switzerland (2000)
  11. [Sch02a]
    Schmidhuber, J.: Hierarchies of generalized Kolmogorov complexities and nonenumerable universal measures computable in the limit. International Journal of Foundations of Computer Science 13(4), 587–612 (2002)
  12. [Sch02b]
    Schmidhuber, J.: The speed prior: A new simplicity measure yielding near-optimal computable predictions. In: Kivinen, J., Sloan, R.H. (eds.) COLT 2002. LNCS (LNAI), vol. 2375, pp. 216–228. Springer, Heidelberg (2002)
  13. [Sol64]
    Solomonoff, R.J.: A formal theory of inductive inference: Part 1 and 2. Inform. Control 7, 1–22, 224–254 (1964)
  14. [Sol78]
    Solomonoff, R.J.: Complexity-based induction systems: comparisons and convergence theorems. IEEE Trans. Information Theory IT-24, 422–432 (1978)
  15. [US96]
    Uspensky, V.A., Shen, A.: Relations between varieties of Kolmogorov complexities. Math. Systems Theory 29, 271–292 (1996)
  16. [VSU05]
    Vereshchagin, N.K., Shen, A., Uspensky, V.A.: Lecture Notes on Kolmogorov Complexity. Unpublished (2005)
  17. [ZL70]
    Zvonkin, A.K., Levin, L.A.: The complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms. Russian Mathematical Surveys 25(6), 83–124 (1970)

Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Alexey Chernov (1)
  • Marcus Hutter (1)
  1. IDSIA, Manno-Lugano, Switzerland
