Sequence Prediction Based on Monotone Complexity

  • Marcus Hutter
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2777)

Abstract

This paper studies sequence prediction based on the monotone Kolmogorov complexity Km = −log m, i.e. based on universal deterministic/one-part MDL. m is extremely close to Solomonoff’s prior M, the latter being an excellent predictor in deterministic as well as probabilistic environments, where performance is measured in terms of convergence of posteriors or losses. Despite this closeness to M, it is difficult to assess the prediction quality of m, since little is known about the closeness of their posteriors, which are the quantities that matter for prediction. We show that for deterministic computable environments, the “posterior” and losses of m converge, but rapid convergence can be shown only on-sequence; the off-sequence behavior remains unclear. In probabilistic environments, neither the posterior nor the losses converge, in general.
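
The objects M and m in the abstract are incomputable, but the contrast the abstract draws — predicting with a Bayes mixture versus predicting with the single shortest/most probable hypothesis (one-part MDL) — can be illustrated on a finite toy class. The following sketch is an assumption-laden analogy, not the paper's construction: it replaces the universal prior with a uniform prior over a small grid of Bernoulli hypotheses, uses a mixture predictor in the role of M and a MAP/MDL predictor in the role of m, and compares their next-bit probabilities.

```python
# Toy analogy (NOT the paper's setting): Bayes-mixture prediction (role of
# Solomonoff's M) vs. one-part MDL/MAP prediction (role of m = 2^-Km) over
# a finite class of Bernoulli hypotheses. The hypothesis grid, the uniform
# prior, and all function names are illustrative assumptions.

thetas = [0.1 * k for k in range(1, 10)]      # candidate biases P(next bit = 1)
priors = [1.0 / len(thetas)] * len(thetas)    # uniform prior (stand-in for 2^-K)

def likelihood(theta, seq):
    """Probability that Bernoulli(theta) generates the bit sequence seq."""
    p = 1.0
    for b in seq:
        p *= theta if b == 1 else (1 - theta)
    return p

def mixture_predict(seq):
    """Posterior-weighted prediction: sum_i w_i P_i(seq) theta_i / sum_i w_i P_i(seq)."""
    weights = [w * likelihood(t, seq) for w, t in zip(priors, thetas)]
    z = sum(weights)
    return sum(w * t for w, t in zip(weights, thetas)) / z

def mdl_predict(seq):
    """One-part MDL/MAP: the single a-posteriori best hypothesis predicts alone."""
    _, best_theta = max(
        ((w * likelihood(t, seq), t) for w, t in zip(priors, thetas)),
        key=lambda pair: pair[0],
    )
    return best_theta

seq = [1, 1, 0, 1, 1, 1, 0, 1]   # 6 ones out of 8 observed bits
print("mixture:", mixture_predict(seq))
print("MDL/MAP:", mdl_predict(seq))
```

On long sequences from a fixed source both predictors agree in this toy class; the paper's negative results concern the far subtler question of whether such agreement survives for the universal objects M and m off-sequence and in probabilistic environments.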


References

  1. BC91.
    Barron, A.R., Cover, T.M.: Minimum complexity density estimation. IEEE Transactions on Information Theory 37, 1034–1054 (1991)
  2. Gác83.
    Gács, P.: On the relation between descriptional complexity and algorithmic probability. Theoretical Computer Science 22, 71–93 (1983)
  3. Hut01a.
    Hutter, M.: Convergence and error bounds of universal prediction for general alphabet. In: Flach, P.A., De Raedt, L. (eds.) ECML 2001. LNCS (LNAI), vol. 2167, pp. 239–250. Springer, Heidelberg (2001)
  4. Hut01b.
    Hutter, M.: New error bounds for Solomonoff prediction. Journal of Computer and System Sciences 62(4), 653–667 (2001)
  5. Hut02.
    Hutter, M.: Convergence and loss bounds for Bayesian sequence prediction. Technical Report IDSIA-09-01, IDSIA, Manno (Lugano), Switzerland (2002), http://arxiv.org/abs/cs.LG/0301014
  6. KV86.
    Kumar, P.R., Varaiya, P.P.: Stochastic Systems: Estimation, Identification, and Adaptive Control. Prentice Hall, Englewood Cliffs (1986)
  7. Lev73.
    Levin, L.A.: On the notion of a random sequence. Soviet Math. Dokl. 14(5), 1413–1416 (1973)
  8. LV97.
    Li, M., Vitányi, P.M.B.: An Introduction to Kolmogorov Complexity and Its Applications, 2nd edn. Springer, Heidelberg (1997)
  9. Sch00.
    Schmidhuber, J.: Algorithmic theories of everything. Report IDSIA-20-00, quant-ph/0011122, IDSIA, Manno (Lugano), Switzerland (2000)
  10. Sol64.
    Solomonoff, R.J.: A formal theory of inductive inference: Parts 1 and 2. Information and Control 7, 1–22, 224–254 (1964)
  11. Sol78.
    Solomonoff, R.J.: Complexity-based induction systems: Comparisons and convergence theorems. IEEE Transactions on Information Theory IT-24, 422–432 (1978)
  12. VW98.
    Vovk, V.G., Watkins, C.: Universal portfolio selection. In: Proceedings of the 11th Annual Conference on Computational Learning Theory (COLT 1998), pp. 12–23. ACM Press, New York (1998)
  13. ZL70.
    Zvonkin, A.K., Levin, L.A.: The complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms. Russian Mathematical Surveys 25(6), 83–124 (1970)

Copyright information

© Springer-Verlag Berlin Heidelberg 2003

Authors and Affiliations

  • Marcus Hutter
    1. IDSIA, Manno-Lugano, Switzerland
