Monotone Conditional Complexity Bounds on Future Prediction Errors
We bound the future loss when predicting any (computably) stochastic sequence online. Solomonoff finitely bounded the total deviation of his universal predictor M from the true distribution μ by the algorithmic complexity of μ. Here we assume that we are at a time t > 1 and have already observed x = x_1...x_t. We bound the future prediction performance on x_{t+1} x_{t+2}... by a new variant of the algorithmic complexity of μ given x, plus the complexity of the randomness deficiency of x. The new complexity is monotone in its condition in the sense that it can only decrease if the condition is prolonged. We also briefly discuss potential generalizations to Bayesian model classes and to classification problems.
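For context, the Solomonoff bound referred to above can be written out explicitly. The following is a standard formulation for binary alphabets (cf. [Sol78], [Hut01]), not a quotation from this paper; the constant differs slightly across presentations:

```latex
% Solomonoff's bound (binary alphabet, standard form): the total expected
% squared deviation of the universal predictor M from the true computable
% measure \mu is finite and bounded by the prefix complexity K(\mu).
\sum_{t=1}^{\infty} \mathbb{E}_{x_{<t} \sim \mu}
  \Bigl[ \bigl( M(0 \mid x_{<t}) - \mu(0 \mid x_{<t}) \bigr)^{2} \Bigr]
  \;\le\; \frac{\ln 2}{2}\, K(\mu)
```

The contribution described in the abstract is to replace K(μ) on the right-hand side, for the loss incurred after time t, with a monotone conditional complexity of μ given the observed prefix x, plus the complexity of the randomness deficiency of x, so that the bound on the remaining loss can shrink as more of the sequence is observed.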
- [Hut01] Hutter, M.: Convergence and error bounds for universal prediction of nonbinary sequences. In: Proc. 12th European Conference on Machine Learning (ECML-2001), pp. 239–250 (2001)
- [Hut04] Hutter, M.: Universal Artificial Intelligence: Sequential Decisions based on Algorithmic Probability, 300 pages. Springer, Berlin (2004), http://www.idsia.ch/~marcus/ai/uaibook.htm
- [Sch00] Schmidhuber, J.: Algorithmic theories of everything. Report IDSIA-20-00, quant-ph/0011122, IDSIA, Manno (Lugano), Switzerland (2000)
- [Sol64] Solomonoff, R.J.: A formal theory of inductive inference: Parts 1 and 2. Inform. Control 7, 1–22, 224–254 (1964)
- [Sol78] Solomonoff, R.J.: Complexity-based induction systems: comparisons and convergence theorems. IEEE Trans. Information Theory IT-24, 422–432 (1978)
- [VSU05] Vereshchagin, N.K., Shen, A., Uspensky, V.A.: Lecture Notes on Kolmogorov Complexity. Unpublished (2005), http://lpcs.math.msu.su/~ver/kolm-book