Convergence of Discrete MDL for Sequential Prediction

  • Jan Poland
  • Marcus Hutter
Conference paper

DOI: 10.1007/978-3-540-27819-1_21

Part of the Lecture Notes in Computer Science book series (LNCS, volume 3120)
Cite this paper as:
Poland J., Hutter M. (2004) Convergence of Discrete MDL for Sequential Prediction. In: Shawe-Taylor J., Singer Y. (eds) Learning Theory. COLT 2004. Lecture Notes in Computer Science, vol 3120. Springer, Berlin, Heidelberg

Abstract

We study the properties of the Minimum Description Length principle for sequence prediction, considering a two-part MDL estimator chosen from a countable class of models. This applies in particular to the important case of universal sequence prediction, where the model class corresponds to all algorithms for some fixed universal Turing machine (the correspondence is via enumerable semimeasures, hence the resulting models are stochastic). We prove convergence theorems similar to Solomonoff’s theorem of universal induction, which also holds for general Bayes mixtures. However, the bound characterizing the convergence speed for MDL predictions is exponentially larger than the corresponding bound for Bayes mixtures. We observe that there are at least three different ways of using MDL for prediction. One of these has worse prediction properties: its predictions converge only if the MDL estimator stabilizes. We establish sufficient conditions for this stabilization to occur. Finally, some immediate consequences for complexity relations and randomness criteria are proven.
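
For orientation, here is a minimal sketch of the two objects being compared, in standard notation for this setting (a countable model class $\mathcal{M}$ with prior weights $w_\nu$ and true environment $\mu$; the symbols are conventional and may differ in detail from the paper's own):

% Sketch in standard notation; a reading aid, not the paper's exact statements.
\[
  \xi(x) \;=\; \sum_{\nu \in \mathcal{M}} w_\nu\, \nu(x)
  \qquad \text{(Bayes mixture)}
\]
\[
  \nu^{x} \;=\; \operatorname*{arg\,max}_{\nu \in \mathcal{M}} \; w_\nu\, \nu(x)
  \;=\; \operatorname*{arg\,min}_{\nu \in \mathcal{M}} \; \bigl[\, -\log w_\nu \;-\; \log \nu(x) \,\bigr]
  \qquad \text{(two-part MDL estimator)}
\]

The second form makes the two parts of the code explicit: $-\log w_\nu$ bits to describe the model, plus $-\log \nu(x)$ bits to describe the data given the model. The exponential gap stated in the abstract is between convergence bounds of order $w_\mu^{-1}$ for MDL predictions and of order $\ln w_\mu^{-1}$ for the Bayes mixture.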

Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Jan Poland (1)
  • Marcus Hutter (1)

  1. IDSIA, Manno (Lugano), Switzerland
