Statistics and Computing

Volume 16, Issue 2, pp 161–175

MDL convergence speed for Bernoulli sequences

Article

DOI: 10.1007/s11222-006-6746-3

Cite this article as:
Poland, J. & Hutter, M. Stat Comput (2006) 16: 161. doi:10.1007/s11222-006-6746-3

Abstract

The Minimum Description Length (MDL) principle for online sequence estimation/prediction in a proper learning setup is studied. If the underlying model class is discrete, then the total expected square loss is a particularly interesting performance measure: (a) this quantity is finitely bounded, implying convergence with probability one, and (b) it additionally specifies the convergence speed. For MDL, in general one can only prove loss bounds which are finite but exponentially larger than those for Bayes mixtures. We show that this is the case even if the model class contains only Bernoulli distributions. We derive a new upper bound on the prediction error for countable Bernoulli classes. This implies a small bound (comparable to the one for Bayes mixtures) for certain important model classes. We discuss the application to Machine Learning tasks such as classification and hypothesis testing, and the generalization to countable classes of i.i.d. models.
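
To illustrate the type of bounds the abstract refers to, the following is a minimal sketch in standard notation (not quoted from the paper; the constants and the symbols ξ, μ, w are illustrative assumptions): for a countable model class with prior weights w and true distribution μ, the cumulative expected square prediction error of a predictor ξ is

\[
  S_\infty \;=\; \sum_{t=1}^{\infty} \mathbf{E}\Bigl[\bigl(\xi(1\mid x_{<t}) - \mu(1\mid x_{<t})\bigr)^2\Bigr],
  \qquad
  S_\infty^{\text{Bayes}} \;\le\; \ln w_\mu^{-1},
  \qquad
  S_\infty^{\text{MDL}} \;\le\; c\, w_\mu^{-1} .
\]

Finiteness of S∞ gives convergence of the predictions to μ with probability one, and the size of the bound governs the convergence speed; the gap between ln w_μ^{-1} and w_μ^{-1} is the "exponentially larger" loss bound for MDL mentioned above.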

Copyright information

© Springer Science + Business Media, LLC 2006

Authors and Affiliations

  1. Graduate School of Information Science and Technology, Hokkaido University, Japan
  2. IDSIA, Manno (Lugano), Switzerland