Universal Convergence of Semimeasures on Individual Random Sequences
Solomonoff’s central result on induction is that the posterior of a universal semimeasure M converges rapidly and with probability 1 to the true sequence-generating posterior μ, if the latter is computable. Hence, M is eligible as a universal sequence predictor in the case of unknown μ. Despite some nearby results and proofs in the literature, the stronger result of convergence for all (Martin-Löf) random sequences remained open. Such a convergence result would be particularly interesting and natural, since randomness can be defined in terms of M itself. We show that there are universal semimeasures M which do not converge on all random sequences, i.e., we give a partial negative answer to the open problem. We also provide a positive answer for some non-universal semimeasures. We define the incomputable measure D as a mixture over all computable measures and the enumerable semimeasure W as a mixture over all enumerable nearly-measures. We show that W converges to D, and D to μ, on all random sequences. The Hellinger distance, which measures the closeness of two distributions, plays a central role.
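The convergence statements above can be illustrated with a toy analogue: a finite Bayesian mixture over Bernoulli measures standing in for the universal mixture M, with the squared Hellinger distance \(\sum_a (\sqrt{p(a)} - \sqrt{q(a)})^2\) measuring how close the mixture's predictive distribution is to the true one. This is a hedged sketch, not the paper's construction: the hypothesis class, the uniform prior, and all function names are illustrative assumptions.

```python
import random

def hellinger2(p, q):
    """Squared Hellinger distance sum_a (sqrt(p(a)) - sqrt(q(a)))^2
    between two distributions on a finite alphabet (here {0, 1}).
    Note: some texts include a factor 1/2; we omit it here."""
    return sum((pa ** 0.5 - qa ** 0.5) ** 2 for pa, qa in zip(p, q))

def bayes_update(ws, thetas, bit):
    """Multiply each weight by the likelihood of the observed bit,
    then renormalize (to avoid numerical underflow over long sequences)."""
    new = [w * (th if bit == 1 else 1 - th) for w, th in zip(ws, thetas)]
    z = sum(new)
    return [w / z for w in new]

def mixture_predictive(ws, thetas):
    """Posterior-weighted probability that the next bit is 1."""
    return sum(w * th for w, th in zip(ws, thetas))

random.seed(0)
thetas = [0.1, 0.3, 0.5, 0.7, 0.9]      # toy hypothesis class (assumption)
ws = [1.0 / len(thetas)] * len(thetas)  # uniform prior weights
mu = 0.7                                # true generating parameter, in the class

for t in range(2000):
    bit = 1 if random.random() < mu else 0
    ws = bayes_update(ws, thetas, bit)

p1 = mixture_predictive(ws, thetas)
d = hellinger2((1 - p1, p1), (1 - mu, mu))
print(d)  # squared Hellinger distance between predictive distributions
```

Because the true parameter lies inside the finite class, the posterior concentrates on it and the Hellinger distance between the mixture's predictive and the true predictive shrinks; the open problem concerns whether the analogous convergence of a universal M holds on every individual Martin-Löf random sequence, not merely with μ-probability 1.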
Keywords: Random Sequence, Computable Function, Iterated Logarithm, Kolmogorov Complexity, Hellinger Distance