On Sequences with Non-learnable Subsequences
The remarkable results of Foster and Vohra were the starting point for a series of papers showing that any sequence of outcomes can be learned (with no prior knowledge) by some universal randomized forecasting algorithm with respect to forecast-dependent checking rules. We show that this property fails for the class of all computationally efficient outcome-forecast-based checking rules. Moreover, we present a probabilistic algorithm that, with probability close to one, generates a sequence containing a subsequence which simultaneously miscalibrates all partially weakly computable randomized forecasting algorithms.
Following Dawid’s prequential framework, we consider partial recursive randomized algorithms.
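As a rough illustration of the calibration property discussed above (a minimal sketch, not the paper's construction): a forecaster is empirically calibrated if, among the rounds where it issued probability forecasts near p, the observed frequency of 1-outcomes is close to p. The binning scheme and the function name `calibration_error` below are illustrative assumptions.

```python
def calibration_error(forecasts, outcomes, bins=10):
    """Largest gap, over forecast bins, between the bin's midpoint
    forecast and the empirical frequency of 1-outcomes in that bin.

    forecasts: probabilities in [0, 1]; outcomes: values in {0, 1}.
    A well-calibrated forecaster keeps this gap small on long sequences.
    """
    totals = [0] * bins
    ones = [0] * bins
    for p, x in zip(forecasts, outcomes):
        # Assign forecast p to one of `bins` equal-width bins.
        b = min(int(p * bins), bins - 1)
        totals[b] += 1
        ones[b] += x
    err = 0.0
    for b in range(bins):
        if totals[b]:
            midpoint = (b + 0.5) / bins
            err = max(err, abs(ones[b] / totals[b] - midpoint))
    return err
```

For example, a forecaster that always predicts 0.9 on an all-zero sequence has error near 0.9, whereas predicting 0.5 on an alternating sequence keeps the error small; a sequence as in the theorem would drive this error away from zero on some subsequence for every efficient forecaster.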
Keywords: Forecast System · Binary Sequence · Probability Forecast · Delay Function · Probabilistic Algorithm