
On Sequences with Non-learnable Subsequences

  • Vladimir V. V’yugin
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5010)

Abstract

The remarkable results of Foster and Vohra were the starting point for a series of papers showing that any sequence of outcomes can be learned (with no prior knowledge) by some universal randomized forecasting algorithm with respect to forecast-dependent checking rules. We show that this property is violated for the class of all computationally efficient outcome-forecast-based checking rules. Moreover, we present a probabilistic algorithm that, with probability close to one, generates a sequence containing a subsequence which simultaneously miscalibrates all partially weakly computable randomized forecasting algorithms.

Following Dawid’s prequential framework, we consider partial recursive randomized algorithms.
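To make the calibration setting described above concrete, the following is a minimal sketch of a forecast-dependent checking rule in the spirit of Foster and Vohra's framework. The interval-based rule, the toy randomized forecaster, and all names below are illustrative assumptions, not the paper's construction.

```python
# A minimal sketch of calibration testing on a subsequence selected by a
# forecast-dependent checking rule (illustrative; not the paper's algorithm).
import random

def checking_rule(forecast, low=0.4, high=0.6):
    """Select a round when the forecast falls into a fixed interval."""
    return low <= forecast < high

def calibration_error(forecasts, outcomes, rule=checking_rule):
    """Empirical frequency of 1s minus the average forecast on the
    subsequence of rounds selected by the checking rule."""
    selected = [(p, w) for p, w in zip(forecasts, outcomes) if rule(p)]
    if not selected:
        return 0.0
    avg_forecast = sum(p for p, _ in selected) / len(selected)
    freq_ones = sum(w for _, w in selected) / len(selected)
    return freq_ones - avg_forecast

if __name__ == "__main__":
    random.seed(0)
    # A toy randomized forecaster facing a fair-coin outcome sequence.
    outcomes = [random.randint(0, 1) for _ in range(10_000)]
    forecasts = [random.random() for _ in outcomes]
    print(calibration_error(forecasts, outcomes))
```

In the Foster–Vohra setting a suitably randomized forecaster drives such errors to zero on every sequence; the paper's contribution is that this guarantee can fail once the checking rules are required to be computationally efficient and may depend on the outcomes as well as the forecasts.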

Keywords

Forecast System, Binary Sequence, Probability Forecast, Delay Function, Probabilistic Algorithm


References

  1. Dawid, A.P.: The Well-Calibrated Bayesian [with discussion]. J. Am. Statist. Assoc. 77, 605–613 (1982)
  2. Dawid, A.P.: Calibration-Based Empirical Probability [with discussion]. Ann. Statist. 13, 1251–1285 (1985)
  3. Foster, D.P., Vohra, R.: Asymptotic Calibration. Biometrika 85, 379–390 (1998)
  4. Kakade, S.M., Foster, D.P.: Deterministic Calibration and Nash Equilibrium. In: Shawe-Taylor, J., Singer, Y. (eds.) COLT 2004. LNCS (LNAI), vol. 3120, pp. 33–48. Springer, Heidelberg (2004)
  5. Lehrer, E.: Any Inspection Rule is Manipulable. Econometrica 69(5), 1333–1347 (2001)
  6. Oakes, D.: Self-Calibrating Priors Do Not Exist [with discussion]. J. Am. Statist. Assoc. 80, 339–342 (1985)
  7. Rogers, H.: Theory of Recursive Functions and Effective Computability. McGraw-Hill, New York (1967)
  8. Sandroni, A., Smorodinsky, R., Vohra, R.: Calibration with Many Checking Rules. Mathematics of Operations Research 28(1), 141–153 (2003)
  9. Schervish, M.J.: Comment [to Oakes, 1985]. J. Am. Statist. Assoc. 80, 341–342 (1985)
  10. Vovk, V.: Defensive Forecasting for Optimal Prediction with Expert Advice. arXiv:0708.1503v1 (2007)
  11. V’yugin, V.V.: Non-Stochastic Infinite and Finite Sequences. Theor. Comp. Science 207, 363–382 (1998)
  12. Zvonkin, A.K., Levin, L.A.: The Complexity of Finite Objects and the Algorithmic Concepts of Information and Randomness. Russ. Math. Surv. 25, 83–124 (1970)

Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Vladimir V. V’yugin
  1. Institute for Information Transmission Problems, Russian Academy of Sciences, Moscow GSP-4, Russia
