Algorithmic statistics has two different (and almost orthogonal) motivations. From the philosophical point of view, it tries to formalize how statistics works and why some statistical models are better than others. Once a notion of a “good model” is introduced, a natural question arises: is it possible that for some piece of data there is no good model? If so, how often do such bad (non-stochastic) data appear “in real life”?
Another, more technical motivation comes from algorithmic information theory. This theory introduces a notion of complexity of a finite object (= the amount of information in this object); it assigns to every object a number, called its algorithmic complexity (or Kolmogorov complexity). Algorithmic statistics provides a more fine-grained classification: for each finite object a curve is defined that characterizes its behavior. It turns out that several different definitions give (approximately) the same curve.
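To fix notation, recall the standard definition underlying this classification (as usual in this area, all statements hold up to additive \(O(1)\) or \(O(\log)\) terms): the Kolmogorov complexity of a string \(x\) is the length of a shortest program that produces \(x\) on a fixed optimal (universal) decompressor \(U\),
\[
C(x) = \min \{\, l(p) : U(p) = x \,\}.
\]
The curve mentioned above refines this single number into a function of the allowed model complexity; the equivalence of its different definitions is discussed in the sections that follow.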
Road-map: Sect. 2 considers the notion of \((\alpha ,\beta )\)-stochasticity; Sect. 3 considers two-part descriptions and the so-called “minimal description length principle”; Sect. 4 gives one more approach: we consider the list of objects of bounded complexity and measure how far some object is from the end of this list, obtaining a natural class of “standard descriptions” as a by-product; finally, Sect. 5 establishes a connection between these notions and resource-bounded complexity. The rest of the paper deals with attempts to bring the theory closer to practice by considering restricted classes of descriptions (Sect. 6) and strong models (Sect. 7).
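For the reader's convenience we state here the central definition of Sect. 2 (up to the precision of the parameters): a string \(x\) is called \((\alpha ,\beta )\)-stochastic if there exists a finite set \(A \ni x\) such that
\[
C(A) \le \alpha \quad\text{and}\quad d(x \mid A) \le \beta,
\qquad\text{where } d(x \mid A) = \log |A| - C(x \mid A)
\]
is the randomness deficiency of \(x\) in \(A\). A set \(A\) of small complexity in which \(x\) is a typical (small-deficiency) element is then a “good model” for \(x\) in the sense of the first paragraph.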
In this survey we try to provide an exposition of the main results in the field (including full proofs for the most important ones), as well as some historical comments. We assume that the reader is familiar with the main notions of algorithmic information (Kolmogorov complexity) theory. An exposition can be found in [42, Chaps. 1, 3, 4] or [22, Chaps. 2, 3], see also the survey .
A short survey of main results of algorithmic statistics was given in  (without proofs); see also the last chapter of the book .
We are grateful to several people who contributed and/or carefully read preliminary versions of this survey, in particular, to B. Bauwens, P. Gács, A. Milovanov, G. Novikov, A. Romashchenko, P. Vitányi, and to all participants of the Kolmogorov seminar at Moscow State University and of the ESCAPE group at LIRMM. We are also grateful to an anonymous referee for correcting several mistakes.
Adleman, L.M.: Time, space and randomness. MIT report MIT/LCS/TM-131, March 1979
Antunes, L., Bauwens, B., Souto, A., Teixeira, A.: Sophistication vs. logical depth. Theory of Computing Systems. doi:10.1007/s00224-016-9672-6
Antunes, L., Fortnow, L., van Melkebeek, D.: Computational depth. In: Proceedings of the 16th IEEE Conference on Computational Complexity, pp. 266–273. IEEE, New York (2001). Journal version: Computational depth: concept and applications. Theoret. Comput. Sci. 354(3), 391–404 (2006)
Bauwens, B.: Computability in statistical hypotheses testing, and characterizations of independence and directed influences in time series using Kolmogorov complexity. Ph.D. thesis, University of Gent, May 2010
Bennett, C.H.: Logical depth and physical complexity. In: Herken, R. (ed.) The Universal Turing Machine: A Half-Century Survey, pp. 227–257. Oxford University Press, New York (1988)
Bienvenu, L., Desfontaines, D., Shen, A.: What percentage of programs halt? In: Halldórsson, M.M., Iwama, K., Kobayashi, N., Speckmann, B. (eds.) ICALP 2015. LNCS, vol. 9134, pp. 219–230. Springer, Heidelberg (2015). doi:10.1007/978-3-662-47672-7_18
Bienvenu, L., Gács, P., Hoyrup, M., Rojas, C., Shen, A.: Algorithmic tests and randomness with respect to a class of measures. Proc. Steklov Inst. Math. 274, 34–89 (2011)
Cover, T.: Kolmogorov complexity, data compression and inference. In: Skwirzynski, J.K. (ed.) The Impact of Processing Techniques on Communications. NATO ASI Series, vol. 91, pp. 23–33. Martinus Nijhoff Publishers, Dordrecht (1985). doi:10.1007/978-94-009-5113-6_2
Kolmogorov, A.N.: Three approaches to the quantitative definition of information (in Russian). Prob. Inf. Trans. 1(1), 4–11 (1965). English translation published: Int. J. Comput. Math. 2, 157–168 (1968)
Kolmogorov, A.N.: Talk at the Information Theory Symposium in Tallinn, Estonia (then USSR) (1974). [As reported by Cover in his 1985 paper]
Kolmogorov, A.N.: The complexity of algorithms and the objective definition of randomness. Summary of the talk presented April 16, 1974 at Moscow Mathematical Society. Uspekhi matematicheskikh nauk (Russian) 29(4), 155 (1974). http://mi.mathnet.ru/rus/umn/v29/i4/p153 (A short note in Russian)
Kolmogorov, A.N.: Talk at the seminar at Moscow State University Mathematics Department (Logic Division), 26 November 1981. [The definition of \((\alpha ,\beta )\)-stochasticity was given in this talk, and the question about the fraction of non-stochastic objects was posed.]
Li, M., Vitányi, P.M.B.: An Introduction to Kolmogorov Complexity and its Applications, 3rd edn. Springer, New York (2008)
Longpré, L.: Resource bounded Kolmogorov complexity, a link between computational complexity and information theory. Ph.D. thesis, Department of Computer Science, Cornell University, TR 86–776 (1986)
Milovanov, A.: Some properties of antistochastic strings. In: Beklemishev, L.D., Musatov, D.V. (eds.) CSR 2015. LNCS, vol. 9139, pp. 339–349. Springer, Heidelberg (2015). doi:10.1007/978-3-319-20297-6_22