
Complete Statistical Theory of Learning

Topical Issue
Automation and Remote Control

Abstract

The existing mathematical model of learning requires, using training data, finding in a given subset of admissible functions the function that minimizes the expected loss. In this paper, that setting is called the Second selection problem. The mathematical model of learning in this paper requires, along with the Second selection problem, solving the so-called First selection problem: using training data, one first selects from a wide set of functions in Hilbert space an admissible subset of functions that includes the desired function, and second selects in this admissible subset a good approximation to the desired function. The existence of two selection problems reflects a fundamental property of Hilbert space: the existence of two different concepts of convergence of functions, weak convergence (which leads to the solution of the First selection problem) and strong convergence (which leads to the solution of the Second selection problem). In the paper we describe a simultaneous solution of both selection problems for functions that belong to a Reproducing Kernel Hilbert Space. The solution is obtained in closed form.
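The paper's specific closed-form solution is not reproduced in this preview. As illustrative context only, the classical closed-form estimator in a Reproducing Kernel Hilbert Space (kernel ridge regression, whose representer-theorem form f(x) = Σᵢ αᵢ K(xᵢ, x) is the standard example of a closed-form RKHS solution) can be sketched as follows. The Gaussian kernel, the regularization parameter `lam`, and the toy sine-curve data are assumptions for the sketch, not the paper's construction:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between row-sample arrays X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_krr(X, y, lam=1e-3, gamma=1.0):
    # Closed-form coefficients: alpha = (K + lam * I)^{-1} y.
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, alpha, X_new, gamma=1.0):
    # Representer-theorem form: f(x) = sum_i alpha_i * K(x_i, x).
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Toy example (assumed data): learn f(x) = sin(x) from noisy samples.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=40)
alpha = fit_krr(X, y)
pred = predict(X, alpha, X)
```

This sketch illustrates only strong (norm) convergence to the desired function via regularized empirical risk minimization, i.e., the Second selection problem; the paper's contribution is the joint treatment of both selection problems in closed form.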



Author information


Corresponding author

Correspondence to V. N. Vapnik.

Additional information

In memory of the outstanding scientist and remarkable person Ja.Z. Tsypkin

Russian Text © The Author(s), 2019, published in Avtomatika i Telemekhanika, 2019, No. 11, pp. 24–58.


About this article


Cite this article

Vapnik, V.N. Complete Statistical Theory of Learning. Autom Remote Control 80, 1949–1975 (2019). https://doi.org/10.1134/S000511791911002X

