
Learning rates of least-square regularized regression with polynomial kernels

Science in China Series A: Mathematics

Abstract

This paper presents learning rates for least-square regularized regression with polynomial kernels. The aim is an error analysis for the regression problem in learning theory. A regularization scheme is given which yields sharp learning rates; the rates depend on the dimension of the polynomial space and on the capacity of the polynomial reproducing kernel Hilbert space as measured by covering numbers. We also establish a direct approximation theorem for Bernstein-Durrmeyer operators in \( L_{\rho_X}^2 \) with respect to a Borel probability measure.
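
For orientation, the algorithm analyzed here is, in its standard form, the least-square regularized regression scheme over the reproducing kernel Hilbert space \( \mathcal{H}_K \) of a kernel \( K \): given a sample \( z = \{(x_i, y_i)\}_{i=1}^m \), it outputs

\[ f_z = \arg\min_{f \in \mathcal{H}_K} \left\{ \frac{1}{m} \sum_{i=1}^m \big(f(x_i) - y_i\big)^2 + \lambda \|f\|_K^2 \right\}. \]

The following sketch (not the authors' code) implements this scheme for the polynomial kernel \( K(x, t) = (1 + x \cdot t)^d \), using the representer-theorem solution \( f_z = \sum_i c_i K(\cdot, x_i) \) with \( (K + \lambda m I)c = y \); the degree, regularization parameter, and toy data below are hypothetical choices for illustration only.

import numpy as np

def poly_kernel(X, T, degree=3):
    # Gram matrix of the polynomial kernel K(x, t) = (1 + <x, t>)^d.
    return (1.0 + X @ T.T) ** degree

def fit(X, y, lam=1e-3, degree=3):
    # Representer-theorem coefficients: solve (K + lam * m * I) c = y.
    m = X.shape[0]
    K = poly_kernel(X, X, degree)
    return np.linalg.solve(K + lam * m * np.eye(m), y)

def predict(X_train, c, X_new, degree=3):
    # Evaluate f_z(x) = sum_i c_i K(x, x_i) at the new points.
    return poly_kernel(X_new, X_train, degree) @ c

# Toy usage: recover y = sin(pi * x) from 50 noisy samples on [0, 1].
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(50, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(50)
c = fit(X, y, lam=1e-4, degree=5)
print(predict(X, c, np.array([[0.25], [0.5]]), degree=5))

The learning rate then quantifies how fast the error of \( f_z \) decays as the sample size \( m \) grows, for a suitable choice of \( \lambda = \lambda(m) \). The Bernstein-Durrmeyer operator invoked in the approximation result is, in its standard univariate form on [0, 1], \( M_n(f, x) = (n+1) \sum_{k=0}^{n} p_{n,k}(x) \int_0^1 p_{n,k}(t) f(t)\,dt \) with \( p_{n,k}(x) = \binom{n}{k} x^k (1-x)^{n-k} \); the paper's version in \( L_{\rho_X}^2 \) may differ in domain and weighting.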



Author information

Correspondence to BingZheng Li.

Cite this article

Li, B., Wang, G. Learning rates of least-square regularized regression with polynomial kernels. Sci. China Ser. A-Math. 52, 687–700 (2009). https://doi.org/10.1007/s11425-008-0137-5

