Abstract
This paper presents learning rates for least-square regularized regression algorithms with polynomial kernels. The goal is the error analysis for the regression problem in learning theory. A regularization scheme is given which yields sharp learning rates; the rates depend on the dimension of the polynomial space and on the capacity of the polynomial reproducing kernel Hilbert space, measured by covering numbers. We also establish a direct approximation theorem for Bernstein-Durrmeyer operators in \( L_{\rho _X }^2 \) with a Borel probability measure.
Li, B., Wang, G. Learning rates of least-square regularized regression with polynomial kernels. Sci. China Ser. A-Math. 52, 687–700 (2009). https://doi.org/10.1007/s11425-008-0137-5
Keywords
- learning theory
- reproducing kernel Hilbert space
- polynomial kernel
- regularization error
- Bernstein-Durrmeyer operators
- covering number
- regularization scheme