Abstract
We introduce a class of analytic positive definite multivariate kernels which includes infinite dot product kernels as sometimes used in machine learning, certain new nonlinearly factorizable kernels, and a kernel which is closely related to the Gaussian. Each such kernel reproduces in a certain “native” Hilbert space of multivariate analytic functions. If functions from this space are interpolated at scattered locations by translates of the kernel, we prove spectral convergence rates for the interpolants and all their derivatives. By truncating the power series of the kernel-based interpolants, we constructively generalize the classical Bernstein theorem on polynomial approximation of analytic functions to the multivariate case. An application to machine learning algorithms is presented.
Additional information
Communicated by G. Kerkyacharian.
Rights and permissions
Open Access: This is an open access article distributed under the terms of the Creative Commons Attribution Noncommercial License (https://creativecommons.org/licenses/by-nc/2.0), which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.
About this article
Cite this article
Zwicknagl, B. Power Series Kernels. Constr Approx 29, 61–84 (2009). https://doi.org/10.1007/s00365-008-9012-4
Keywords
- Multivariate polynomial approximation
- Bernstein theorem
- Dot product kernels
- Reproducing kernel Hilbert spaces
- Error bounds
- Convergence orders