
Constructive Approximation, Volume 29, Issue 1, pp 61–84

Power Series Kernels

  • Barbara Zwicknagl
Open Access
Article

Abstract

We introduce a class of analytic positive definite multivariate kernels which includes the infinite dot product kernels sometimes used in machine learning, certain new nonlinearly factorizable kernels, and a kernel closely related to the Gaussian. Each such kernel reproduces in a certain “native” Hilbert space of multivariate analytic functions. If functions from this space are interpolated at scattered locations by translates of the kernel, we prove spectral convergence rates for the interpolants and all of their derivatives. By truncating the power series of the kernel-based interpolants, we constructively generalize the classical Bernstein theorem on polynomial approximation of analytic functions to the multivariate case. An application to machine learning algorithms is presented.
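
As a brief illustration of the scattered-data interpolation by kernel translates described above, the following minimal Python sketch interpolates an analytic function with the dot product kernel K(x, y) = exp(x · y), whose expansion sum_n (x · y)^n / n! makes it one simple example of an analytic positive definite power series kernel. The data, target function, and helper name are illustrative assumptions and are not taken from the paper.

    # Minimal sketch (illustrative only): interpolation with the dot product
    # kernel K(x, y) = exp(<x, y>) = sum_n <x, y>^n / n!.
    import numpy as np

    def dot_product_kernel(X, Y):
        """Gram matrix K[i, j] = exp(<X_i, Y_j>) for the rows of X and Y."""
        return np.exp(X @ Y.T)

    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=(30, 2))        # scattered centers in [-1, 1]^2
    f = np.sin(X[:, 0]) * np.cos(X[:, 1])           # samples of an analytic target

    # Interpolant s(x) = sum_j c_j K(x, x_j): solve the symmetric positive
    # definite kernel system for the coefficients c.
    c = np.linalg.solve(dot_product_kernel(X, X), f)

    # Evaluate the interpolant at new points and report the maximal error.
    X_test = rng.uniform(-1.0, 1.0, size=(200, 2))
    s = dot_product_kernel(X_test, X) @ c
    err = np.max(np.abs(s - np.sin(X_test[:, 0]) * np.cos(X_test[:, 1])))
    print(f"max interpolation error on test points: {err:.2e}")
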

Keywords

Multivariate polynomial approximation · Bernstein theorem · Dot product kernels · Reproducing kernel Hilbert spaces · Error bounds · Convergence orders

Mathematics Subject Classification (2000)

41A05 · 41A10 · 41A25 · 41A58 · 41A63 · 68T05


Copyright information

© Springer Science+Business Media, LLC 2008

Authors and Affiliations

  1. Max Planck Institute for Mathematics in the Sciences, Leipzig, Germany
