Chapter in Learning Theory, Lecture Notes in Computer Science, Volume 4005, pp. 169–183

Learning Bounds for Support Vector Machines with Learned Kernels

  • Nathan Srebro, Department of Computer Science, University of Toronto
  • Shai Ben-David, School of Computer Science, University of Waterloo


Abstract

Consider the problem of learning a kernel for use in SVM classification. We bound the estimation error of a large-margin classifier when the kernel, relative to which this margin is defined, is chosen from a family of kernels based on the training sample. For a kernel family with pseudodimension \(d_\phi\), we present a bound of \(\sqrt{\tilde{\mathcal{O}}(d_\phi + 1/\gamma^2)/n}\) on the estimation error for SVMs with margin \(\gamma\). This is the first bound in which the relation between the margin term and the family-of-kernels term is additive rather than multiplicative. The pseudodimension of families of linear combinations of base kernels is the number of base kernels. Unlike in previous (multiplicative) bounds, there is no non-negativity requirement on the coefficients of the linear combinations. We also give simple bounds on the pseudodimension of families of Gaussian kernels.
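
To make the additive-versus-multiplicative contrast concrete, the following is a schematic restatement; the multiplicative expression paraphrases the shape of earlier bounds and is not a formula quoted from this paper. The bound presented here reads

\[ \text{estimation error} \;\le\; \sqrt{\tilde{\mathcal{O}}\!\left(d_\phi + 1/\gamma^2\right)/n}, \]

where \(d_\phi\) is the pseudodimension of the kernel family, \(\gamma\) the SVM margin, and \(n\) the number of training examples. Earlier bounds were, schematically, of the form

\[ \sqrt{\tilde{\mathcal{O}}\!\left(d_\phi \cdot 1/\gamma^2\right)/n}, \]

which is substantially larger whenever both \(d_\phi\) and \(1/\gamma^2\) are large.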