
A Brief Introduction on Kernels

  • Chapter
  • In: Machine Learning

Abstract

In the previous chapters, we described Support Vector Machines as a method that builds an optimal hyperplane separating two classes by minimizing the loss through margin maximization. This maximization leads to a dual optimization problem whose Lagrangian function is quadratic and subject to simple inequality constraints. The support vectors are responsible for defining the hyperplane, yielding the support vector classifier f, which not only provides a unique solution to the problem but also ensures learning with tighter guarantees. However, this is only possible if the classes are sufficiently linearly separable in the input space, and many input spaces are, in fact, not linearly separable. To overcome this restriction, nonlinear transformations can be used to implicitly obtain a more adequate space.
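
As a minimal illustration of this last point (a sketch, not taken from the chapter itself), the snippet below fits a linear-kernel and an RBF-kernel support vector classifier on concentric-circle data, which is not linearly separable in the input space; scikit-learn, the dataset, and all parameter values are assumptions made for the example.

    # Sketch: linear vs. RBF-kernel SVM on data that is not linearly separable.
    # Assumes scikit-learn is installed; dataset and parameters are illustrative.
    from sklearn.datasets import make_circles
    from sklearn.svm import SVC

    # Two concentric rings: no hyperplane in R^2 separates the classes.
    X, y = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)

    linear_svm = SVC(kernel="linear").fit(X, y)       # hyperplane in the input space
    rbf_svm = SVC(kernel="rbf", gamma=2.0).fit(X, y)  # implicit nonlinear feature space

    print("linear kernel accuracy:", linear_svm.score(X, y))  # near chance level
    print("RBF kernel accuracy:   ", rbf_svm.score(X, y))     # close to 1.0

The RBF kernel never computes the nonlinear transformation explicitly; it only evaluates dot products in the induced feature space, which is the idea developed in this chapter.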


Notes

  1. There are variations of PCA for studying data subspaces which employ XX^T in order to analyze the correlations among position vectors.

  2. Recall that a Hilbert space is simply a vector space on which a distance metric, a vector norm, and a dot product are defined. Because of that, it supports the projection of vectors onto subspaces, and therefore geometry (a small numeric check is given after these notes).

  3. Please refer to the MNIST database, http://yann.lecun.com/exdb/mnist.
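
The Hilbert-space note above can be made concrete with a small numeric check (a sketch, not from the chapter): for the homogeneous polynomial kernel of degree 2 on R^2, k(x, z) = (x · z)^2, there is an explicit feature map phi into R^3 whose ordinary dot product reproduces the kernel, so evaluating k amounts to taking a dot product in that feature (Hilbert) space. The function names and random inputs below are illustrative.

    import numpy as np

    # Homogeneous polynomial kernel of degree 2, evaluated in the input space R^2
    def k_poly2(x, z):
        return float(np.dot(x, z)) ** 2

    # Explicit feature map phi: R^2 -> R^3 such that phi(x) . phi(z) = k_poly2(x, z)
    def phi(x):
        x1, x2 = x
        return np.array([x1 ** 2, np.sqrt(2.0) * x1 * x2, x2 ** 2])

    rng = np.random.default_rng(0)
    x, z = rng.normal(size=2), rng.normal(size=2)

    lhs = k_poly2(x, z)                   # kernel in the input space
    rhs = float(np.dot(phi(x), phi(z)))   # dot product in the feature space

    print(lhs, rhs)                       # equal up to floating-point error
    assert np.isclose(lhs, rhs)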



Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this chapter


Cite this chapter

Fernandes de Mello, R., Antonelli Ponti, M. (2018). A Brief Introduction on Kernels. In: Machine Learning. Springer, Cham. https://doi.org/10.1007/978-3-319-94989-5_6


  • DOI: https://doi.org/10.1007/978-3-319-94989-5_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-94988-8

  • Online ISBN: 978-3-319-94989-5

  • eBook Packages: Computer Science, Computer Science (R0)
