
Kernel-Based Methods


Part of the book series: Advances in Pattern Recognition (ACVPR)

Abstract

Inspired by the success of support vector machines, conventional pattern classification techniques have been extended to incorporate margin maximization and mapping to a feature space in order to improve generalization and classification performance. For example, perceptron algorithms [1–4], neural networks (Chapter 9), and fuzzy systems (Chapter 10) have incorporated maximizing margins and/or mapping to a feature space, as illustrated by the sketch below.
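To make the idea of margin-based learning in a kernel-induced feature space concrete, here is a minimal sketch of a dual (kernelized) perceptron, in the spirit of the perceptron variants cited above [1–4]. It is an illustration only, not the formulation developed in this chapter: the RBF kernel, the parameter values, and the function names (rbf_kernel, train_kernel_perceptron, predict) are assumptions made for the example.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    """RBF kernel matrix K[i, j] = exp(-gamma * ||x1_i - x2_j||^2)."""
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-gamma * d2)

def train_kernel_perceptron(X, y, gamma=1.0, epochs=10):
    """Dual (kernel) perceptron: alpha_i counts the mistakes made on
    sample i; the decision function is f(x) = sum_i alpha_i y_i K(x_i, x)."""
    n = len(y)
    alpha = np.zeros(n)
    K = rbf_kernel(X, X, gamma)
    for _ in range(epochs):
        mistakes = 0
        for i in range(n):
            # prediction for sample i with the current dual coefficients
            if y[i] * np.dot(alpha * y, K[:, i]) <= 0:
                alpha[i] += 1.0       # mistake: reinforce sample i
                mistakes += 1
        if mistakes == 0:             # no errors left on the training set
            break
    return alpha

def predict(X_train, y_train, alpha, X_test, gamma=1.0):
    K = rbf_kernel(X_train, X_test, gamma)
    return np.sign((alpha * y_train) @ K)

# Toy usage: XOR-like data, not linearly separable in input space
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1., 1., 1., -1.])
alpha = train_kernel_perceptron(X, y, gamma=2.0, epochs=50)
print(predict(X, y, alpha, X, gamma=2.0))   # expect [-1.  1.  1. -1.]
```

Because the update depends on the data only through kernel evaluations, the algorithm operates implicitly in the feature space induced by the chosen kernel; the XOR-like toy data above, inseparable in the input space, become separable there.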


Notes

  1. If \(M^{\prime}=M\), (6.24) is the same as (20.13) in [7].

References

  1. Y. Freund and R. E. Schapire. Large margin classification using the perceptron algorithm. Machine Learning, 37(3):277–296, 1999.

  2. C. Gentile. A new approximate maximal margin classification algorithm. Journal of Machine Learning Research, 2:213–242, 2001.

  3. Y. Li and P. M. Long. The relaxed online maximum margin algorithm. Machine Learning, 46(1–3):361–387, 2002.

  4. K. Crammer and Y. Singer. Ultraconservative online algorithms for multiclass problems. Journal of Machine Learning Research, 3:951–991, 2003.

  5. G. Baudat and F. Anouar. Kernel-based methods and function approximation. In Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), volume 2, pages 1244–1249, Washington, DC, 2001.

  6. A. Ruiz and P. E. López-de-Teruel. Nonlinear kernel-based statistical pattern analysis. IEEE Transactions on Neural Networks, 12(1):16–32, 2001.

  7. B. Schölkopf, A. J. Smola, and K.-R. Müller. Kernel principal component analysis. In B. Schölkopf, C. J. C. Burges, and A. J. Smola, editors, Advances in Kernel Methods: Support Vector Learning, pages 327–352. MIT Press, Cambridge, MA, 1999.

  8. A. J. Smola, O. L. Mangasarian, and B. Schölkopf. Sparse kernel feature analysis. Technical Report 99-04, University of Wisconsin, Data Mining Institute, Madison, WI, 1999.

  9. B. Schölkopf and A. J. Smola. Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. MIT Press, Cambridge, MA, 2002.

  10. Z. Rong and A. I. Rudnicky. A large scale clustering scheme for kernel k-means. In Proceedings of the Sixteenth International Conference on Pattern Recognition (ICPR 2002), volume 4, pages 289–292, 2002.

  11. I. S. Dhillon, Y. Guan, and B. Kulis. Kernel k-means, spectral clustering and normalized cuts. In Proceedings of the Tenth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD 2004), pages 551–556, Seattle, WA. Association for Computing Machinery, New York, 2004.

  12. Z. Li, S. Tang, J. Xue, and J. Jiang. Modified FCM clustering based on kernel mapping. In J. Shen, S. Pankanti, and R. Wang, editors, Proceedings of SPIE: Object Detection, Classification, and Tracking Technologies, volume 4554, pages 241–245, Wuhan, China, 2001.

  13. D.-Q. Zhang and S.-C. Chen. Clustering incomplete data using kernel-based fuzzy c-means algorithm. Neural Processing Letters, 18(3):155–162, 2003.

  14. T. Graepel, M. Burger, and K. Obermayer. Self-organizing maps: Generalizations and new optimization techniques. Neurocomputing, 21(1–3):173–190, 1998.

  15. E. Maeda and H. Murase. Multi-category classification by kernel based nonlinear subspace method. In Proceedings of the 1999 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '99), volume 2, pages 1025–1028, 1999.

  16. S. Takeuchi, T. Kitamura, S. Abe, and K. Fukui. Subspace based linear programming support vector machines. In Proceedings of the 2009 International Joint Conference on Neural Networks (IJCNN 2009), pages 3067–3073, Atlanta, GA, 2009.

  17. T. Kitamura, S. Abe, and K. Fukui. Subspace based least squares support vector machines for pattern classification. In Proceedings of the 2009 International Joint Conference on Neural Networks (IJCNN 2009), pages 1640–1646, Atlanta, GA, 2009.

  18. T. Kitamura, S. Takeuchi, S. Abe, and K. Fukui. Subspace-based support vector machines. Neural Networks, 22(5–6):558–567, 2009.

  19. C. H. Park and H. Park. Efficient nonlinear dimension reduction for clustered data using kernel functions. In Proceedings of the Third IEEE International Conference on Data Mining (ICDM 2003), pages 243–250, Melbourne, FL, 2003.

  20. P. Zhang, J. Peng, and C. Domeniconi. Dimensionality reduction using kernel pooled local discriminant information. In Proceedings of the Third IEEE International Conference on Data Mining (ICDM 2003), pages 701–704, Melbourne, FL, 2003.

  21. G. C. Cawley and N. L. C. Talbot. Efficient formation of a basis in a kernel feature space. In Proceedings of the Tenth European Symposium on Artificial Neural Networks (ESANN 2002), pages 1–6, Bruges, Belgium, 2002.

  22. S. Fine and K. Scheinberg. Efficient SVM training using low-rank kernel representations. Journal of Machine Learning Research, 2:243–264, 2001.

  23. T. Evgeniou, M. Pontil, and T. Poggio. Regularization networks and support vector machines. Advances in Computational Mathematics, 13(1):1–50, 2000.

  24. P. Zhang and J. Peng. SVM vs regularized least squares classification. In Proceedings of the Seventeenth International Conference on Pattern Recognition (ICPR 2004), volume 1, pages 176–179, Cambridge, UK, 2004.

  25. K. Morikawa. Pattern classification and function approximation by kernel least squares. Bachelor's thesis, Electrical and Electronics Engineering, Kobe University, Japan, 2004 (in Japanese).

  26. G. H. Golub and C. F. Van Loan. Matrix Computations, Third Edition. The Johns Hopkins University Press, Baltimore, MD, 1996.

  27. W. H. Press, S. A. Teukolsky, W. T. Vetterling, and B. P. Flannery. Numerical Recipes in C: The Art of Scientific Computing, Second Edition. Cambridge University Press, Cambridge, UK, 1992.

  28. S. Mika, B. Schölkopf, A. Smola, K.-R. Müller, M. Scholz, and G. Rätsch. Kernel PCA and de-noising in feature spaces. In M. S. Kearns, S. A. Solla, and D. A. Cohn, editors, Advances in Neural Information Processing Systems 11, pages 536–542. MIT Press, Cambridge, MA, 1999.

  29. B. Schölkopf, S. Mika, C. J. C. Burges, P. Knirsch, K.-R. Müller, G. Rätsch, and A. J. Smola. Input space versus feature space in kernel-based methods. IEEE Transactions on Neural Networks, 10(5):1000–1017, 1999.

  30. T. Takahashi and T. Kurita. Robust de-noising by kernel PCA. In J. R. Dorronsoro, editor, Artificial Neural Networks (ICANN 2002): Proceedings of the International Conference, Madrid, Spain, pages 739–744. Springer-Verlag, Berlin, Germany, 2002.

  31. J. T.-Y. Kwok and I. W.-H. Tsang. The pre-image problem in kernel methods. IEEE Transactions on Neural Networks, 15(6):1517–1525, 2004.

  32. R. O. Duda and P. E. Hart. Pattern Classification and Scene Analysis. John Wiley & Sons, New York, 1973.

  33. S. Mika, G. Rätsch, J. Weston, B. Schölkopf, and K.-R. Müller. Fisher discriminant analysis with kernels. In Y.-H. Hu, J. Larsen, E. Wilson, and S. Douglas, editors, Neural Networks for Signal Processing IX: Proceedings of the 1999 IEEE Signal Processing Society Workshop, pages 41–48, 1999.

  34. G. Baudat and F. Anouar. Generalized discriminant analysis using a kernel approach. Neural Computation, 12(10):2385–2404, 2000.

  35. H. Li, T. Jiang, and K. Zhang. Efficient and robust feature extraction by maximum margin criterion. In S. Thrun, L. K. Saul, and B. Schölkopf, editors, Advances in Neural Information Processing Systems 16, pages 97–104. MIT Press, Cambridge, MA, 2004.

  36. J. Yang, A. F. Frangi, J.-Y. Yang, D. Zhang, and Z. Jin. KPCA plus LDA: A complete kernel Fisher discriminant framework for feature extraction and recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, 27(2):230–244, 2005.

  37. E. Pekalska and B. Haasdonk. Kernel discriminant analysis for positive definite and indefinite kernels. IEEE Transactions on Pattern Analysis and Machine Intelligence, 31(6):1017–1031, 2009.

Author information

Correspondence to Shigeo Abe.


Copyright information

© 2010 Springer-Verlag London Limited

About this chapter

Cite this chapter

Abe, S. (2010). Kernel-Based Methods. In: Support Vector Machines for Pattern Classification. Advances in Pattern Recognition. Springer, London. https://doi.org/10.1007/978-1-84996-098-4_6

  • DOI: https://doi.org/10.1007/978-1-84996-098-4_6

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-84996-097-7

  • Online ISBN: 978-1-84996-098-4

  • eBook Packages: Computer Science (R0)
