A Criterion for Learning the Data-Dependent Kernel for Classification

  • Conference paper
Advanced Data Mining and Applications (ADMA 2007)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 4632)


Abstract

A novel criterion, the Maximum Margin Criterion (MMC), is proposed for learning a data-dependent kernel for classification. Different kernels induce different geometrical structures of the data in the feature space and therefore yield different degrees of class discrimination, so the choice of kernel greatly influences the performance of kernel-based learning, and optimizing the kernel is an effective way to improve classification performance. In this paper, we propose a novel kernel optimization method based on the maximum margin criterion that overcomes the limitation of Xiong’s work [1], in which the optimal solution has to be found by an iterative update algorithm because of a matrix singularity problem. Our method obtains a unique optimal solution by solving a single eigenvalue problem, improving performance while reducing computation time. Experimental results show that the proposed algorithm achieves better performance with lower computational cost than Xiong’s method.
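The abstract describes the approach only at a high level. As a purely illustrative aid, the Python sketch below shows one way a data-dependent kernel of the conformal form k_d(x, y) = q(x) q(y) k_0(x, y) (Amari and Wu [21], as used by Xiong et al. [18]) can be optimized with a margin-style trace criterion that reduces to a single symmetric eigenvalue problem. The anchor points, the scatter construction, and the function names are assumptions made for the sketch, not the paper's exact formulation.

```python
import numpy as np

def rbf(X, Y, gamma=1.0):
    """Basic RBF kernel matrix k_0 between rows of X and rows of Y."""
    d2 = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * d2)

def optimize_kernel_coefficients(X, y, anchors, gamma=1.0):
    """Illustrative sketch: choose the expansion coefficients alpha of
    q(x) = alpha_0 + sum_i alpha_i k_1(x, a_i) by maximizing a
    between-minus-within trace criterion, solved as one eigenvalue problem."""
    n = X.shape[0]
    K0 = rbf(X, X, gamma)                                      # basic kernel matrix
    K1 = np.hstack([np.ones((n, 1)), rbf(X, anchors, gamma)])  # [1, k_1(x, a_i)] per row

    # Between-class (B) and within-class (W) scatter of the rows of K0
    # (an empirical-feature-space stand-in; the paper's exact matrices may differ).
    mean_all = K0.mean(axis=0)
    B = np.zeros((n, n))
    W = np.zeros((n, n))
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        mean_c = K0[idx].mean(axis=0)
        B += len(idx) * np.outer(mean_c - mean_all, mean_c - mean_all)
        for i in idx:
            W += np.outer(K0[i] - mean_c, K0[i] - mean_c)

    # Margin-style criterion J(alpha) = alpha^T K1^T (B - W) K1 alpha.
    # Over unit-norm alpha, its maximizer is the leading eigenvector of a
    # symmetric matrix, so no iterative update or matrix inversion is needed.
    M = K1.T @ (B - W) @ K1
    M = 0.5 * (M + M.T)
    _, eigvecs = np.linalg.eigh(M)
    alpha = eigvecs[:, -1]                                      # unique up to sign

    q = K1 @ alpha                                              # q(x_i) on training data
    K_opt = np.outer(q, q) * K0                                 # optimized data-dependent kernel
    return alpha, K_opt

# Toy usage on two synthetic classes with a few anchor points.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (20, 2)), rng.normal(3.0, 1.0, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
alpha, K_opt = optimize_kernel_coefficients(X, y, anchors=X[::8], gamma=0.5)
```

Because the criterion is a quadratic form in the expansion coefficients, its maximizer is obtained in closed form from one eigen-decomposition, which is the sense in which an eigenvalue-problem formulation avoids the iterative updates and matrix-singularity issues mentioned in the abstract.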


References

  1. Yang, J., Frangi, A.F., Yang, J.-y., Zhang, D., Jin, Z.: KPCA Plus LDA: A Complete Kernel Fisher Discriminant Framework for Feature Extraction and Recognition. IEEE Trans. Pattern Analysis and Machine Intelligence 27(2), 230–244 (2005)

  2. Liu, Q., Lu, H., Ma, S.: Improving kernel Fisher discriminant analysis for face recognition. IEEE Trans. Pattern Analysis and Machine Intelligence 14(1), 42–49 (2004)

  3. Vapnik, V.: The Nature of Statistical Learning Theory. Springer, New York (1995)

  4. Müller, K.R., Mika, S., Rätsch, G., Tsuda, K., Schölkopf, B.: An introduction to kernel-based learning algorithms. IEEE Trans. Neural Networks 12, 181–201 (2001)

  5. Schölkopf, B., Smola, A., Müller, K.R.: Nonlinear Component Analysis as a Kernel Eigenvalue Problem. Neural Computation 10(5), 1299–1319 (1998)

  6. Mika, S., Rätsch, G., Weston, J., Schölkopf, B., Müller, K.-R.: Fisher Discriminant Analysis with Kernels. In: Proc. IEEE Int’l Workshop Neural Networks for Signal Processing IX, August 1999, pp. 41–48. IEEE Computer Society Press, Los Alamitos (1999)

  7. Lu, J., Plataniotis, K.N., Venetsanopoulos, A.N.: Face recognition using kernel direct discriminant analysis algorithms. IEEE Trans. Neural Networks 14(1), 117–126 (2003)

  8. Baudat, G., Anouar, F.: Generalized Discriminant Analysis Using a Kernel Approach. Neural Computation 12(10), 2385–2404 (2000)

  9. Liang, Z., Shi, P.: Uncorrelated discriminant vectors using a kernel method. Pattern Recognition 38, 307–310 (2005)

  10. Liang, Z., Shi, P.: Efficient algorithm for kernel discriminant analysis. Pattern Recognition 37(2), 381–384 (2004)

  11. Liang, Z., Shi, P.: An efficient and effective method to solve kernel Fisher discriminant analysis. Neurocomputing 61, 485–493 (2004)

  12. Lu, J., Plataniotis, K.N., Venetsanopoulos, A.N.: Face Recognition Using Kernel Direct Discriminant Analysis Algorithms. IEEE Trans. Neural Networks 14(1), 117–126 (2003)

  13. Yang, M.H.: Kernel Eigenfaces vs. Kernel Fisherfaces: Face Recognition Using Kernel Methods. In: Proc. Fifth IEEE Int’l Conf. Automatic Face and Gesture Recognition, May 2002, pp. 215–220 (2002)

  14. Zheng, W., Zou, C., Zhao, L.: Weighted maximum margin discriminant analysis with kernels. Neurocomputing 67, 357–362 (2005)

  15. Huang, J., Yuen, P.C., Chen, W.-S., Lai, J.H.: Kernel Subspace LDA with Optimized Kernel Parameters on Face Recognition. In: Proceedings of the Sixth IEEE International Conference on Automatic Face and Gesture Recognition, IEEE Computer Society Press, Los Alamitos (2004)

  16. Wang, L., Chan, K.L., Xue, P.: A Criterion for Optimizing Kernel Parameters in KBDA for Image Retrieval. IEEE Trans. Systems, Man and Cybernetics-Part B: Cybernetics 35(3), 556–562 (2005)

  17. Chen, W.-S., Yuen, P.C., Huang, J., Dai, D.-Q.: Kernel Machine-Based One-Parameter Regularized Fisher Discriminant Method for Face Recognition. IEEE Trans. Systems, Man and Cybernetics-Part B: Cybernetics 35(4), 658–669 (2005)

  18. Xiong, H., Swamy, M.N.S., Ahmad, M.O.: Optimizing the Kernel in the Empirical Feature Space. IEEE Trans. Neural Networks 16(2), 460–474 (2005)

  19. Samaria, F., Harter, A.: Parameterisation of a Stochastic Model for Human Face Identification. In: Proceedings of 2nd IEEE Workshop on Applications of Computer Vision, Sarasota, FL (December 1994)

  20. Li, H., Jiang, T., Zhang, K.: Efficient and Robust Feature Extraction by Maximum Margin Criterion. IEEE Trans. Neural Networks 17(1), 157–165 (2006)

  21. Amari, S., Wu, S.: Improving support vector machine classifiers by modifying kernel functions. Neural Netw. 12(6), 783–789 (1999)

Copyright information

© 2007 Springer Berlin Heidelberg

About this paper

Cite this paper

Li, J.-B., Chu, S.-C., Pan, J.-S. (2007). A Criterion for Learning the Data-Dependent Kernel for Classification. In: Alhajj, R., Gao, H., Li, J., Li, X., Zaïane, O.R. (eds) Advanced Data Mining and Applications. ADMA 2007. Lecture Notes in Computer Science, vol 4632. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-73871-8_34

  • DOI: https://doi.org/10.1007/978-3-540-73871-8_34

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-73870-1

  • Online ISBN: 978-3-540-73871-8
