
Feature Selection Based on Kernel Discriminant Analysis

  • Masamichi Ashihara
  • Shigeo Abe
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4132)

Abstract

For two-class problems we propose two feature selection criteria based on kernel discriminant analysis (KDA). The first is the objective function of KDA and the second is the KDA-based exception ratio. We show that the KDA objective function is monotonic with respect to the deletion of features, which ensures stable feature selection. The KDA-based exception ratio measures the overlap between classes in the one-dimensional space obtained by KDA. Computer experiments show that both criteria select features well, but the former is more stable.
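Since the preview contains no derivations or pseudocode, the sketch below illustrates both criteria in Python under stated assumptions: an RBF kernel, a small ridge term reg on the within-class scatter, and a stopping threshold rho for backward selection, none of which are specified in the abstract. The names kda_fit, overlap_ratio, and backward_selection are illustrative, and overlap_ratio is only a crude proxy for the paper's exception ratio, which is defined via class and inhibition regions.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    """RBF kernel matrix between the rows of X1 and X2 (gamma is an assumed choice)."""
    sq = (np.sum(X1 ** 2, axis=1)[:, None]
          + np.sum(X2 ** 2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-gamma * sq)

def kda_fit(X, y, gamma=1.0, reg=1e-3):
    """Two-class kernel Fisher discriminant.

    Returns the coefficient vector a of the projection z(x) = sum_j a_j k(x_j, x)
    and the objective value J = d' N^{-1} d, the maximum of (a' M a)/(a' N a)
    with between-class matrix M = d d' and within-class scatter N.
    """
    y = np.asarray(y)
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    N = reg * np.eye(n)                    # ridge-regularized within-class scatter
    means = []
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        Kc = K[:, idx]
        means.append(Kc.mean(axis=1))      # kernelized class mean
        lc = len(idx)
        N += Kc @ (np.eye(lc) - np.full((lc, lc), 1.0 / lc)) @ Kc.T
    d = means[0] - means[1]
    a = np.linalg.solve(N, d)              # optimal direction up to scale
    return a, float(d @ a)

def overlap_ratio(X, y, gamma=1.0, reg=1e-3):
    """Crude proxy for the KDA-based exception ratio: the fraction of samples
    whose 1-D projection falls inside the projected range of the other class.
    The paper defines the exception ratio via class and inhibition regions;
    this simpler overlap measure is illustrative only."""
    y = np.asarray(y)
    a, _ = kda_fit(X, y, gamma, reg)
    z = rbf_kernel(X, X, gamma) @ a        # 1-D projections of all samples
    c0, c1 = (z[y == c] for c in np.unique(y))
    inside = np.sum((c0 >= c1.min()) & (c0 <= c1.max()))
    inside += np.sum((c1 >= c0.min()) & (c1 <= c0.max()))
    return inside / len(z)

def backward_selection(X, y, keep, gamma=1.0, rho=0.9):
    """Backward feature elimination guided by the KDA objective J.

    Each step deletes the feature whose removal decreases J the least and
    stops once J would drop below rho * J(full set). Because J is monotonic
    for feature deletion, the greedy search never oscillates. The stopping
    rule and rho are illustrative choices, not from the paper.
    """
    feats = list(range(X.shape[1]))
    _, J_full = kda_fit(X, y, gamma)
    while len(feats) > keep:
        scores = [(kda_fit(X[:, [f for f in feats if f != g]], y, gamma)[1], g)
                  for g in feats]
        best_J, victim = max(scores)       # cheapest deletion
        if best_J < rho * J_full:
            break
        feats.remove(victim)
    return feats
```

For example, backward_selection(X, y, keep=5) returns the indices of the surviving features; since the objective can only decrease as features are deleted, the greedy search terminates cleanly, which mirrors the stability claim above.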

Keywords

Support Vector Machine, Feature Selection, Inhibition Region, Machine Learning Research, Average Recognition Rate


Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Masamichi Ashihara
  • Shigeo Abe

  1. Graduate School of Science and Technology, Kobe University, Rokkodai, Nada, Kobe, Japan
