Feature Selection Based on Kernel Discriminant Analysis
For two-class problems, we propose two feature selection criteria based on kernel discriminant analysis (KDA). The first is the objective function of KDA; the second is the KDA-based exception ratio. We show that the objective function of KDA is monotonic with respect to the deletion of features, which ensures stable feature selection. The KDA-based exception ratio measures the overlap between classes in the one-dimensional space obtained by KDA. Computer experiments show that both criteria select features well, but the former is more stable.
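The first criterion above can be sketched in code. The following is a minimal illustration, not the authors' implementation: `kda_objective` computes the Fisher ratio of the two classes after projection onto the one-dimensional KDA direction (an RBF kernel, the regularization constant, and all function names are assumptions), and `backward_select` performs the backward feature deletion the abstract describes, at each step removing the feature whose deletion leaves the objective largest.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """RBF (Gaussian) kernel matrix for the rows of X."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kda_objective(X, y, gamma=1.0, reg=1e-6):
    """Fisher ratio in the 1-D space found by two-class KDA.

    y holds labels 0/1. The within-class scatter is regularized by
    reg * I so the linear system is well conditioned (an assumption).
    """
    K = rbf_kernel(X, gamma)
    n = len(y)
    idx1, idx2 = np.where(y == 0)[0], np.where(y == 1)[0]
    m1 = K[:, idx1].mean(axis=1)          # kernelized class means
    m2 = K[:, idx2].mean(axis=1)
    # within-class scatter: sum over classes of K_c (I - 1/l_c) K_c^T
    N = np.zeros((n, n))
    for idx in (idx1, idx2):
        Kc = K[:, idx]
        l = len(idx)
        N += Kc @ (np.eye(l) - np.ones((l, l)) / l) @ Kc.T
    N += reg * np.eye(n)
    alpha = np.linalg.solve(N, m1 - m2)   # discriminant coefficients
    z = K @ alpha                          # 1-D projections of all samples
    z1, z2 = z[idx1], z[idx2]
    return (z1.mean() - z2.mean()) ** 2 / (z1.var() + z2.var() + 1e-12)

def backward_select(X, y, n_keep, gamma=1.0):
    """Backward deletion: drop the feature whose removal keeps the
    KDA objective largest, until n_keep features remain."""
    kept = list(range(X.shape[1]))
    while len(kept) > n_keep:
        scores = [(kda_objective(X[:, [f for f in kept if f != j]], y, gamma), j)
                  for j in kept]
        _, worst = max(scores)            # best objective after deleting j
        kept.remove(worst)
    return kept
```

On synthetic data where only one feature separates the classes, the backward deletion retains that feature, which is consistent with the monotonicity property claimed for the objective.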
Keywords: Support Vector Machine, Feature Selection, Inhibition Region, Machine Learning Research, Average Recognition Rate