Classifier-Independent Feature Selection Based on Non-parametric Discriminant Analysis
A novel algorithm for classifier-independent feature selection is proposed. Features that are effective for any kind of classifier can be selected in one of two ways: by correctly estimating the class-conditional probability densities, or by accurately estimating the discrimination boundary. This study takes the latter approach: it estimates the discrimination boundary and evaluates the effectiveness of each feature in terms of the normal vectors along that boundary. Several experiments confirmed the fundamental effectiveness of this approach.
Keywords: Feature Selection, Normal Vector, Recognition Rate, Gaussian Mixture Model, Feature Subset
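To make the boundary-normal idea concrete, the following is a minimal sketch in Python, not the authors' algorithm: it assumes Gaussian-mixture estimates of the class-conditional densities (mixtures appear in the paper's keywords), locates samples near the estimated boundary, takes a numerical gradient of the discriminant there as the local normal vector, and scores each feature by its mean absolute unit-normal component. The function name, its parameters (bandwidth via n_components, eps, h), and the synthetic data are illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def boundary_normal_scores(X, y, n_components=2, eps=0.5, h=1e-3):
    """Illustrative sketch: score features by normal vectors along the
    estimated discrimination boundary (two-class case)."""
    # Class-conditional density estimates (one mixture model per class).
    gmm0 = GaussianMixture(n_components=n_components, random_state=0).fit(X[y == 0])
    gmm1 = GaussianMixture(n_components=n_components, random_state=0).fit(X[y == 1])

    # Discriminant g(x) = log p(x | c1) - log p(x | c0); g = 0 on the boundary.
    def g(Z):
        return gmm1.score_samples(Z) - gmm0.score_samples(Z)

    # Keep the training points that lie close to the estimated boundary.
    B = X[np.abs(g(X)) < eps]
    if len(B) == 0:
        raise ValueError("no samples near the boundary; increase eps")

    # Central-difference gradient of g at each near-boundary point;
    # the gradient direction is the local normal of the boundary.
    d = X.shape[1]
    normals = np.empty((len(B), d))
    for j in range(d):
        step = np.zeros(d)
        step[j] = h
        normals[:, j] = (g(B + step) - g(B - step)) / (2.0 * h)
    normals /= np.maximum(np.linalg.norm(normals, axis=1, keepdims=True), 1e-12)

    # Features that the boundary "faces" get high mean |normal component|.
    return np.abs(normals).mean(axis=0)

# Toy check: only feature 0 separates the classes, so it should
# receive a much larger score than feature 1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0.0, 0.0], 1.0, size=(200, 2)),
               rng.normal([3.0, 0.0], 1.0, size=(200, 2))])
y = np.r_[np.zeros(200), np.ones(200)]
print(boundary_normal_scores(X, y))
```

Ranking features by these scores and keeping the top-scoring subset is one plausible way such boundary-normal information could drive classifier-independent selection; the actual criterion used in the paper may differ.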