Classifier-Independent Feature Selection Based on Non-parametric Discriminant Analysis

  • Naoto Abe
  • Mineichi Kudo
  • Masaru Shimbo
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2396)

Abstract

A novel algorithm for classifier-independent feature selection is proposed. There are two possible ways to select features that are effective for any kind of classifier: one is to estimate the class-conditional probability densities correctly, and the other is to estimate the discrimination boundary accurately. This study takes the latter approach: it finds the discrimination boundary and evaluates the effectiveness of each feature in terms of the normal vectors along that boundary. The fundamental effectiveness of the approach was confirmed by several experiments.
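
To illustrate the idea behind boundary-normal feature scoring, the following is a minimal sketch in Python. It is not the paper's algorithm (which builds on nonparametric discriminant analysis); it merely uses the direction from each sample to its nearest opposite-class neighbour as a crude local estimate of the boundary normal, and scores each feature by the average absolute component of these unit normals. The function name feature_scores and the toy data are illustrative assumptions.

```python
import numpy as np

def feature_scores(X, y):
    """Score features by the average absolute component of approximate
    boundary-normal directions (a rough stand-in for the paper's
    nonparametric-discriminant-analysis-based estimate)."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    normals = []
    for i in range(len(X)):
        other = X[y != y[i]]                        # samples of the opposite class(es)
        d = other - X[i]
        j = np.argmin(np.einsum('ij,ij->i', d, d))  # nearest opposite-class neighbour
        v = d[j]
        n = np.linalg.norm(v)
        if n > 0:
            normals.append(np.abs(v) / n)           # unit estimate of the local normal
    return np.mean(normals, axis=0)                 # one score per feature

# Toy usage: feature 0 separates the classes, feature 1 is pure noise,
# so the score for feature 0 should dominate.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal([4, 0], 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
print(feature_scores(X, y))
```

Features whose components dominate the boundary normals are the ones along which the classes actually separate, which is why such scores are meaningful regardless of the classifier applied afterwards.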

Keywords

Feature Selection · Normal Vector · Recognition Rate · Gaussian Mixture Model · Feature Subset

Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • Naoto Abe (1)
  • Mineichi Kudo (1)
  • Masaru Shimbo (2)
  1. Division of Systems and Information Engineering, Graduate School of Engineering, Hokkaido University, Sapporo, Japan
  2. Faculty of Information Media, Hokkaido Information University, Ebetsu, Japan
