
A monotonic measure for optimal feature selection

  • Huan Liu
  • Hiroshi Motoda
  • Manoranjan Dash
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1398)

Abstract

Feature selection is the problem of choosing a subset of relevant features. In general, only an exhaustive search guarantees finding the optimal subset. With a monotonic measure, exhaustive search can be avoided without sacrificing optimality. Unfortunately, most error- or distance-based measures are not monotonic. This work employs a new measure that is monotonic and fast to compute. The search for relevant features guided by this measure is guaranteed to be complete but not exhaustive. Experiments are conducted for verification.
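To make the pruning argument concrete, below is a minimal Python sketch of a complete but non-exhaustive subset search driven by a monotonic measure. The measure shown (a simple inconsistency rate over the projected data) and the level-wise search strategy are illustrative assumptions, not necessarily the measure or algorithm of the paper; the point is only that, because the measure cannot decrease as features are removed, a subset that violates the threshold can be pruned together with all of its own subsets without giving up optimality.

```python
# Illustrative sketch (not the paper's algorithm): a complete but non-exhaustive
# search over feature subsets, made safe by a monotonic measure. Assumption:
# inconsistency_rate(S) never decreases when features are removed from S, so a
# subset that violates the threshold can be pruned along with all its subsets.
from collections import Counter

def inconsistency_rate(data, labels, subset):
    """Hypothetical monotonic measure: fraction of samples that disagree with
    the majority label among samples sharing the same values on `subset`."""
    buckets = {}
    for row, y in zip(data, labels):
        key = tuple(row[i] for i in subset)
        buckets.setdefault(key, Counter())[y] += 1
    inconsistent = sum(sum(c.values()) - max(c.values()) for c in buckets.values())
    return inconsistent / len(data) if data else 0.0

def smallest_consistent_subset(data, labels, n_features, threshold):
    """Level-wise search: start from the full feature set and drop one feature
    at a time. A child subset is expanded only if it still satisfies the
    threshold; monotonicity guarantees no optimal subset is lost by pruning."""
    level = [tuple(range(n_features))]
    best = level[0]
    while level:
        next_level = set()
        for subset in level:
            for i in subset:
                child = tuple(f for f in subset if f != i)
                # Prune: if `child` is inconsistent, so is every subset of it.
                if child and inconsistency_rate(data, labels, child) <= threshold:
                    next_level.add(child)
        if next_level:
            best = min(next_level, key=len)  # all same size; any member works
        level = list(next_level)
    return best

# Example usage on a toy dataset: the label is XOR of features 0 and 1,
# so {0, 1} is the smallest subset that keeps the data consistent.
if __name__ == "__main__":
    data = [(0, 0, 1), (0, 1, 1), (1, 0, 0), (1, 1, 1)]
    labels = [0, 1, 1, 0]
    full_rate = inconsistency_rate(data, labels, (0, 1, 2))
    print(smallest_consistent_subset(data, labels, 3, threshold=full_rate))
```

The search is complete in that every subset is either evaluated or ruled out by a pruned ancestor, yet it is not exhaustive, since pruned branches are never expanded.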


Copyright information

© Springer-Verlag Berlin Heidelberg 1998

Authors and Affiliations

  • Huan Liu (1)
  • Hiroshi Motoda (2)
  • Manoranjan Dash (3)
  1. Dept of Info Sys & Comp Sci, National University of Singapore, Singapore
  2. Division of Intelligent Sys Sci, Osaka University, Ibaraki, Osaka, Japan
  3. BioInformatics Centre, National University of Singapore, Singapore
