
Using Ensemble Feature Selection Approach in Selecting Subset with Relevant Features

  • Mohammed Attik
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3971)

Abstract

This study addresses feature selection, one of the most fundamental problems in machine learning. Two novel approaches for selecting a subset of relevant features are proposed; both can be viewed as direct extensions of the ensemble feature selection approach. The first identifies relevant features using a single feature selection method, while the second combines different feature selection methods in order to identify the relevant features more reliably. An illustration on artificial databases, where the relevant features are known a priori, shows the effectiveness of the proposed methods.
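The abstract does not include code, but the two variants can be sketched at a high level. The Python example below is a minimal, illustrative sketch, not the author's implementation: it assumes scikit-learn selectors (SelectKBest with the ANOVA F-score and mutual information, plus random-forest importances as a third voter) and a simple majority-vote aggregation rule; both the choice of selectors and the aggregation rule are assumptions made here for illustration.

```python
# Illustrative sketch of the two ensemble feature-selection variants
# described in the abstract (not the author's implementation).
# Assumed here: scikit-learn selectors and a majority-vote aggregation rule.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif, mutual_info_classif
from sklearn.utils import resample


def single_method_ensemble(X, y, k, n_rounds=10, seed=0):
    """Variant 1: one selection method applied to many bootstrap resamples."""
    rng = np.random.RandomState(seed)
    votes = np.zeros(X.shape[1])
    for _ in range(n_rounds):
        Xb, yb = resample(X, y, random_state=rng)
        votes += SelectKBest(f_classif, k=k).fit(Xb, yb).get_support()
    return votes >= n_rounds / 2  # keep a feature selected in most rounds


def multi_method_ensemble(X, y, k):
    """Variant 2: several different selection methods, one vote each."""
    votes = np.zeros(X.shape[1])
    for scorer in (f_classif, mutual_info_classif):
        votes += SelectKBest(scorer, k=k).fit(X, y).get_support()
    # Third voter: an embedded method (random-forest importances), top-k.
    forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
    votes[np.argsort(forest.feature_importances_)[-k:]] += 1
    return votes >= 2  # keep a feature if at least 2 of the 3 methods agree


# Artificial data with known relevant features (first 3 when shuffle=False),
# mirroring the paper's evaluation setup on artificial databases.
X, y = make_classification(n_samples=300, n_features=10, n_informative=3,
                           n_redundant=0, shuffle=False, random_state=0)
print(single_method_ensemble(X, y, k=3))
print(multi_method_ensemble(X, y, k=3))
```

Because the dataset is generated with shuffle=False, the three informative features occupy the first three columns, so both variants should return a mask whose first three entries are True.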

Keywords

Feature Selection · Relevant Feature · Feature Subset · Feature Selection Method · Neural Information Processing System

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Mohammed Attik
  1. LORIA/INRIA-Lorraine, Vandœuvre-lès-Nancy, France
