Driven Forward Features Selection: A Comparative Study on Neural Networks

  • Vincent Lemaire
  • Raphael Féraud
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4233)


In the field of neural networks, feature selection has been studied for the last ten years, using classical as well as original methods. This paper reviews the efficiency of four approaches to driven forward feature selection on neural networks. We assess the efficiency of these methods compared to the simple Pearson criterion in the case of a regression problem.
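To make the comparison concrete, the sketch below illustrates the two ingredients the abstract names: the baseline Pearson criterion (ranking features by absolute correlation with the target) and a generic greedy forward-selection loop. The function names, the least-squares scoring used in the usage example, and the stopping rule are illustrative assumptions, not the paper's actual procedures; in the paper, the forward search would be driven by scores derived from a trained neural network.

```python
import numpy as np

def pearson_ranking(X, y):
    """Rank features by absolute Pearson correlation with the target.

    This mirrors the simple Pearson baseline; the neural-network-driven
    variants studied in the paper are not reproduced here.
    """
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    # Pearson correlation of each column of X with y
    r = (Xc * yc[:, None]).sum(axis=0) / (
        np.sqrt((Xc ** 2).sum(axis=0)) * np.sqrt((yc ** 2).sum())
    )
    return np.argsort(-np.abs(r))  # indices, best feature first

def forward_selection(X, y, score_fn, k):
    """Greedy driven forward selection (illustrative): at each step, add
    the candidate feature whose inclusion maximises score_fn on the
    currently selected subset, until k features are chosen."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < k:
        best = max(remaining,
                   key=lambda j: score_fn(X[:, selected + [j]], y))
        selected.append(best)
        remaining.remove(best)
    return selected
```

In a wrapper setting, `score_fn` would retrain or probe a neural network on the candidate subset; for a quick regression sanity check, an R² score from an ordinary least-squares fit can stand in.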


Keywords: Neural Network · Feature Selection · Variable Selection · Feature Selection Method · Regression Problem




Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Vincent Lemaire¹
  • Raphael Féraud¹
  1. France Télécom R&D, Lannion
