Filter Methods for Feature Selection – A Comparative Study
Adequate feature selection can improve both the accuracy and the efficiency of classification methods. There are two main approaches to feature selection: wrapper methods, in which features are selected using the classifier itself, and filter methods, in which feature selection is independent of the classifier used. Although the wrapper approach may achieve better performance, it demands greater computational resources. For this reason, a hybrid paradigm that combines filter and wrapper methods has recently emerged. One of its open problems is choosing the filter method that provides the best relevance index for each case, which is not an easy question to settle, since different approaches to relevance evaluation lead to a large number of indices for ranking and selection. In this paper, several filter methods are applied to artificial data sets with varying numbers of relevant features, levels of noise in the output, interaction between features, and increasing numbers of samples. The results obtained with the four filters studied (ReliefF, Correlation-based Feature Selection, Fast Correlation-Based Filter, and INTERACT) are compared and discussed. The final aim of this study is to select a filter with which to construct a hybrid method for feature selection.
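To make the filter paradigm concrete, the following is a minimal sketch (not taken from the paper) of a classifier-independent relevance index: each feature is ranked by the absolute Pearson correlation between its values and the output, on a small synthetic data set with one relevant and one irrelevant feature. The data-generating setup and variable names are illustrative assumptions, not the paper's experimental design.

```python
import random

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(0)
n = 200
# Hypothetical synthetic data: the output depends on 'relevant' plus a
# little noise, while 'irrelevant' is pure noise.
relevant = [random.gauss(0, 1) for _ in range(n)]
irrelevant = [random.gauss(0, 1) for _ in range(n)]
target = [r + random.gauss(0, 0.1) for r in relevant]

features = {"relevant": relevant, "irrelevant": irrelevant}
# Filter-style ranking: score features by |correlation| with the output,
# without ever consulting a classifier.
ranking = sorted(features, key=lambda f: abs(pearson(features[f], target)),
                 reverse=True)
print(ranking)  # the relevant feature ranks ahead of the irrelevant one
```

A univariate index like this is the simplest case; the filters studied in the paper (e.g. ReliefF or INTERACT) use more elaborate relevance measures precisely because correlation alone cannot detect feature interactions.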
Keywords: Feature Selection · Relevant Feature · Feature Subset · Filter Method · Irrelevant Feature
- 1. Guyon, I., Gunn, S., Nikravesh, M., Zadeh, L. (eds.): Feature Extraction: Foundations and Applications. Springer, Heidelberg (2006)
- 4. Kira, K., Rendell, L.: A practical approach to feature selection. In: Proceedings of the Ninth International Conference on Machine Learning, pp. 249–256 (1992)
- 5. Kononenko, I.: Estimating attributes: Analysis and extensions of RELIEF. In: Bergadano, F., De Raedt, L. (eds.) Machine Learning: ECML-94. LNCS, vol. 784, pp. 171–182. Springer, Heidelberg (1994)
- 7. Hall, M.A.: Correlation-based Feature Selection for Machine Learning. PhD thesis, University of Waikato, Hamilton, New Zealand (1999)
- 8. Yu, L., Liu, H.: Feature selection for high-dimensional data: A fast correlation-based filter solution. In: Proceedings of the Twentieth International Conference on Machine Learning (ICML), pp. 856–863 (2003)
- 10. Zhao, Z., Liu, H.: Searching for interacting features. In: Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI), pp. 1156–1161 (2007)
- 11. Quevedo, J.R., Bahamonde, A., Luaces, O.: A simple and efficient method for variable ranking according to their usefulness for learning. Computational Statistics and Data Analysis (in press, 2007)
- 13. WEKA Machine Learning Project. Last accessed September 2007, http://www.cs.waikato.ac.nz/~ml/
- 14. Liu, H.: Searching for interacting features (INTERACT software). Last accessed September 2007, http://www.public.asu.edu/~huanliu/INTERACT/INTERACTsoftware.html