Class-Specific Feature Selection Using Simulated Annealing
This paper proposes a method for identifying the features that are important for each class, i.e., selecting a feature subset specific to each class. The selection is carried out using simulated annealing: the algorithm is run separately for each class, yielding a feature subset for that class. A test pattern is classified by running a classifier for each class and combining the results; the 1NN classifier is used as the classification algorithm. Results are reported on eight benchmark datasets from the UCI repository. Besides giving good classification accuracy, the selected features give an idea of which features are important for each class.
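The per-class procedure described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a leave-one-out 1NN accuracy as the annealing objective, a single-bit flip as the neighbourhood move, and a geometric cooling schedule; the function names (`knn_accuracy`, `anneal_features`) and all parameter values are hypothetical choices for the sketch.

```python
import numpy as np

def knn_accuracy(X, y, features):
    """Leave-one-out 1NN accuracy using only the selected feature columns."""
    Xs = X[:, features]
    # Pairwise Euclidean distances; exclude each point as its own neighbour.
    d = np.linalg.norm(Xs[:, None, :] - Xs[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)
    nn = d.argmin(axis=1)
    return float((y[nn] == y).mean())

def anneal_features(X, y, n_iter=200, t0=1.0, cooling=0.95, rng=None):
    """Simulated-annealing search for a feature subset (sketch).

    For the class-specific setting, y is a binary one-vs-rest label
    vector for the class being processed.
    """
    rng = np.random.default_rng(rng)
    n_feat = X.shape[1]
    mask = rng.random(n_feat) < 0.5          # random initial subset
    if not mask.any():
        mask[rng.integers(n_feat)] = True
    cur = knn_accuracy(X, y, np.flatnonzero(mask))
    best, best_mask = cur, mask.copy()
    t = t0
    for _ in range(n_iter):
        cand = mask.copy()
        cand[rng.integers(n_feat)] ^= True   # flip one feature in or out
        if not cand.any():
            continue
        acc = knn_accuracy(X, y, np.flatnonzero(cand))
        # Accept improvements always; accept worse subsets with
        # probability exp((acc - cur) / t), which shrinks as t cools.
        if acc > cur or rng.random() < np.exp((acc - cur) / t):
            mask, cur = cand, acc
            if acc > best:
                best, best_mask = acc, cand.copy()
        t *= cooling
    return np.flatnonzero(best_mask), best
```

Running `anneal_features` once per class (with one-vs-rest labels) yields one subset per class; a test pattern would then be scored by a per-class 1NN on each subset and the per-class results combined, as the abstract describes.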