Surrogate-Model Based Particle Swarm Optimisation with Local Search for Feature Selection in Classification

  • Hoai Bach Nguyen
  • Bing Xue
  • Peter Andreae
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10199)

Abstract

Evolutionary computation (EC) techniques have been widely applied to many problems because of their powerful search ability. However, EC-based algorithms are usually computationally intensive, especially when the fitness function is expensive to evaluate. To address this issue, many surrogate models have been proposed that reduce computation time by approximating the fitness function, but they have rarely been applied to EC-based feature selection. This paper develops a surrogate model for particle swarm optimisation (PSO) based wrapper feature selection by selecting a small number of instances to create a surrogate training set. Furthermore, based on the surrogate model, we propose a sampling local search, which improves the current best solution by utilising information from previous evolutionary iterations. Experiments on 10 datasets show that the surrogate training set reduces computation time without affecting classification performance, while the sampling local search results in a significantly smaller number of features, especially on large datasets. On most datasets, the combination of the two proposed ideas successfully reduces the number of features and achieves better performance than using all features, a recent sequential feature selection algorithm, the original PSO, and PSO with only one of the two ideas.
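As a rough illustration of the surrogate idea, the sketch below runs a minimal sigmoid-based binary PSO wrapper on synthetic data, scoring each candidate feature subset by leave-one-out 1-NN accuracy on a small random subset of instances that plays the role of the surrogate training set. The synthetic data, the PSO parameters, and the use of a purely random instance subset are all illustrative assumptions; this is not the paper's exact instance-selection method, PSO variant, or sampling local search.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data (illustrative): 5 informative features plus 15 noise features.
n, d = 200, 20
X = rng.normal(size=(n, d))
y = (X[:, :5].sum(axis=1) > 0).astype(int)

# Surrogate training set: a small random subset of instances, standing in
# for the paper's instance-selection step.
idx = rng.choice(n, size=50, replace=False)
Xs, ys = X[idx], y[idx]

def fitness(mask):
    """Leave-one-out 1-NN accuracy on the surrogate set for a 0/1 feature mask."""
    cols = mask.astype(bool)
    if not cols.any():
        return 0.0
    Z = Xs[:, cols]
    D = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=2)
    np.fill_diagonal(D, np.inf)      # exclude each instance from its own neighbours
    return float((ys[D.argmin(axis=1)] == ys).mean())

# Minimal sigmoid-based binary PSO (not the paper's exact variant).
m = 10                               # swarm size
swarm = (rng.random((m, d)) < 0.5).astype(float)
vel = np.zeros((m, d))
pbest, pfit = swarm.copy(), np.array([fitness(p) for p in swarm])
gbest = pbest[pfit.argmax()].copy()
for _ in range(30):
    r1, r2 = rng.random((m, d)), rng.random((m, d))
    vel = np.clip(0.7 * vel + 1.5 * r1 * (pbest - swarm)
                  + 1.5 * r2 * (gbest - swarm), -4.0, 4.0)
    swarm = (rng.random((m, d)) < 1.0 / (1.0 + np.exp(-vel))).astype(float)
    fit = np.array([fitness(p) for p in swarm])
    better = fit > pfit
    pbest[better], pfit[better] = swarm[better], fit[better]
    gbest = pbest[pfit.argmax()].copy()

print(int(gbest.sum()), "features selected, surrogate accuracy:", fitness(gbest))
```

Because every fitness evaluation touches only 50 of the 200 instances, each wrapper evaluation costs a fraction of what the full training set would; the paper's finding is that such a reduction need not hurt classification performance.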

Keywords

Feature selection · Particle swarm optimization · Surrogate model · Instance selection

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. School of Engineering and Computer Science, Victoria University of Wellington, Wellington, New Zealand
