
Efficient Hybrid Nature-Inspired Binary Optimizers for Feature Selection

  • Majdi Mafarja
  • Asma Qasem
  • Ali Asghar Heidari
  • Ibrahim Aljarah
  • Hossam Faris
  • Seyedali Mirjalili

Abstract

Dimensionality reduction is a crucial remedy for the high-dimensionality problem that arises with most machine learning techniques. This paper proposes an enhanced hybrid metaheuristic that combines the grey wolf optimizer (GWO) and the whale optimization algorithm (WOA) to develop a wrapper-based feature selection method. The main objective of the proposed technique is to alleviate the drawbacks of both algorithms, including premature convergence and stagnation in local optima (LO). The hybridization is accompanied by improvements to the mechanisms of both algorithms. To confirm the stability of the proposed approach, 18 well-known datasets from the UCI repository are employed. Furthermore, the classification accuracy, number of selected features, fitness values, and running times are recorded and compared with a set of well-known feature selection approaches from the literature. The results show the superiority of the proposed approach over both GWO and WOA, and that the proposed hybrid technique significantly outperforms other state-of-the-art approaches.
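
To make the wrapper formulation concrete, the sketch below shows the kind of fitness evaluation and S-shaped binarization commonly used in binary GWO/WOA feature selection studies. The KNN classifier, the weighting factor alpha, and the example dataset are illustrative assumptions, not the exact configuration of the proposed method.

```python
# Minimal sketch (assumed setup): wrapper fitness = weighted classification error
# plus feature-subset size, and a sigmoid transfer function to binarize positions.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier


def wrapper_fitness(mask, X, y, alpha=0.99):
    """Lower is better: alpha * error + (1 - alpha) * (selected / total features)."""
    if mask.sum() == 0:                # an empty subset gets the worst possible score
        return 1.0
    X_sub = X[:, mask.astype(bool)]
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5), X_sub, y, cv=5).mean()
    return alpha * (1.0 - acc) + (1.0 - alpha) * mask.sum() / X.shape[1]


def binarize(position, rng):
    """S-shaped (sigmoid) transfer function: map continuous positions to a 0/1 mask."""
    prob = 1.0 / (1.0 + np.exp(-position))
    return (rng.random(position.shape) < prob).astype(int)


if __name__ == "__main__":
    X, y = load_breast_cancer(return_X_y=True)   # stand-in dataset for illustration
    rng = np.random.default_rng(0)
    mask = binarize(rng.normal(size=X.shape[1]), rng)
    print("selected:", int(mask.sum()), "fitness:", round(wrapper_fitness(mask, X, y), 4))
```

In a hybrid GWO/WOA wrapper, each search agent would carry such a binary mask, and the fitness above (or a variant of it) would drive the leader-update and encircling steps of the two algorithms.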

Keywords

Whale optimization algorithm · Grey wolf optimizer · Optimization · Feature selection · Metaheuristics

Notes

Funding

This research was supported by the research committee at Birzeit University under grant number 250177.

Compliance with Ethical Standards

Conflict of Interest

The authors declare that they have no conflict of interest.

Informed Consent

All procedures followed were in accordance with the ethical standards of the responsible committee on human experimentation (institutional and national) and with the Helsinki Declaration of 1975, as revised in 2008 (5).

Human and Animal Rights

This article does not contain any studies with human or animal subjects performed by any of the authors.

Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. Department of Computer Science, Birzeit University, Birzeit, Palestine
  2. Faculty of Engineering and Technology, Birzeit University, Birzeit, Palestine
  3. School of Surveying and Geospatial Engineering, College of Engineering, University of Tehran, Tehran, Iran
  4. Department of Computer Science, School of Computing, National University of Singapore, Singapore, Singapore
  5. Business Information Technology Department, King Abdullah II School for Information Technology, The University of Jordan, Amman, Jordan
  6. School of Information and Communication Technology, Griffith University, Brisbane, Australia
