
Efficient Hybrid Nature-Inspired Binary Optimizers for Feature Selection

Cognitive Computation

Abstract

Dimensionality reduction is a crucial remedy for the high-dimensionality problem faced by the majority of machine learning techniques. This paper proposes an enhanced hybrid metaheuristic that combines the grey wolf optimizer (GWO) and the whale optimization algorithm (WOA) into a wrapper-based feature selection method. The main objective of the proposed technique is to alleviate the drawbacks of both algorithms, including premature convergence and stagnation in local optima (LO). The hybridization is accompanied by improvements to the search mechanisms of both algorithms. To confirm the stability of the proposed approach, 18 well-known datasets from the UCI repository are employed. Furthermore, classification accuracy, number of selected features, fitness values, and run time metrics are collected and compared with a set of well-known feature selection approaches from the literature. The results show the superiority of the proposed approach over both GWO and WOA, and that the proposed hybrid technique significantly outperforms other state-of-the-art approaches.
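The abstract describes a wrapper-based binary feature selection method built on GWO and WOA. A minimal sketch of the two ingredients such binary wrappers typically share is given below: an S-shaped (sigmoid) transfer function that turns a continuous search-agent position into a 0/1 feature mask, and a fitness function that trades classification error against the selected-feature ratio. The toy dataset, the leave-one-out 1-NN evaluator, and the weight `alpha = 0.99` are illustrative assumptions in the spirit of this literature, not the paper's exact implementation.

```python
import math
import random

random.seed(7)

# Toy data: features 0-1 determine the label, features 2-5 are pure noise
# (the dataset itself is an illustrative assumption, not from the paper).
n_samples, n_noise = 60, 4
X, y = [], []
for _ in range(n_samples):
    inf = [random.gauss(0, 1), random.gauss(0, 1)]
    X.append(inf + [random.gauss(0, 1) for _ in range(n_noise)])
    y.append(1 if inf[0] + inf[1] > 0 else 0)

def loo_1nn_error(X, y, mask):
    """Leave-one-out 1-NN classification error using only features with mask[j] == 1."""
    idx = [j for j, m in enumerate(mask) if m]
    if not idx:
        return 1.0  # empty subset -> worst possible error
    errors = 0
    for i in range(len(y)):
        best_d, best_j = float("inf"), -1
        for j in range(len(y)):
            if j == i:
                continue
            d = sum((X[i][k] - X[j][k]) ** 2 for k in idx)
            if d < best_d:
                best_d, best_j = d, j
        errors += y[best_j] != y[i]
    return errors / len(y)

def fitness(mask, X, y, alpha=0.99):
    # Common wrapper objective: weighted classification error plus
    # selected-feature ratio; alpha = 0.99 is an assumed trade-off weight.
    return alpha * loo_1nn_error(X, y, mask) + (1 - alpha) * sum(mask) / len(mask)

def s_transfer(v):
    # S-shaped transfer function: continuous position -> probability of bit = 1.
    return 1.0 / (1.0 + math.exp(-v))

def binarize(position):
    # Stochastic thresholding of a continuous search-agent position.
    return [1 if random.random() < s_transfer(v) else 0 for v in position]

informative = [1, 1, 0, 0, 0, 0]
all_noise = [0, 0, 1, 1, 1, 1]
print(fitness(informative, X, y) < fitness(all_noise, X, y))  # informative subset should score lower
```

In a full optimizer, each wolf/whale position would be updated by the GWO/WOA equations in continuous space, passed through `binarize`, and scored with `fitness`; the lower-fitness masks guide the next iteration.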

[Figures 1–12: thumbnail previews only; captions are not available in this page extract.]



Funding

This research was supported by the research committee at Birzeit University under grant number 250177.

Author information


Corresponding author

Correspondence to Ibrahim Aljarah.

Ethics declarations

Conflict of Interest

The authors declare that they have no conflict of interest.

Informed Consent

All procedures followed were in accordance with the ethical standards of the responsible committee on human experimentation (institutional and national) and with the Helsinki Declaration of 1975, as revised in 2008 (5).

Human and Animal Rights

This article does not contain any studies with human or animal subjects performed by any of the authors.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Mafarja, M., Qasem, A., Heidari, A.A. et al. Efficient Hybrid Nature-Inspired Binary Optimizers for Feature Selection. Cogn Comput 12, 150–175 (2020). https://doi.org/10.1007/s12559-019-09668-6
