Abstract
Dimensionality reduction is a crucial step for coping with the curse of dimensionality that affects most machine learning techniques. This paper proposes an enhanced hybrid metaheuristic that combines the grey wolf optimizer (GWO) and the whale optimization algorithm (WOA) to build a wrapper-based feature selection method. The main objective of the proposed technique is to alleviate the drawbacks of both algorithms, including premature convergence and stagnation in local optima (LO). The hybridization is accompanied by improvements to the search mechanisms of both algorithms. To confirm the stability of the proposed approach, 18 well-known datasets from the UCI repository are employed. Furthermore, classification accuracy, number of selected features, fitness values, and runtime metrics are collected and compared against a set of well-known feature selection approaches from the literature. The results show the superiority of the proposed approach over both GWO and WOA, and that the proposed hybrid technique significantly outperforms other state-of-the-art approaches.
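In a wrapper-based feature selection scheme like the one the abstract describes, each candidate solution is a binary mask over the features, scored by a classifier's error plus a penalty on the number of selected features. A minimal sketch of such a wrapper fitness follows; the 1-NN classifier, the weight `alpha`, and the toy data are illustrative assumptions, not the paper's exact experimental setup:

```python
# Hypothetical sketch of a wrapper fitness commonly used in
# metaheuristic feature selection: a weighted sum of classification
# error and the fraction of selected features.

def one_nn_error(X_train, y_train, X_test, y_test, mask):
    """1-NN test error computed using only features where mask[j] == 1."""
    idx = [j for j, m in enumerate(mask) if m]
    if not idx:
        return 1.0  # selecting no features is treated as the worst error
    errors = 0
    for x, y in zip(X_test, y_test):
        # nearest training sample by squared Euclidean distance on selected features
        best = min(range(len(X_train)),
                   key=lambda i: sum((X_train[i][j] - x[j]) ** 2 for j in idx))
        if y_train[best] != y:
            errors += 1
    return errors / len(X_test)

def fitness(mask, X_train, y_train, X_test, y_test, alpha=0.99):
    """Lower is better: alpha * error + (1 - alpha) * selected-feature ratio."""
    err = one_nn_error(X_train, y_train, X_test, y_test, mask)
    ratio = sum(mask) / len(mask)
    return alpha * err + (1 - alpha) * ratio

# Toy example: feature 0 separates the classes, feature 1 is noise.
X_train = [[0.0, 3.0], [0.1, 9.0], [1.0, 5.0], [0.9, 1.0]]
y_train = [0, 0, 1, 1]
X_test = [[0.05, 8.0], [0.95, 2.0]]
y_test = [0, 1]

# The mask keeping only the informative feature scores best.
best_mask = min([[1, 0], [0, 1], [1, 1]],
                key=lambda m: fitness(m, X_train, y_train, X_test, y_test))
print(best_mask)  # [1, 0]
```

A binary GWO/WOA would evolve such masks, using this fitness to rank wolves/whales; the large `alpha` reflects the usual design choice of prioritizing accuracy over subset size.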
Funding
This research was supported by the research committee at Birzeit University with a grant number 250177.
Ethics declarations
Conflict of interests
The authors declare that they have no conflict of interest.
Informed Consent
All procedures followed were in accordance with the ethical standards of the responsible committee on human experimentation (institutional and national) and with the Helsinki Declaration of 1975, as revised in 2008 (5).
Human and Animal Rights
This article does not contain any studies with human or animal subjects performed by any of the authors.
Cite this article
Mafarja, M., Qasem, A., Heidari, A.A. et al. Efficient Hybrid Nature-Inspired Binary Optimizers for Feature Selection. Cogn Comput 12, 150–175 (2020). https://doi.org/10.1007/s12559-019-09668-6