
Feature selection via Lèvy Antlion optimization

  • Theoretical Advances
  • Published in Pattern Analysis and Applications

Abstract

In this paper, a modification of the recently proposed antlion optimization (ALO), based on Lévy flights, is introduced and applied to feature selection. ALO is one of the promising swarm intelligence algorithms; it uses random walks to perform its exploration and exploitation operations. Random walks drawn from a uniform distribution, however, are prone to premature convergence and stagnation. A Lévy flight random walk is therefore proposed as a replacement for the local search. The Lévy random walk allows the optimizer to generate solutions far from the existing ones, enabling it to escape local minima and to examine large search regions efficiently. The proposed Lévy antlion optimization (LALO) algorithm is applied in wrapper mode to select the feature combination that maximizes classification accuracy while minimizing the number of selected features. LALO is evaluated on 21 benchmark datasets against the genetic algorithm (GA), particle swarm optimization (PSO), and the native ALO. Different initialization methods and several evaluation criteria are employed to assess the diversification and intensification behavior of the optimization algorithms. The experimental results demonstrate a significant improvement of the proposed LALO over the native ALO and many well-known feature selection methods.
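The two ingredients described above, a heavy-tailed Lévy step replacing the uniform random walk, and a wrapper fitness that trades classification accuracy against subset size, can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the step generator uses Mantegna's algorithm with an assumed stability exponent `beta = 1.5`, and the fitness uses a common weighted form whose weight `alpha` is a hypothetical parameter, not a value taken from the paper.

```python
import math
import random

def levy_step(beta=1.5):
    """One Lévy-distributed step via Mantegna's algorithm.

    beta is the stability exponent (assumed value, 1 < beta < 2).
    Most steps are small, but occasional large jumps occur,
    which is what lets the search escape local minima.
    """
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))
             ) ** (1 / beta)
    u = random.gauss(0, sigma)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def levy_walk(position, step_scale=0.01):
    """Perturb each dimension of a candidate solution with a Lévy step."""
    return [x + step_scale * levy_step() for x in position]

def fitness(accuracy, n_selected, n_total, alpha=0.99):
    """Wrapper-mode objective: reward accuracy, penalize subset size.

    alpha is an assumed trade-off weight; higher fitness is better.
    """
    return alpha * accuracy + (1 - alpha) * (1 - n_selected / n_total)
```

In a wrapper setting, `accuracy` would come from cross-validating a classifier on the candidate feature subset; the Lévy walk perturbs candidate solutions between evaluations in place of a uniform random walk.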



 Acknowledgments

This work was partially supported by the IPROCOM Marie Curie initial training network, funded through the People Programme (Marie Curie Actions) of the European Union’s Seventh Framework Programme FP7/2007-2013/ under REA grant agreement No. 316555.

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Hossam M. Zawbaa.


About this article


Cite this article

Emary, E., Zawbaa, H.M. Feature selection via Lèvy Antlion optimization. Pattern Anal Applic 22, 857–876 (2019). https://doi.org/10.1007/s10044-018-0695-2

