
S-shaped versus V-shaped transfer functions for binary Manta ray foraging optimization in feature selection problem

  • Original Article
  • Published in: Neural Computing and Applications

Abstract

Feature selection (FS) is one of the core concepts in machine learning and data mining, and it strongly influences the performance of classification models. Through FS, irrelevant or only partially relevant features can be eliminated, which in turn enhances model performance. Over the years, researchers have applied different meta-heuristic optimization techniques to FS, as these overcome the limitations of traditional optimization approaches. Following this trend, we introduce a new FS approach based on a recently proposed meta-heuristic algorithm called Manta ray foraging optimization (MRFO), which is inspired by the foraging behavior of manta rays, one of the largest known marine creatures. As MRFO was designed for continuous search spaces, we adapt a binary version of MRFO to the FS problem by applying eight different transfer functions belonging to two families: S-shaped and V-shaped. We evaluate the eight binary versions of MRFO on 18 standard UCI datasets. The best-performing version is then compared with 16 recently proposed meta-heuristic FS approaches. The results show that MRFO outperforms these state-of-the-art methods in terms of both classification accuracy and the number of selected features. The source code of this work is available at https://github.com/Rangerix/MetaheuristicOptimization.
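The abstract names the two transfer-function families but, being an abstract, does not show how they are used. As a rough illustration of how a transfer function maps a continuous MRFO position vector to a binary feature mask, the minimal Python sketch below uses the canonical S1 (sigmoid) and V1 (|tanh|) forms; the function and variable names are hypothetical and are not taken from the authors' repository, and the paper itself evaluates eight such functions across the two families.

import numpy as np

# Hypothetical illustration of S-shaped vs. V-shaped binarization rules;
# not the authors' implementation.

def s_shaped(x):
    # S1 transfer function: standard sigmoid, maps a real value into (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def v_shaped(x):
    # V1 transfer function: |tanh(x)|, symmetric about zero.
    return np.abs(np.tanh(x))

def binarize_s(position, rng):
    # S-shaped rule: each bit is set to 1 with probability T(x).
    prob = s_shaped(position)
    return (rng.random(position.shape) < prob).astype(int)

def binarize_v(position, current_bits, rng):
    # V-shaped rule: each bit is flipped with probability T(x);
    # otherwise the agent's current bit is kept.
    prob = v_shaped(position)
    flip = rng.random(position.shape) < prob
    return np.where(flip, 1 - current_bits, current_bits)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    continuous_position = rng.normal(size=10)   # one manta ray's continuous position
    current_mask = rng.integers(0, 2, size=10)  # its current binary feature mask
    print("S-shaped mask:", binarize_s(continuous_position, rng))
    print("V-shaped mask:", binarize_v(continuous_position, current_mask, rng))

The two families differ in how the transfer value is interpreted: the S-shaped rule treats T(x) as the probability that a bit equals 1, while the V-shaped rule treats it as the probability of flipping the current bit, which is why the V-shaped update also needs the agent's existing binary mask.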



Author information


Corresponding author

Correspondence to Suman Kumar Bera.

Ethics declarations

Conflict of interest

We wish to confirm that there are no known conflicts of interest associated with this publication and there has been no significant financial support for this work that could have influenced its outcome.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.



About this article


Cite this article

Ghosh, K.K., Guha, R., Bera, S.K. et al. S-shaped versus V-shaped transfer functions for binary Manta ray foraging optimization in feature selection problem. Neural Comput & Applic 33, 11027–11041 (2021). https://doi.org/10.1007/s00521-020-05560-9



  • DOI: https://doi.org/10.1007/s00521-020-05560-9

