Hybrid SFO and TLBO optimization for biodegradable classification

Abstract

In this paper, a hybrid sunflower optimization and teaching-learning-based optimization (SFO-TLBO) algorithm is proposed, integrating the two well-known algorithms. The two update schemes are combined according to the ratio of a search agent's fitness value to the average fitness value of the population. SFO has a strong tendency toward global exploration but converges slowly, whereas TLBO converges quickly but may get stuck in local optima. Hybridizing SFO with TLBO therefore exploits the strengths of both algorithms and balances the trade-off between exploration and exploitation. The hybrid SFO-TLBO is first applied to nineteen benchmark test functions and compared with twelve state-of-the-art algorithms; it achieves better performance on seventeen of them. It is then used to solve an Internet of Vehicles (IoV) optimization problem, where it outperforms all fifteen state-of-the-art algorithms considered, demonstrating its applicability to large-scale, real-world problems. In addition, a binary version of hybrid SFO-TLBO is proposed for feature selection, in which a wrapper approach selects an optimal subset of features using a k-nearest neighbor classifier; the binary method also determines an optimal value of k for the classifier. The binary hybrid SFO-TLBO is applied to the QSAR biodegradation dataset for classification, and its performance is compared with other state-of-the-art algorithms to demonstrate its efficiency. The results are encouraging, with an average accuracy of 89.10%.
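The abstract describes the combination rule only at a high level: each search agent's fitness is compared against the population average, and that ratio governs whether the agent follows an SFO-style or a TLBO-style update. The sketch below is one plausible reading of that mechanism, assuming minimization, a switching threshold of 1 on the fitness ratio, simplified SFO (a step toward the best "sun" agent) and TLBO teacher-phase moves, and greedy replacement; the authors' exact update equations and decision rule may differ.

```python
import numpy as np

def hybrid_sfo_tlbo_step(pop, fitness, objective, lb, ub, rng):
    """One hypothetical iteration of the ratio-based hybrid (minimization assumed).

    pop      : (n, dim) array of candidate solutions
    fitness  : (n,) array of objective values for pop
    objective: callable mapping a 1-D solution to a scalar
    lb, ub   : scalar box bounds
    rng      : numpy.random.Generator
    """
    n, dim = pop.shape
    mean_fit = fitness.mean()
    best = pop[fitness.argmin()]            # plays the role of the "sun" / "teacher"
    mean_learner = pop.mean(axis=0)
    for i in range(n):
        ratio = fitness[i] / (mean_fit + 1e-12)
        if ratio > 1.0:
            # Worse-than-average agent: SFO-style step toward the best solution (exploration).
            diff = best - pop[i]
            direction = diff / (np.linalg.norm(diff) + 1e-12)
            cand = pop[i] + rng.random() * 0.1 * (ub - lb) * direction
        else:
            # Better-than-average agent: TLBO teacher-phase step (exploitation).
            tf = rng.integers(1, 3)         # teaching factor in {1, 2}
            cand = pop[i] + rng.random(dim) * (best - tf * mean_learner)
        cand = np.clip(cand, lb, ub)
        f_cand = objective(cand)
        if f_cand < fitness[i]:             # greedy replacement
            pop[i], fitness[i] = cand, f_cand
    return pop, fitness
```

A driver loop would initialize a uniform random population within [lb, ub], evaluate it once, and then repeat this step for a fixed budget of iterations while tracking the best solution found.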



Data availability

The data that support the findings of this study are available in the UCI Machine Learning Repository (https://archive.ics.uci.edu/ml).
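For illustration, the snippet below sketches the wrapper evaluation that the binary algorithm would optimize over this dataset, assuming the semicolon-delimited biodeg.csv copy from the repository (molecular descriptor columns followed by an 'RB'/'NRB' class label) and a scikit-learn k-nearest neighbor classifier with 5-fold cross-validation; the authors' exact preprocessing and validation protocol may differ.

```python
import numpy as np
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Assumed layout of the repository file: no header, ';' separator,
# descriptor columns followed by the class label in the last column.
data = pd.read_csv("biodeg.csv", sep=";", header=None)
X = data.iloc[:, :-1].to_numpy()
y = (data.iloc[:, -1] == "RB").astype(int).to_numpy()   # 1 = ready biodegradable

def wrapper_fitness(mask, k):
    """Wrapper objective for one candidate (binary feature mask, neighbor count k):
    mean cross-validated k-NN accuracy on the selected columns; empty masks score 0."""
    if mask.sum() == 0:
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=k)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=5).mean()

# Evaluating one random candidate; the binary hybrid SFO-TLBO would search
# jointly over the mask and k instead of sampling them at random.
rng = np.random.default_rng(0)
mask = rng.integers(0, 2, size=X.shape[1])
print(wrapper_fitness(mask, k=5))
```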


Funding

The authors did not receive support from any organization for the submitted work.

Author information

Corresponding author

Correspondence to Birmohan Singh.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Cite this article

Sharma, S.R., Singh, B. & Kaur, M. Hybrid SFO and TLBO optimization for biodegradable classification. Soft Comput 25, 15417–15443 (2021). https://doi.org/10.1007/s00500-021-06196-0

Keywords

  • Biodegradable molecules
  • IoV
  • k-Nearest neighbor classifier
  • Quantitative structure–activity relationship
  • Sunflower optimization
  • Teaching–learning-based optimization