Boosting the training of neural networks through hybrid metaheuristics

  • Published in: Cluster Computing

Abstract

In this paper, the learning process of the multilayer perceptron (MLP) neural network is boosted using hybrid metaheuristic optimization algorithms. The learning process in MLP requires suitable settings of its weight and bias parameters. In the original version of MLP, gradient descent is used as the learner, and it suffers from two chronic problems: local minima and slow convergence. In this paper, six versions of memetic algorithms (MAs) are proposed to replace the gradient-descent learning mechanism of MLP, where adaptive \(\beta\)-hill climbing (A\(\beta\)HC), a local search algorithm, is hybridized with six population-based metaheuristics: the hybrid flower pollination algorithm, hybrid salp swarm algorithm, hybrid crow search algorithm, hybrid grey wolf optimization (HGWO), hybrid particle swarm optimization, and the hybrid JAYA algorithm. This demonstrates the effect of the proposed MA versions on the performance of MLP. To evaluate the proposed MA versions, 15 classification benchmark problems of different sizes and complexities are used. The A\(\beta\)HC algorithm is invoked in the improvement loop of each MA version with a probability controlled by the \(B_r\) parameter, which is varied to monitor its effect on the behavior of the proposed MA versions. The \(B_r\) setting that obtains the most promising results is then used to configure the hybrid MAs. The results show that the proposed MA versions outperform the original algorithms. Moreover, HGWO outperforms all other MA versions on almost all the datasets. In a nutshell, MAs are a good choice for training MLP to produce results with high accuracy.
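The training scheme described in the abstract can be sketched in miniature. The following is an illustrative sketch, not the authors' implementation: a population-based search evolves flat MLP weight vectors, and with probability \(B_r\) a \(\beta\)-hill-climbing-style local refinement is applied inside the improvement loop. The network size, the swarm-style move toward the best solution, and all parameter values (`bw`, `beta`, population size, iteration counts) are hypothetical choices for illustration only.

```python
# Hedged sketch: memetic training of a tiny MLP, with a beta-hill-climbing
# style local search invoked with probability B_r (all settings illustrative).
import math
import random

random.seed(0)

def mlp_predict(w, x, n_in=2, n_hid=3):
    """Tiny 1-hidden-layer sigmoid MLP; w is a flat weight+bias vector."""
    i = 0
    hidden = []
    for _ in range(n_hid):
        s = w[i + n_in]  # bias of this hidden neuron
        for j in range(n_in):
            s += w[i + j] * x[j]
        i += n_in + 1
        hidden.append(1.0 / (1.0 + math.exp(-s)))
    s = w[i + n_hid]  # output bias
    for h in range(n_hid):
        s += w[i + h] * hidden[h]
    return 1.0 / (1.0 + math.exp(-s))

def mse(w, data):
    """Fitness: mean squared error of the weight vector on the dataset."""
    return sum((mlp_predict(w, x) - y) ** 2 for x, y in data) / len(data)

def beta_hill_climb(w, data, iters=20, bw=0.3, beta=0.1):
    """betaHC-style refinement: neighbourhood move plus random beta mutation,
    accepting only improving candidates."""
    best, best_f = list(w), mse(w, data)
    for _ in range(iters):
        cand = [v + random.uniform(-bw, bw) if random.random() < 0.5 else v
                for v in best]
        cand = [random.uniform(-1, 1) if random.random() < beta else v
                for v in cand]
        f = mse(cand, data)
        if f < best_f:
            best, best_f = cand, f
    return best

def memetic_train(data, dim=13, pop=10, gens=30, B_r=0.3):
    """Memetic loop: swarm-style global moves + probabilistic local search."""
    population = [[random.uniform(-1, 1) for _ in range(dim)]
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda w: mse(w, data))
        leader = population[0]
        for k in range(1, pop):
            # simple move toward the current best solution
            population[k] = [v + random.uniform(0, 1) * (lv - v)
                             for v, lv in zip(population[k], leader)]
            if random.random() < B_r:  # invoke local search with prob. B_r
                population[k] = beta_hill_climb(population[k], data)
    return min(population, key=lambda w: mse(w, data))

# Toy usage: train on XOR (dim = 3*(2+1) + (3+1) = 13 parameters).
xor = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
w = memetic_train(xor)
print(round(mse(w, xor), 3))
```

The same skeleton applies to any of the six hybrids in the paper: only the global move in `memetic_train` changes (flower pollination, salp swarm, crow search, grey wolf, PSO, or JAYA), while the probabilistic A\(\beta\)HC refinement stays fixed.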


Data availability

The data that support the findings of this study are available from the corresponding author upon reasonable request.


Funding

This work was supported by the Deanship of Scientific Research & Innovation at Al-Zaytoonah University of Jordan through a grant to the second author (Grant No. 2022-2021/08/17).

Author information


Contributions

MAA-B: Conceptualization, Methodology, Software, Formal analysis, Investigation, Writing—original draft, Writing—review and editing. MAA: Programming, Methodology, Writing—original draft, Writing—review and editing. IAD: Methodology, Writing—original draft, Writing—review and editing. OAA: Statistical test, Writing—original draft, Writing—review and editing. AKA: Formal analysis, Investigation, Writing—original draft. SNM: Statistical test, Writing—original draft. ZAAA: Formal analysis, Investigation, Writing—original draft.

Corresponding author

Correspondence to Mohammed Azmi Al-Betar.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Al-Betar, M.A., Awadallah, M.A., Doush, I.A. et al. Boosting the training of neural networks through hybrid metaheuristics. Cluster Comput 26, 1821–1843 (2023). https://doi.org/10.1007/s10586-022-03708-x

