
Soft Computing, Volume 23, Issue 23, pp 12331–12345

MLP-LOA: a metaheuristic approach to design an optimal multilayer perceptron

  • Priti Bansal
  • Shakshi Gupta
  • Sumit Kumar
  • Shubham Sharma
  • Shreshth Sharma
Methodologies and Application

Abstract

Designing an artificial neural network (ANN) is a complex task, as its performance is highly dependent on the network architecture as well as on the training algorithm used to select proper synaptic weights and biases. Choosing an optimal design leads to greater accuracy when the ANN is used for classification. In this paper, we propose an approach, multilayer perceptron–lion optimization algorithm (MLP-LOA), that uses the lion optimization algorithm (LOA) to find an optimal multilayer perceptron (MLP) architecture for a given classification problem. MLP-LOA uses back-propagation (BP) for training during the optimization process. It also optimizes the learning rate and momentum, as these play a significant role when training an MLP using BP. LOA is a population-based metaheuristic algorithm inspired by the lifestyle of lions and their cooperative behavior. Unlike other metaheuristics, LOA uses different strategies to search for the optimal solution, performs a strong local search, and helps the search escape from worst solutions. A new fitness function is proposed to evaluate an MLP based on its generalization ability as well as the network's complexity; this discourages dense architectures, which increase the chances of overfitting. The proposed approach is tested on different classification problems selected from the University of California Irvine (UCI) repository and compared with existing state-of-the-art techniques in terms of the accuracy achieved during the testing phase. Experimental results show that MLP-LOA performs better than the existing state-of-the-art techniques.
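To make the idea concrete, the following is a minimal Python sketch of this kind of fitness evaluation, not the paper's actual method: it assumes a candidate encoding of (hidden-layer sizes, learning rate, momentum), uses scikit-learn's MLPClassifier with the sgd solver as a stand-in for BP training, and scores a candidate as validation accuracy minus a penalty on the hidden-unit count. The function name mlp_fitness, the penalty weight alpha, and the complexity measure are illustrative assumptions; the paper's exact fitness formulation is not given in the abstract.

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

def mlp_fitness(candidate, X_train, y_train, X_val, y_val, alpha=0.01):
    # candidate: (hidden_layer_sizes, learning_rate, momentum), as a
    # metaheuristic such as LOA might encode it (assumed encoding).
    hidden, lr, momentum = candidate
    clf = MLPClassifier(hidden_layer_sizes=hidden,
                        solver="sgd",            # gradient-descent (BP-style) training
                        learning_rate_init=lr,
                        momentum=momentum,
                        max_iter=500,
                        random_state=0)
    clf.fit(X_train, y_train)
    accuracy = clf.score(X_val, y_val)           # generalization estimate
    complexity = sum(hidden)                     # illustrative complexity measure
    return accuracy - alpha * complexity         # higher fitness is better

X, y = load_iris(return_X_y=True)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Two hypothetical candidates a metaheuristic search might propose:
for cand in [((10,), 0.1, 0.9), ((50, 50), 0.01, 0.8)]:
    print(cand, "->", round(mlp_fitness(cand, X_tr, y_tr, X_val, y_val), 4))

A metaheuristic such as LOA would repeatedly call an evaluation of this shape, keeping candidates that balance accuracy against architecture size.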

Keywords

Multilayer perceptron · Lion optimization algorithm · Classification · Back-propagation

Notes

Compliance with ethical standards

Conflict of interest

Priti Bansal, Shakshi Gupta, Sumit Kumar, Shubham Sharma and Shreshth Sharma declare that they have no conflict of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.


Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2019

Authors and Affiliations

  1. Department of Information Technology, Netaji Subhas Institute of Technology, Dwarka, New Delhi, India
  2. Department of Information Technology, KIET Group of Institutions, Ghaziabad, India
