
Multi-layer Perceptron Training Using Hybridized Bat Algorithm

  • Conference paper
Computational Vision and Bio-Inspired Computing

Abstract

Neural networks (NNs) are a subset of machine learning (ML) methods that enable a machine to learn and make predictions based on previous experience and provided data. Notably, this behavior does not need to be explicitly programmed: the algorithm "self-adjusts", evaluating its own performance and tuning its parameters to improve accuracy. Neural networks differ from other machine learning algorithms in that, rather than relying on explicit statistical and mathematical models to make predictions, they loosely replicate the structure and processes of the human brain. However, this type of learning is computationally expensive, since the network can occupy an enormous number of states and configurations. Training a neural network can therefore be related to the class of NP-complete problems because of the large search space of possible solutions. Swarm intelligence (SI) algorithms can help reduce this search space by finding a solution that, while not optimal, is close to optimal, and they provide satisfactory results in far less time than training a network without them would take. In this paper, the authors propose a solution to the stated problem based on a hybridized bat algorithm.
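The idea described in the abstract can be illustrated with a minimal sketch: a standard bat algorithm (not the authors' hybridized variant, whose details are in the full paper) searching the flattened weight vector of a one-hidden-layer perceptron so as to minimize mean squared error. All names, parameter values (population size, frequency range, loudness `A`, pulse rate `r`), and the XOR toy task are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def mlp_forward(weights, X, n_in, n_hidden):
    """Forward pass of a 1-hidden-layer MLP; the flat weight vector is unpacked
    into W1 (n_in x n_hidden), b1, W2 (n_hidden,), b2."""
    w1_end = n_in * n_hidden
    W1 = weights[:w1_end].reshape(n_in, n_hidden)
    b1 = weights[w1_end:w1_end + n_hidden]
    W2 = weights[w1_end + n_hidden:w1_end + 2 * n_hidden]
    b2 = weights[-1]
    h = np.tanh(X @ W1 + b1)                      # hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output

def mse(weights, X, y, n_in, n_hidden):
    """Fitness of one candidate weight vector: mean squared error."""
    return np.mean((mlp_forward(weights, X, n_in, n_hidden) - y) ** 2)

def bat_train(X, y, n_in, n_hidden, pop=20, iters=200,
              fmin=0.0, fmax=2.0, A=0.9, r=0.5):
    """Basic bat algorithm: each bat is a candidate weight vector."""
    dim = n_in * n_hidden + n_hidden + n_hidden + 1
    bats = rng.uniform(-1.0, 1.0, (pop, dim))     # positions = weight vectors
    vel = np.zeros((pop, dim))
    fit = np.array([mse(b, X, y, n_in, n_hidden) for b in bats])
    best_i = fit.argmin()
    best, best_fit = bats[best_i].copy(), fit[best_i]
    for _ in range(iters):
        for i in range(pop):
            freq = fmin + (fmax - fmin) * rng.random()   # random pulse frequency
            vel[i] += (bats[i] - best) * freq            # move toward global best
            cand = bats[i] + vel[i]
            if rng.random() > r:                         # local random walk near best
                cand = best + 0.01 * rng.normal(size=dim)
            f_new = mse(cand, X, y, n_in, n_hidden)
            if f_new < fit[i] and rng.random() < A:      # accept improving moves
                bats[i], fit[i] = cand, f_new
            if f_new < best_fit:                         # keep global best monotone
                best, best_fit = cand.copy(), f_new
    return best, best_fit
```

Because the search operates directly on the flat weight vector, it needs no gradients at all, which is what lets metaheuristics like this sidestep backpropagation; the trade-off is many full fitness evaluations per iteration.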



Corresponding author

Correspondence to Luka Gajic.


Copyright information

© 2021 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.


Cite this paper

Gajic, L., Cvetnic, D., Zivkovic, M., Bezdan, T., Bacanin, N., Milosevic, S. (2021). Multi-layer Perceptron Training Using Hybridized Bat Algorithm. In: Smys, S., Tavares, J.M.R.S., Bestak, R., Shi, F. (eds) Computational Vision and Bio-Inspired Computing. Advances in Intelligent Systems and Computing, vol 1318. Springer, Singapore. https://doi.org/10.1007/978-981-33-6862-0_54
