
Convolutional Neural Networks Hyperparameters Tuning


Abstract

Digital images have revolutionized work in numerous scientific fields such as healthcare, astronomy, biology, and agriculture, as well as in everyday life. One of the frequent tasks in applications with digital images is image classification, which is a very challenging task. Major progress was made when convolutional neural networks (CNNs) were introduced, and their use has produced significant improvements in applications that require image classification. With today's technology it is relatively simple to implement and use CNNs, but in order to obtain the best possible results it is necessary to find the optimal architecture and hyperparameters for every single task. Due to the large number of hyperparameters, it is difficult to find the optimal configuration, and there is no deterministic way to do it. At this early stage of CNN development, the common method of tuning CNNs is by guessing and estimating, informally known as guesstimating. Since this is a hard optimization problem, it is a natural candidate for optimization metaheuristics, and several studies have applied different optimization methods to tuning CNN hyperparameters. One promising approach is the application of swarm intelligence algorithms. In this paper, a brief review of CNN hyperparameter tuning is presented and discussed.
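To make the swarm intelligence approach concrete, the listing below is a minimal sketch (not taken from this chapter) of how one widely used swarm algorithm, particle swarm optimization, could tune two common CNN hyperparameters: the learning rate (searched on a log scale) and the dropout probability. The objective function is a hypothetical placeholder; in an actual experiment it would train a CNN with the candidate hyperparameters and return its validation error. The search bounds, swarm size, and PSO coefficients shown here are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Search space: [log10(learning rate), dropout probability]
LOW = np.array([-5.0, 0.0])
HIGH = np.array([-1.0, 0.7])

def objective(x):
    # Hypothetical stand-in for the validation error of a CNN trained with
    # hyperparameters x; a real run would build, train, and evaluate the network here.
    # This synthetic surrogate has its minimum near lr = 1e-3, dropout = 0.3.
    log_lr, dropout = x
    return (log_lr + 3.0) ** 2 + 4.0 * (dropout - 0.3) ** 2

def pso(n_particles=20, n_iters=50, w=0.729, c1=1.49445, c2=1.49445):
    dim = LOW.size
    pos = rng.uniform(LOW, HIGH, size=(n_particles, dim))   # random initial positions
    vel = np.zeros_like(pos)                                 # zero initial velocities
    pbest, pbest_val = pos.copy(), np.array([objective(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    gbest_val = pbest_val.min()

    for _ in range(n_iters):
        r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
        # Velocity update: inertia + cognitive (personal best) + social (global best) terms
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, LOW, HIGH)                  # keep particles inside the bounds
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        if pbest_val.min() < gbest_val:
            gbest_val = pbest_val.min()
            gbest = pbest[pbest_val.argmin()].copy()
    return gbest, gbest_val

best, best_val = pso()
print(f"best log10(lr) = {best[0]:.3f}, best dropout = {best[1]:.3f}, surrogate error = {best_val:.4f}")

With real CNN training inside the objective, each evaluation is expensive, so swarm sizes and iteration counts are kept small in practice, which is one reason the choice and tuning of the metaheuristic itself matters.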



Author information

Corresponding author

Correspondence to Milan Tuba.



Copyright information

© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter


Cite this chapter

Tuba, E., Bačanin, N., Strumberger, I., Tuba, M. (2021). Convolutional Neural Networks Hyperparameters Tuning. In: Pap, E. (eds) Artificial Intelligence: Theory and Applications. Studies in Computational Intelligence, vol 973. Springer, Cham. https://doi.org/10.1007/978-3-030-72711-6_4
