
Parallel/Distributed Intelligent Hyperparameters Search for Generative Artificial Neural Networks

Conference paper
High Performance Computing (ISC High Performance 2021)

Abstract

This article presents a parallel/distributed methodology for the intelligent search of hyperparameter configurations for generative adversarial neural networks (GANs). Finding the configuration that best fits a GAN to a specific problem is challenging because GANs simultaneously train two deep neural networks; as a result, GANs generally have more configuration parameters than other deep learning methods. The proposed system applies the iterated racing approach, taking advantage of parallel/distributed computing to use the available resources efficiently during configuration. The main results of the experimental evaluation on the MNIST dataset showed that the parallel system efficiently uses the GPU, achieving a high level of parallelism and reducing the wall-clock computation time by 78%, while providing results comparable to the sequential hyperparameter search.
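The iterated racing idea described above can be sketched in a few lines: a pool of candidate configurations is evaluated in parallel on successive problem instances, and after each round the statistically worse candidates are discarded, so that the evaluation budget concentrates on promising configurations. The sketch below is illustrative only, not the authors' implementation (which builds on the irace package): the `evaluate` function is a hypothetical stand-in for training a GAN with a given configuration and returning a quality score, and the mean-based elimination rule is a simplification of irace's statistical tests.

```python
import random
from concurrent.futures import ThreadPoolExecutor


def evaluate(config, seed):
    """Hypothetical stand-in for training a GAN with `config` on one
    instance (seed) and returning a fitness score (lower is better).
    Here the score is a synthetic function of learning rate and batch
    size, plus deterministic pseudo-random noise."""
    lr, batch_size = config
    rng = random.Random(hash((seed, config)))
    return abs(lr - 1e-3) * 1000 + abs(batch_size - 64) / 64 + rng.gauss(0, 0.1)


def race(configs, rounds=5, survivor_fraction=0.5, workers=4):
    """Racing loop: evaluate all surviving configurations in parallel
    on a fresh instance each round, then discard the worse half."""
    scores = {c: [] for c in configs}
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for seed in range(rounds):
            alive = list(scores)
            results = pool.map(lambda c: evaluate(c, seed), alive)
            for c, s in zip(alive, results):
                scores[c].append(s)
            # Rank by mean score so far and keep the better fraction.
            alive.sort(key=lambda c: sum(scores[c]) / len(scores[c]))
            keep = max(1, int(len(alive) * survivor_fraction))
            scores = {c: scores[c] for c in alive[:keep]}
    return min(scores, key=lambda c: sum(scores[c]) / len(scores[c]))


# Candidate (learning rate, batch size) configurations to race.
candidates = [(lr, bs) for lr in (1e-4, 1e-3, 1e-2) for bs in (32, 64, 128)]
best = race(candidates)
print(best)
```

Because each round's evaluations are independent, they parallelize naturally across workers (threads here; GPU-backed processes in a real GAN setting), which is the source of the wall-clock reduction reported in the abstract.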



Acknowledgements

This research was partially funded by the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 799078 and by the European Union H2020-ICT-2019-3.


Corresponding author

Correspondence to Jamal Toutouh.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Esteban, M., Toutouh, J., Nesmachnow, S. (2021). Parallel/Distributed Intelligent Hyperparameters Search for Generative Artificial Neural Networks. In: Jagode, H., Anzt, H., Ltaief, H., Luszczek, P. (eds) High Performance Computing. ISC High Performance 2021. Lecture Notes in Computer Science(), vol 12761. Springer, Cham. https://doi.org/10.1007/978-3-030-90539-2_20


  • DOI: https://doi.org/10.1007/978-3-030-90539-2_20


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-90538-5

  • Online ISBN: 978-3-030-90539-2

