
Evaluating Acceleration Techniques for Genetic Neural Architecture Search

  • Conference paper
Engineering Applications of Neural Networks (EANN 2022)

Abstract

The increase in available data and computational power has driven the rapid evolution of deep learning in recent years. However, the success of deep learning methods depends on appropriate neural architecture choices, which are not straightforward and usually require a time-consuming trial-and-error procedure. Neural architecture search automates the design of neural network architectures capable of performing well on specific tasks; the field emerged to address the problem of designing efficient architectures and is gaining popularity as the rapid evolution of deep learning increases the need for high-performing architectures. This paper focuses on evolutionary neural architecture search, an effective but time-consuming and computationally expensive approach, and aims to pave the way for speeding up such algorithms by assessing the effect of acceleration methods both on the overall performance of the search procedure and on the produced architectures.
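As a rough illustration of the kind of procedure the abstract describes, the sketch below runs an evolutionary search over toy architecture encodings, replacing full network training with a cheap proxy score as one possible acceleration (fitness approximation). All names, the search space, and the proxy formula are illustrative assumptions, not the paper's actual method or experimental setup:

```python
import random

random.seed(0)

# Toy search space: an "architecture" is a list of hidden-layer widths.
WIDTHS = [16, 32, 64, 128]
MAX_DEPTH = 4

def random_architecture():
    depth = random.randint(1, MAX_DEPTH)
    return [random.choice(WIDTHS) for _ in range(depth)]

def proxy_fitness(arch):
    # Stand-in for full training: a cheap score rewarding capacity while
    # penalising parameter count, mimicking a fitness-approximation step.
    capacity = sum(arch)
    params = sum(a * b for a, b in zip([8] + arch, arch))
    return capacity / (1 + 0.001 * params)

def mutate(arch):
    # Resample one layer width to produce a child architecture.
    child = list(arch)
    child[random.randrange(len(child))] = random.choice(WIDTHS)
    return child

def evolve(generations=20, pop_size=10):
    population = [random_architecture() for _ in range(pop_size)]
    for _ in range(generations):
        # Tournament selection, then mutate the winner.
        parent = max(random.sample(population, 3), key=proxy_fitness)
        population.append(mutate(parent))
        population.pop(0)  # age-based removal of the oldest individual
    return max(population, key=proxy_fitness)

print(evolve())
```

Because `proxy_fitness` is evaluated in microseconds rather than GPU-hours, the loop makes the trade-off the paper studies concrete: acceleration methods speed up the search, but the quality of the resulting architectures then depends on how faithful the cheap estimate is to true trained performance.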



Acknowledgements

This paper is a result of research conducted within the "MSc in Artificial Intelligence and Data Analytics" of the Department of Applied Informatics of the University of Macedonia. The presentation of the paper is funded by the University of Macedonia Research Committee.

Author information

Corresponding author

Correspondence to Foteini Dervisi.


Copyright information

© 2022 Springer Nature Switzerland AG

About this paper


Cite this paper

Dervisi, F., Kyriakides, G., Margaritis, K. (2022). Evaluating Acceleration Techniques for Genetic Neural Architecture Search. In: Iliadis, L., Jayne, C., Tefas, A., Pimenidis, E. (eds) Engineering Applications of Neural Networks. EANN 2022. Communications in Computer and Information Science, vol 1600. Springer, Cham. https://doi.org/10.1007/978-3-031-08223-8_1


  • DOI: https://doi.org/10.1007/978-3-031-08223-8_1


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-08222-1

  • Online ISBN: 978-3-031-08223-8

  • eBook Packages: Computer Science, Computer Science (R0)
