Abstract
The increase in available data and computational power has driven the rapid evolution of deep learning in recent years. However, the success of deep learning methods depends on appropriate neural architecture choices, which are not straightforward to make and usually require a time-consuming trial-and-error procedure. Neural architecture search automates the design of neural network architectures capable of performing well on specific tasks, and the field has gained popularity as the growth of deep learning has increased the need to discover high-performing architectures. This paper focuses on evolutionary neural architecture search, an effective but time-consuming and computationally expensive approach, and aims to pave the way for speeding up such algorithms by assessing the effect of acceleration methods both on the overall performance of the search procedure and on the architectures it produces.
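To make the idea concrete, the following is a minimal, self-contained sketch of one common evolutionary NAS scheme (aging/regularized evolution in the style of Real et al.) combined with one illustrative acceleration method: a cheap surrogate fitness that gates the expensive full evaluation. The toy search space, the `true_fitness`/`surrogate_fitness` functions, and all parameter names are invented for illustration and do not come from the paper itself.

```python
import random

# Toy search space: an "architecture" is a list of 6 operation choices.
OPS = ["conv3x3", "conv1x1", "maxpool"]

def true_fitness(arch):
    # Stand-in for the expensive step (full training + validation accuracy).
    return sum(op == "conv3x3" for op in arch) / len(arch)

def surrogate_fitness(arch):
    # Cheap proxy score (e.g. a learned predictor or a zero-cost metric);
    # here simply a noisy version of the true fitness, for illustration.
    return true_fitness(arch) + random.uniform(-0.1, 0.1)

def mutate(arch, rng):
    child = list(arch)
    i = rng.randrange(len(child))
    child[i] = rng.choice(OPS)
    return child

def aging_evolution(cycles=200, pop_size=20, sample_size=5, seed=0):
    rng = random.Random(seed)
    random.seed(seed)  # fixes the surrogate noise for reproducibility
    population = [[rng.choice(OPS) for _ in range(6)] for _ in range(pop_size)]
    population = [(arch, true_fitness(arch)) for arch in population]
    history = list(population)
    for _ in range(cycles):
        sample = rng.sample(population, sample_size)
        parent = max(sample, key=lambda p: p[1])[0]
        child = mutate(parent, rng)
        # Acceleration: score the child with the cheap surrogate first and
        # only pay for the full evaluation if it looks promising.
        proxy = surrogate_fitness(child)
        fit = true_fitness(child) if proxy >= min(p[1] for p in sample) else proxy
        population.append((child, fit))
        population.pop(0)  # aging: discard the oldest member, not the worst
        history.append((child, fit))
    return max(history, key=lambda p: p[1])

best_arch, best_fit = aging_evolution()
print(best_arch, round(best_fit, 3))
```

The gating step is the key trade-off the paper studies: skipping full evaluations saves time, but surrogate error can steer the search toward worse final architectures.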
Acknowledgements
This paper is a result of research conducted within the "MSc in Artificial Intelligence and Data Analytics" of the Department of Applied Informatics of the University of Macedonia. The presentation of the paper is funded by the University of Macedonia Research Committee.
Copyright information
© 2022 Springer Nature Switzerland AG
About this paper
Cite this paper
Dervisi, F., Kyriakides, G., Margaritis, K. (2022). Evaluating Acceleration Techniques for Genetic Neural Architecture Search. In: Iliadis, L., Jayne, C., Tefas, A., Pimenidis, E. (eds) Engineering Applications of Neural Networks. EANN 2022. Communications in Computer and Information Science, vol 1600. Springer, Cham. https://doi.org/10.1007/978-3-031-08223-8_1
DOI: https://doi.org/10.1007/978-3-031-08223-8_1
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-08222-1
Online ISBN: 978-3-031-08223-8
eBook Packages: Computer Science (R0)