Comparison of Neural Network Optimizers for Relative Ranking Retention Between Neural Architectures

  • George Kyriakides
  • Konstantinos Margaritis
Conference paper
Part of the IFIP Advances in Information and Communication Technology book series (IFIPAICT, volume 559)


Autonomous design and optimization of neural networks is attracting increasing attention from the research community. The main barrier is the computational cost of conducting experimental and production projects. Although most researchers focus on new design methodologies, the dominant computational cost remains the evaluation of candidate architectures. In this paper we investigate the feasibility of reduced-epoch training by measuring the rank correlation coefficients between sets of optimizers, given a fixed number of training epochs. We find ranking correlations above 0.75, and as high as 0.964, between Adam with 50 training epochs, stochastic gradient descent with Nesterov momentum with 10 training epochs, and Adam with 20 training epochs. Moreover, we show the ability of genetic algorithms to find high-quality solutions to a function by searching in a perturbed search space, provided that certain correlation criteria are met.
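The rank-correlation measurement described above can be sketched as follows. This is a minimal illustration, not the authors' code: the accuracy values are hypothetical, and Kendall's tau (the coefficient cited in the paper's references) is computed with SciPy over the same candidate architectures evaluated under a full and a reduced training budget.

```python
# Hypothetical validation accuracies for the same five candidate
# architectures under two evaluation budgets (values are made up).
from scipy.stats import kendalltau

acc_full = [0.91, 0.88, 0.93, 0.85, 0.90]     # e.g. Adam, 50 epochs
acc_reduced = [0.74, 0.70, 0.78, 0.66, 0.73]  # e.g. SGD + Nesterov, 10 epochs

# Kendall's tau compares the relative ordering of the two score lists;
# tau = 1.0 means reduced training preserved the ranking exactly.
tau, p_value = kendalltau(acc_full, acc_reduced)
print(tau)
```

A tau close to 1 would suggest that the cheaper budget can stand in for full training when only the relative ranking of architectures matters, which is the feasibility question the paper investigates.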


Keywords: Deep learning · Neural architecture search · Ranking



This work was supported by computational time granted from the Greek Research & Technology Network (GRNET) in the National HPC facility - ARIS - under project ID DNAD. Furthermore, this research is funded by the University of Macedonia Research Committee as part of the “Principal Research 2019” funding program.



Copyright information

© IFIP International Federation for Information Processing 2019

Authors and Affiliations

  1. University of Macedonia, Thessaloniki, Greece
