
Toward Real-Time Federated Evolutionary Neural Architecture Search

Automated Design of Machine Learning and Search Algorithms

Part of the book series: Natural Computing Series ((NCS))

Abstract

Neural architecture search (NAS) is an important research topic in automated machine learning that aims to automatically search for neural network architectures that can learn efficiently for a given task. There is also an increasing demand for computationally efficient NAS systems that can be deployed on edge devices with limited computational power. In particular, federated learning is an online distributed machine learning scheme that requires online, federated search of neural architectures involving edge devices. However, most existing NAS methods are not well suited for distributed real-time systems because of their high computational and communication costs. This chapter provides a brief introduction to methods for reducing the computational cost of NAS, followed by a presentation of two evolutionary frameworks we recently developed for real-time federated NAS.
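The federated learning scheme the abstract refers to aggregates models trained locally on edge devices into a single global model. A minimal sketch of the standard federated averaging (FedAvg) step is shown below; the function name, flat-list weight representation, and sample data are illustrative assumptions, not the chapter's own frameworks.

```python
# Minimal sketch of federated averaging (FedAvg): the server combines
# client models by weighting each one by its local dataset size.
# Weights are represented as flat lists of floats for simplicity.

def fedavg(client_updates):
    """Aggregate client models, weighting by local sample count.

    client_updates: list of (weights, n_samples) pairs, where each
    weights list has the same length across clients.
    """
    total = sum(n for _, n in client_updates)
    dim = len(client_updates[0][0])
    global_weights = [0.0] * dim
    for weights, n in client_updates:
        for i, w in enumerate(weights):
            global_weights[i] += w * n / total
    return global_weights

# Two clients with different amounts of local data:
updates = [([1.0, 2.0], 30), ([3.0, 4.0], 10)]
print(fedavg(updates))  # [1.5, 2.5]
```

In a federated NAS setting, this aggregation runs every communication round, which is why search methods that require many candidate evaluations per round become prohibitively expensive on edge devices.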



Author information

Correspondence to Yaochu Jin.



Copyright information

© 2021 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Zhu, H., Jin, Y. (2021). Toward Real-Time Federated Evolutionary Neural Architecture Search. In: Pillay, N., Qu, R. (eds) Automated Design of Machine Learning and Search Algorithms. Natural Computing Series. Springer, Cham. https://doi.org/10.1007/978-3-030-72069-8_8


  • DOI: https://doi.org/10.1007/978-3-030-72069-8_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-72068-1

  • Online ISBN: 978-3-030-72069-8

  • eBook Packages: Computer Science (R0)
