
Efficient Two-Stage Evolutionary Search of Convolutional Neural Architectures Based on Cell Independence Analysis

  • Conference paper
  • Neural Information Processing (ICONIP 2021)
  • Part of the book series: Communications in Computer and Information Science (CCIS, volume 1516)

Abstract

In the literature, cell-based neural architecture search (NAS) has achieved efficient NAS performance by decomposing the general search space and focusing the search on the micro-architecture. Recently, it has attracted much attention and achieved considerable success in the design of deep convolutional neural networks (CNNs) for vision-oriented tasks such as image recognition and object detection. However, in most heuristic cell-based NAS methods, the joint optimization of normal cells and reduction cells leads to an extremely time-consuming search. Taking this cue, in this paper, we present a preliminary study investigating the independence between different cells towards efficient cell-based NAS design. Based on this investigation, we further propose a two-stage search paradigm for cell-based NAS, which can be easily integrated into existing heuristic search methods. To validate the efficacy of the proposed approach, an empirical study has been conducted on the CIFAR-10 dataset, using a genetic algorithm as the basic heuristic solver.
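The two-stage paradigm described above can be sketched in code. The following is a minimal toy illustration, not the authors' implementation: `toy_fitness` is a hypothetical stand-in for validation accuracy after training a candidate CNN, and the cell encoding (a flat list of operation indices) is a deliberate simplification of real cell-based search spaces. The key point it shows is the decomposition: stage 1 evolves the normal cell with the reduction cell held fixed, then stage 2 freezes the best normal cell and evolves the reduction cell, so the GA never searches the joint space.

```python
import random

random.seed(0)

# Candidate operations, as in typical cell-based NAS search spaces.
OPS = ["sep_conv_3x3", "sep_conv_5x5", "max_pool_3x3", "skip_connect"]
CELL_LEN = 4  # toy encoding: four operation slots per cell


def random_cell():
    """A cell is encoded as a list of operation indices."""
    return [random.randrange(len(OPS)) for _ in range(CELL_LEN)]


def toy_fitness(normal, reduction):
    """Hypothetical stand-in for validation accuracy of the assembled CNN:
    rewards closeness to a fixed 'good' architecture."""
    target_n, target_r = [0, 1, 3, 0], [2, 2, 0, 1]
    match = sum(a == b for a, b in zip(normal, target_n))
    match += sum(a == b for a, b in zip(reduction, target_r))
    return match / (2 * CELL_LEN)


def evolve(fitness, pop_size=20, generations=30, mut_rate=0.3):
    """A basic elitist genetic algorithm over one cell's encoding."""
    pop = [random_cell() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]  # elitist truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = random.sample(survivors, 2)
            cut = random.randrange(1, CELL_LEN)
            child = p1[:cut] + p2[cut:]  # one-point crossover
            if random.random() < mut_rate:
                child[random.randrange(CELL_LEN)] = random.randrange(len(OPS))
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)


# Stage 1: evolve the normal cell while the reduction cell stays fixed.
fixed_reduction = random_cell()
best_normal = evolve(lambda n: toy_fitness(n, fixed_reduction))

# Stage 2: freeze the best normal cell and evolve the reduction cell.
best_reduction = evolve(lambda r: toy_fitness(best_normal, r))
```

Each stage searches only one cell's space, so the number of architectures evaluated grows with the sum, not the product, of the two cells' search-space sizes; this is the efficiency argument that cell-independence is meant to justify.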


Notes

  1. The GPUs used throughout the experiments were NVIDIA RTX 2080Ti.



Acknowledgement

This work is partially supported by the Alibaba Group through the Alibaba Innovative Research Program under Grant No. H20210412 and the National Natural Science Foundation of China (NSFC) under Grant No. 61876025.

Author information

Corresponding author: Liang Feng.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Hou, B., Dong, J., Feng, L., Qiu, M. (2021). Efficient Two-Stage Evolutionary Search of Convolutional Neural Architectures Based on Cell Independence Analysis. In: Mantoro, T., Lee, M., Ayu, M.A., Wong, K.W., Hidayanto, A.N. (eds) Neural Information Processing. ICONIP 2021. Communications in Computer and Information Science, vol 1516. Springer, Cham. https://doi.org/10.1007/978-3-030-92307-5_70


  • DOI: https://doi.org/10.1007/978-3-030-92307-5_70

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-92306-8

  • Online ISBN: 978-3-030-92307-5

  • eBook Packages: Computer Science, Computer Science (R0)
