FTR-NAS: Fault-Tolerant Recurrent Neural Architecture Search

  • Conference paper
Neural Information Processing (ICONIP 2020)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1333)

Abstract

With the growing popularity of neural-network applications on edge devices, robustness has become a focus of research. When such applications are deployed on hardware, environmental noise is unavoidable, and the resulting errors may cause applications to crash, which is especially dangerous for safety-critical applications. In this paper, we propose FTR-NAS, which optimizes recurrent neural architectures to enhance their fault tolerance. First, according to real deployment scenarios, we formalize computational faults and weight faults, which are simulated with the Multiply-Accumulate (MAC)-independent and identically distributed (i.i.d.) Bit-Bias (MiBB) model and the Stuck-at-Fault (SAF) model, respectively. Next, we establish a multi-objective NAS framework powered by these fault models to discover high-performance, fault-tolerant recurrent architectures. Moreover, we incorporate fault-tolerant training (FTT) into the search process to further enhance the fault tolerance of the discovered architectures. Experimentally, the C-FTT-RNN and W-FTT-RNN architectures discovered on the PTB dataset show promising tolerance to computational and weight faults, respectively. We further demonstrate the usefulness of the learned architectures by transferring them to the WT2 dataset.
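
For intuition, here is a minimal, hypothetical sketch of the two fault models named above (PyTorch-style; the function names mibb_inject and saf_inject, their signatures, and all default rates and bit widths are illustrative placeholders of ours, not the authors' implementation):

```python
import torch

def mibb_inject(mac_out: torch.Tensor, p_bit: float = 1e-4,
                n_bits: int = 8, frac_bits: int = 4) -> torch.Tensor:
    """MAC-i.i.d Bit-Bias (MiBB) sketch: each MAC output independently
    picks up a +/- 2^(k - frac_bits) bias on bit position k with
    probability p_bit (a placeholder rate, not a value from the paper)."""
    out = mac_out.clone()
    for k in range(n_bits):
        flips = (torch.rand_like(out) < p_bit).float()   # i.i.d. per element
        signs = torch.randint_like(out, 0, 2) * 2 - 1    # i.i.d. +/-1 per element
        out = out + flips * signs * 2.0 ** (k - frac_bits)
    return out

def saf_inject(weights: torch.Tensor, w_min: float, w_max: float,
               p_sa0: float = 0.05, p_sa1: float = 0.02) -> torch.Tensor:
    """Stuck-at-Fault (SAF) sketch: each weight cell is independently
    stuck at the lowest (SA0) or highest (SA1) representable value;
    p_sa0 and p_sa1 are placeholder defect rates."""
    r = torch.rand_like(weights)
    w = torch.where(r < p_sa0, torch.full_like(weights, w_min), weights)
    w = torch.where(r > 1.0 - p_sa1, torch.full_like(weights, w_max), w)
    return w

# Example: stress a recurrent weight matrix under both models.
w = torch.randn(256, 256)
w_faulty = saf_inject(w, w_min=float(w.min()), w_max=float(w.max()))
h = mibb_inject(torch.randn(32, 256) @ w_faulty.t())
```

In the paper's framework, such injections would be applied during architecture evaluation and fault-tolerant training; the sketch above only illustrates the two noise processes themselves.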

Notes

  1. \(\alpha_l\) is kept consistent in both the architecture search and the training process.
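
For context, one plausible reading of this coefficient (an assumption on our part, modeled on the fault-tolerant training objective of the CNN-oriented predecessor FTT-NAS, not a formula quoted from this paper) is that \(\alpha_l\) weights the loss computed under injected faults against the clean loss:

\[ \mathcal{L}_{\mathrm{FTT}} = (1 - \alpha_l)\,\mathcal{L}_{\mathrm{clean}} + \alpha_l\,\mathcal{L}_{\mathrm{fault}} \]

Under this reading, keeping \(\alpha_l\) fixed ensures that the search and the final training optimize the same trade-off between clean performance and fault tolerance.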

Acknowledgments

This work is funded by the National Key R&D Program of China [grant number 2018YFB2202603].

Author information

Corresponding author

Correspondence to Lei Wang.

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Hu, K., Ding, D., Tian, S., Gong, R., Luo, L., Wang, L. (2020). FTR-NAS: Fault-Tolerant Recurrent Neural Architecture Search. In: Yang, H., Pasupa, K., Leung, A.C.S., Kwok, J.T., Chan, J.H., King, I. (eds) Neural Information Processing. ICONIP 2020. Communications in Computer and Information Science, vol 1333. Springer, Cham. https://doi.org/10.1007/978-3-030-63823-8_67

  • DOI: https://doi.org/10.1007/978-3-030-63823-8_67

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-63822-1

  • Online ISBN: 978-3-030-63823-8

  • eBook Packages: Computer Science, Computer Science (R0)
