A survey on LSTM memristive neural network architectures and applications

Abstract

Recurrent neural networks (RNNs) have proven to be an effective tool for approximating dynamical systems that deal with time- and order-dependent data such as video and audio. Long short-term memory (LSTM) is a recurrent neural network with a state memory and a multilayer cell structure. Hardware acceleration of LSTM using memristor circuits is an emerging topic of study. In this work, we look at the history of the LSTM neural network and the reasons it was developed. We provide a tutorial survey of existing LSTM methods and highlight recent developments in memristive LSTM architectures.
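As a quick orientation for the survey, below is a minimal NumPy sketch of one step of the standard LSTM cell (the common formulation with forget, input, and output gates). The function name lstm_step, the stacked [f, i, o, g] weight layout, and the toy dimensions are illustrative assumptions, not code from the surveyed works.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    # W: (4*n_hid, n_in) input weights, rows stacked as [f, i, o, g]
    # U: (4*n_hid, n_hid) recurrent weights, same stacking; b: (4*n_hid,)
    n = h_prev.shape[0]
    z = W @ x_t + U @ h_prev + b      # all four gate pre-activations at once
    f = sigmoid(z[0:n])               # forget gate: how much of c_prev to keep
    i = sigmoid(z[n:2*n])             # input gate: how much new content to write
    o = sigmoid(z[2*n:3*n])           # output gate: how much state to expose
    g = np.tanh(z[3*n:4*n])           # candidate cell update
    c_t = f * c_prev + i * g          # cell state: the "state memory"
    h_t = o * np.tanh(c_t)            # hidden state passed to the next step
    return h_t, c_t

# Toy usage: run a length-5 random sequence through a 3-unit cell.
rng = np.random.default_rng(0)
n_in, n_hid = 4, 3
W = 0.1 * rng.standard_normal((4 * n_hid, n_in))
U = 0.1 * rng.standard_normal((4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
for t in range(5):
    h, c = lstm_step(rng.standard_normal(n_in), h, c, W, U, b)

The additive cell-state update (c_t = f * c_prev + i * g) is what lets gradients propagate over long time lags, and the dense vector-matrix products (W @ x_t and U @ h_prev) dominate the computational cost; these products are the part that memristive crossbar implementations typically map onto analog hardware.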

Author information

Correspondence to Alex Pappachen James.

About this article

Cite this article

Smagulova, K., James, A.P. A survey on LSTM memristive neural network architectures and applications. Eur. Phys. J. Spec. Top. 228, 2313–2324 (2019). https://doi.org/10.1140/epjst/e2019-900046-x
