A survey on LSTM memristive neural network architectures and applications

  • Review
  • Published in: The European Physical Journal Special Topics

Abstract

Recurrent neural networks (RNNs) have proven to be an effective tool for approximating dynamic systems that deal with time- and order-dependent data such as video and audio. Long short-term memory (LSTM) is a recurrent neural network with a state memory and a multi-gate cell structure. Hardware acceleration of LSTM using memristor circuits is an emerging topic of study. In this work, we review the history of and motivation behind the LSTM neural network. We provide a tutorial survey of existing LSTM methods and highlight recent developments in memristive LSTM architectures.
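The "state memory and multi-gate cell structure" mentioned above refers to the standard LSTM cell update: forget, input, and output gates modulate a persistent cell state. As a minimal illustration (not the paper's own implementation, and with hypothetical variable names and a single stacked weight layout), one step of a standard LSTM cell can be sketched as:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One step of a standard LSTM cell with a forget gate.

    x: input (D,), h_prev/c_prev: previous hidden/cell state (H,).
    W: (4H, D), U: (4H, H), b: (4H,) hold the four gates' weights stacked.
    """
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b      # pre-activations for all four gates
    f = sigmoid(z[0:H])             # forget gate: what to keep of c_prev
    i = sigmoid(z[H:2*H])           # input gate: how much new info to write
    o = sigmoid(z[2*H:3*H])         # output gate: how much state to expose
    g = np.tanh(z[3*H:4*H])         # candidate cell-state update
    c = f * c_prev + i * g          # state memory update
    h = o * np.tanh(c)              # hidden output
    return h, c

# Toy usage with random weights (illustrative only)
rng = np.random.default_rng(0)
D, H = 3, 4
h, c = lstm_step(rng.standard_normal(D),
                 np.zeros(H), np.zeros(H),
                 rng.standard_normal((4*H, D)),
                 rng.standard_normal((4*H, H)),
                 np.zeros(4*H))
```

In memristive realizations surveyed here, the matrix–vector products `W @ x` and `U @ h_prev` are the parts typically mapped onto memristor crossbar arrays.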



Author information

Corresponding author

Correspondence to Alex Pappachen James.

Cite this article

Smagulova, K., James, A.P. A survey on LSTM memristive neural network architectures and applications. Eur. Phys. J. Spec. Top. 228, 2313–2324 (2019). https://doi.org/10.1140/epjst/e2019-900046-x
