Restricted Echo State Networks

  • Aaron Stockdill
  • Kourosh Neshatian
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9992)


Echo state networks are a powerful type of reservoir neural network, but in the original formulation the reservoir is essentially unrestricted. Motivated by limitations in neuromorphic hardware, we remove combinations of the four sources of memory—leaking, loops, cycles, and discrete time—to determine how each influences the suitability of the reservoir. We show that loops and cycles can replicate each other, while discrete time is a necessity. We also show that a potential hardware limitation, energy conservation, is equivalent to limiting the spectral radius.
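The spectral-radius restriction mentioned above can be made concrete with a minimal echo state network sketch. This is an illustration in NumPy, not the authors' code; all names and parameter values are assumptions. It builds a random reservoir, rescales its weight matrix to a chosen spectral radius (the standard condition associated with the echo state property), and drives it with an input sequence, with a `leak` parameter that can disable the leaking source of memory.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n, spectral_radius=0.9):
    """Random reservoir matrix rescaled to the requested spectral radius."""
    W = rng.standard_normal((n, n))
    # Rescale so the largest eigenvalue modulus equals spectral_radius.
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
    return W

def run_reservoir(W, W_in, inputs, leak=1.0):
    """Drive the reservoir with a scalar input sequence.

    leak=1.0 disables leaking (plain state update); leak<1 blends in
    the previous state, one of the four memory sources discussed above.
    """
    x = np.zeros(W.shape[0])
    states = []
    for u in inputs:
        x = (1 - leak) * x + leak * np.tanh(W @ x + W_in * u)
        states.append(x.copy())
    return np.array(states)

W = make_reservoir(50, spectral_radius=0.9)
W_in = rng.standard_normal(50)
states = run_reservoir(W, W_in, np.sin(np.linspace(0, 4 * np.pi, 100)))
```

In a full ESN only a linear readout trained on `states` would follow; the reservoir weights themselves stay fixed, which is what makes restrictions on the reservoir's structure the interesting design question.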



Copyright information

© Springer International Publishing AG 2016

Open Access This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 2.5 International License, which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

Authors and Affiliations

  1. Department of Computer Science and Software Engineering, University of Canterbury, Christchurch, New Zealand
