Restricted Echo State Networks

  • Aaron Stockdill
  • Kourosh Neshatian
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9992)

Abstract

Echo state networks are a powerful type of reservoir neural network, but in their original formulation the reservoir is essentially unrestricted. Motivated by limitations in neuromorphic hardware, we remove combinations of the four sources of memory (leaking, loops, cycles, and discrete time) to determine how each influences the suitability of the reservoir. We show that loops and cycles can replicate each other, while discrete time is a necessity. The potential limitation of energy conservation is equivalent to limiting the spectral radius.
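The mechanisms the abstract names (leaking, discrete time steps, and limiting the spectral radius) can be illustrated with a minimal echo state network update. This is a generic NumPy sketch, not the authors' implementation; the reservoir size, leak rate, and target spectral radius below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes, not values from the paper.
n_inputs, n_reservoir = 1, 50

# Random input and reservoir (recurrent) weight matrices.
W = rng.standard_normal((n_reservoir, n_reservoir))
W_in = rng.standard_normal((n_reservoir, n_inputs))

# A standard way to enforce the echo state property: rescale the
# reservoir matrix so its spectral radius falls below 1.
rho = max(abs(np.linalg.eigvals(W)))
W *= 0.9 / rho

def update(x, u, alpha=0.3):
    """One leaky, discrete-time reservoir step.

    x: current reservoir state, u: input, alpha: leak rate.
    alpha = 1 would remove leaking entirely."""
    return (1 - alpha) * x + alpha * np.tanh(W @ x + W_in @ u)

# Drive the reservoir with a short random input sequence.
x = np.zeros(n_reservoir)
for u in rng.standard_normal((10, n_inputs)):
    x = update(x, u)
```

Removing the memory sources the paper studies corresponds to simple changes here: setting `alpha = 1` removes leaking, zeroing the diagonal of `W` removes self-loops, and making `W` strictly triangular removes cycles.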

Keywords

Spectral Radius · Extended Kalman Filter · Hardware Implementation · Recurrent Neural Network · Discrete Time Step


Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  1. Department of Computer Science and Software Engineering, University of Canterbury, Christchurch, New Zealand