Adaptive Critical Reservoirs with Power Law Forgetting of Unexpected Input Sequences

  • Norbert Michael Mayer
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8681)


The echo-state condition defines an upper limit for the hidden-layer connectivity strength in recurrent neural networks. If the network is below this limit, there is an injective, continuous mapping from the recent input history to the internal state of the network. Above it, the network becomes chaotic, and the dependence on the initial state may never be washed out. I focus on the biological relevance of echo state networks with a critical connectivity strength at the separating line between these two regimes and discuss related biological findings; in particular, there is evidence that the neural connectivity in cortical slices is tuned to a critical level. In addition, I propose a model that makes use of a special learning mechanism within the recurrent layer and the input connectivity. Results show that, after adaptation, traces of single unexpected events indeed persist in the network for longer than exponential decay would allow.
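The separation line the abstract describes can be illustrated with a minimal reservoir sketch. The snippet below (a generic echo state network in NumPy, not the author's model; the reservoir size, input signal, and tanh update are illustrative assumptions) rescales a random recurrent weight matrix so its spectral radius sits exactly at the critical value 1.0, and then drives two different initial states with the same input to show the state-contraction idea behind the echo-state condition:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100  # reservoir size (illustrative choice)

# Random recurrent weight matrix, rescaled so its spectral radius
# equals 1.0 -- the separation line between the echo-state regime
# (radius < 1) and the chaotic regime (radius > 1).
W = rng.standard_normal((n, n)) / np.sqrt(n)
W *= 1.0 / max(abs(np.linalg.eigvals(W)))

W_in = rng.standard_normal((n, 1))  # input weights

def step(x, u):
    """One reservoir update with a tanh nonlinearity."""
    return np.tanh(W @ x + W_in @ u)

# Drive two different initial states with the same input sequence;
# below the critical radius, the distance between the two state
# trajectories is washed out by the input.
x1, x2 = rng.standard_normal(n), np.zeros(n)
for t in range(200):
    u = np.array([np.sin(0.1 * t)])
    x1, x2 = step(x1, u), step(x2, u)

spectral_radius = max(abs(np.linalg.eigvals(W)))
print(spectral_radius)
```

At exactly the critical radius, contraction is marginal rather than strict, which is what allows memory traces to decay more slowly than exponentially, as studied in the paper.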





Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Norbert Michael Mayer
  1. Dept. of Electrical Engineering and Advanced Institute of Manufacturing with High-tech Innovations (AIM-HI), Nat'l. Chung Cheng University, Chia-Yi, Taiwan
