
KI - Künstliche Intelligenz, Volume 26, Issue 4, pp 365–371

Reservoir Computing Trends

  • Mantas Lukoševičius (corresponding author)
  • Herbert Jaeger
  • Benjamin Schrauwen
Technical Contribution

Abstract

Reservoir Computing (RC) is a paradigm for understanding and training Recurrent Neural Networks (RNNs) based on treating the recurrent part (the reservoir) differently from the readouts from it. It started ten years ago and is currently a prolific research area, yielding important insights into RNNs, providing practical machine learning tools, and enabling computation with non-conventional hardware. Here we give a brief introduction to the basic concepts, methods, insights, and current developments of RC, and highlight some of its applications.
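To make the reservoir/readout split concrete, the following is a minimal sketch of an Echo State Network in Python/NumPy on a toy delay task. It is an illustration under stated assumptions, not code from the cited papers: the network sizes, scalings, regularization constant, and the task itself are hypothetical choices. The reservoir weights are generated randomly and left untrained; only the linear readout is fitted, by ridge regression on the collected reservoir states.

    import numpy as np

    rng = np.random.default_rng(42)

    # Illustrative sizes (assumptions for this sketch)
    n_inputs, n_reservoir = 1, 200

    # Fixed random weights: in RC these are generated once and never trained.
    W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs + 1))   # +1 for a bias input
    W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
    W *= 0.9 / max(abs(np.linalg.eigvals(W)))                    # scale spectral radius below 1

    def run_reservoir(u):
        """Drive the tanh reservoir with the input sequence u (shape [T, n_inputs])
        and return all reservoir states (shape [T, n_reservoir])."""
        x = np.zeros(n_reservoir)
        states = []
        for u_t in u:
            x = np.tanh(W_in @ np.concatenate(([1.0], u_t)) + W @ x)
            states.append(x)
        return np.array(states)

    # Toy task (an assumption for illustration): reproduce the input delayed by 5 steps.
    T, delay = 1000, 5
    u = rng.uniform(-1.0, 1.0, (T, n_inputs))
    y_target = np.roll(u, delay, axis=0)

    X = run_reservoir(u)

    # Only the readout is trained, by ridge (Tikhonov-regularized) linear regression
    # on reservoir states collected after an initial washout period.
    washout, ridge = 100, 1e-6
    X_tr, Y_tr = X[washout:], y_target[washout:]
    W_out = Y_tr.T @ X_tr @ np.linalg.inv(X_tr.T @ X_tr + ridge * np.eye(n_reservoir))

    y_pred = X @ W_out.T
    print("training MSE after washout:", np.mean((y_pred[washout:] - y_target[washout:]) ** 2))

The point the sketch illustrates is that training reduces to a single linear regression on the reservoir states, which is what makes the RC approach simple and fast compared to training all recurrent weights.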

Keywords

Reservoir computing · Recurrent neural network · Echo state network

Acknowledgements

The authors acknowledge support by the European FP7 project ORGANIC (http://reservoir-computing.org/organic). Patent note. The basic ESN architecture and algorithm are protected for commercial use by international patents held by the Fraunhofer Society [18].

References

  1. Atiya AF, Parlos AG (2000) New results on recurrent network training: unifying the algorithms and accelerating convergence. IEEE Trans Neural Netw 11(3):697–709
  2. Bengio Y, Simard P, Frasconi P (1994) Learning long-term dependencies with gradient descent is difficult. IEEE Trans Neural Netw 5(2):157–166
  3. Bernacchia A, Seo H, Lee D, Wang XJ (2011) A reservoir of time constants for memory traces in cortical neurons. Nat Neurosci 14(3):366–372
  4. Buesing L, Bill J, Nessler B, Maass W (2011) Neural dynamics as sampling: a model for stochastic computation in recurrent networks of spiking neurons. PLoS Comput Biol 7(11):e1002211
  5. Bunke H, Varga T (2007) Off-line Roman cursive handwriting recognition. In: Chaudhuri BB (ed) Digital document processing, advances in pattern recognition. Springer, Berlin, pp 165–183
  6. Buonomano DV, Maass W (2009) State-dependent computations: spatiotemporal processing in cortical networks. Nat Rev Neurosci 10(2):113–125. http://www.ncbi.nlm.nih.gov/pubmed/19145235
  7. Buteneers P, Verstraeten D, van Mierlo P, Wyckhuys T, Stroobandt D, Raedt R, Hallez H, Schrauwen B (2011) Automatic detection of epileptic seizures on the intra-cranial electroencephalogram of rats using reservoir computing. Artif Intell Med 53(3):215–223
  8. Dominey PF (1995) Complex sensory-motor sequence learning based on recurrent state representation and reinforcement learning. Biol Cybern 73:265–274
  9. Dominey PF (2005) From sensorimotor sequence to grammatical construction: evidence from simulation and neurophysiology. Adapt Behav 13(4):347–361
  10. Dominey PF, Ramus F (2000) Neural network processing of natural language. I. Sensitivity to serial, temporal and abstract structure of language in the infant. Lang Cogn Processes 15(1):87–127
  11. Doya K (1992) Bifurcations in the learning of recurrent neural networks. In: Proceedings of the IEEE international symposium on circuits and systems 1992, vol 6, pp 2777–2780
  12. Fernando C, Sojakka S (2003) Pattern recognition in a bucket. In: Proceedings of the 7th European conference on advances in artificial life (ECAL 2003). LNCS, vol 2801. Springer, Berlin, pp 588–597
  13. Hermans M, Schrauwen B (2012) Recurrent kernel machines: computing with infinite echo state networks. Neural Comput 24(1):104–133. doi:10.1162/NECO_a_00200
  14. Hinaut X, Dominey PF (2011) A three-layered model of primate prefrontal cortex encodes identity and abstract categorical structure of behavioral sequences. J Physiol 105(1–3):16–24
  15. Hochreiter S, Schmidhuber J (1997) Long short-term memory. Neural Comput 9(8):1735–1780
  16. Ijspeert AJ (2008) Central pattern generators for locomotion control in animals and robots: a review. Neural Netw 21:642–653
  17. Ilies I, Jaeger H, Kosuchinas O, Rincon M, Šakėnas V, Vaškevičius N (2007) Stepping forward through echoes of the past: forecasting with echo state networks. Short report on the winning entry to the NN3 financial forecasting competition. http://www.neural-forecasting-competition.com/downloads/NN3/methods/27-NN3_Herbert_Jaeger_report.pdf
  18. Jaeger H (2000) A method for supervised teaching of a recurrent artificial neural network. International patent. http://www.wipo.int/patentscope/search/en/WO2002031764
  19. Jaeger H (2001) The “echo state” approach to analysing and training recurrent neural networks. Tech Rep GMD report 148, German National Research Center for Information Technology. http://www.faculty.jacobs-university.de/hjaeger/pubs/EchoStatesTechRep.pdf
  20. Jaeger H (2002) Tutorial on training recurrent neural networks, covering BPPT, RTRL, EKF and the echo state network approach. GMD Report 159, Fraunhofer Institute AIS. http://minds.jacobs-university.de/pubs
  21. Jaeger H (2007) Echo state network. Scholarpedia 2(9):2330. http://www.scholarpedia.org/article/Echo_state_network
  22. Jaeger H, Haas H (2004) Harnessing nonlinearity: predicting chaotic systems and saving energy in wireless communication. Science 304(5667):78–80. doi:10.1126/science.1091277
  23. Jaeger H, Lukoševičius M, Popovici D, Siewert U (2007) Optimization and applications of echo state networks with leaky-integrator neurons. Neural Netw 20(3):335–352
  24. Jalalvand A, Triefenbach F, Verstraeten D, Martens JP (2011) Connected digit recognition by means of reservoir computing. In: Proceedings of Interspeech 2011, pp 1725–1728
  25. Kindermans PJ, Buteneers P, Verstraeten D, Schrauwen B (2010) An uncued brain-computer interface using reservoir computing. In: Proceedings of the workshop on machine learning for assistive technologies
  26. Larger L, Soriano MC, Brunner D, Appeltant L, Gutierrez JM, Pesquera L, Mirasso CR, Fischer I (2012) Photonic information processing beyond Turing: an optoelectronic implementation of reservoir computing. Opt Express 20:3241–3249. doi:10.1364/OE.20.003241
  27. Legenstein R, Chase SM, Schwartz AB, Maass W (2010) A reward-modulated Hebbian learning rule can explain experimentally observed network reorganization in a brain control task. J Neurosci 30(25):8400–8410
  28. Li J, Jaeger H (2011) Minimal energy control of an ESN pattern generator. Technical report 26, Jacobs University Bremen, School of Engineering and Science
  29. Lukoševičius M (2011) Reservoir computing and self-organized neural hierarchies. PhD thesis, Jacobs University Bremen, Bremen, Germany
  30. Lukoševičius M, Jaeger H (2009) Reservoir computing approaches to recurrent neural network training. Comput Sci Rev 3(3):127–149. doi:10.1016/j.cosrev.2009.03.005
  31. Lukoševičius M, Popovici D, Jaeger H, Siewert U (2006) Time warping invariant echo state networks. IUB technical report 2, International University Bremen. http://minds.jacobs-university.de/pubs
  32. Maass W (2011) Motivation, theory, and applications of liquid state machines. In: Cooper B, Sorbi A (eds) Computability in context: computation and logic in the real world. Imperial College Press, London, pp 275–296
  33. Maass W, Joshi P, Sontag E (2007) Computational aspects of feedback in neural circuits. PLoS Comput Biol 3(1):1–20
  34. Maass W, Natschläger T, Markram H (2002) Real-time computing without stable states: a new framework for neural computation based on perturbations. Neural Comput 14(11):2531–2560. doi:10.1162/089976602760407955
  35. Martens J, Sutskever I (2011) Learning recurrent neural networks with Hessian-free optimization. In: Proceedings of the 28th international conference on machine learning. http://www.icml-2011.org/papers/532_icmlpaper.pdf
  36. Paquot Y, Duport F, Smerieri A, Dambre J, Schrauwen B, Haelterman M, Massar S (2012) Optoelectronic reservoir computing. Sci Rep 2:287. doi:10.1038/srep00287. http://www.nature.com/srep/2012/120227/srep00287/full/srep00287.html
  37. Salmen M, Plöger P (2005) Echo state networks used for motor control. In: Proceedings of the IEEE international conference on robotics and automation (ICRA), pp 1953–1958
  38. Schiller UD, Steil JJ (2005) Analyzing the weight dynamics of recurrent learning algorithms. Neurocomputing 63C:5–23
  39. Schrauwen B, D'Haene M, Verstraeten D, Stroobandt D (2008) Compact hardware liquid state machines on FPGA for real-time speech recognition. Neural Netw 21(2–3):511–523
  40. Schürmann F, Meier K, Schemmel J (2005) Edge of chaos computation in mixed-mode VLSI—a hard liquid. In: Advances in neural information processing systems (NIPS 2004), vol 17. MIT Press, Cambridge, pp 1201–1208
  41. Shi Z, Han M (2007) Support vector echo-state machine for chaotic time-series prediction. IEEE Trans Neural Netw 18(2):359–372
  42. Skowronski MD, Harris JG (2007) Automatic speech recognition using a predictive echo state network classifier. Neural Netw 20(3):414–423. doi:10.1016/j.neunet.2007.04.006
  43. Steil JJ (2004) Backpropagation-decorrelation: recurrent learning with O(N) complexity. In: Proceedings of the IEEE international joint conference on neural networks (IJCNN 2004), vol 2, pp 843–848
  44. Stieg AZ, Avizienis AV, Sillin HO, Martin-Olmos C, Aono M, Gimzewski JK (2012) Emergent criticality in complex Turing B-type atomic switch networks. Adv Mater 24(2):286–293. doi:10.1002/adma.201103053
  45. Sussillo D, Abbott LF (2009) Generating coherent patterns of activity from chaotic neural networks. Neuron 63(4):544–557. doi:10.1016/j.neuron.2009.07.018
  46. Triefenbach F, Jalalvand A, Schrauwen B, Martens JP (2010) Phoneme recognition with large hierarchical reservoirs. In: Advances in neural information processing systems (NIPS 2010), vol 23. MIT Press, Cambridge, pp 2307–2315
  47. Vandoorne K, Dierckx W, Schrauwen B, Verstraeten D, Baets R, Bienstman P, Campenhout JV (2008) Toward optical signal processing using photonic reservoir computing. Opt Express 16(15):11182–11192
  48. Verstraeten D (2009) Reservoir computing: computation with dynamical systems. PhD thesis, Electronics and Information Systems, University of Ghent. http://organic.elis.ugent.be/biblio
  49. Verstraeten D, Schrauwen B, D'Haene M, Stroobandt D (2007) An experimental unification of reservoir computing methods. Neural Netw 20(3):391–403
  50. Verstraeten D, Schrauwen B, Stroobandt D (2006) Reservoir-based techniques for speech recognition. In: Proceedings of the IEEE international joint conference on neural networks (IJCNN 2006), pp 1050–1053
  51. Verstraeten D, Schrauwen B, Stroobandt D, Van Campenhout J (2005) Isolated word recognition with the liquid state machine: a case study. Inf Process Lett 95(6):521–528
  52. Werbos PJ (1990) Backpropagation through time: what it does and how to do it. Proc IEEE 78(10):1550–1560
  53. Williams RJ, Zipser D (1989) A learning algorithm for continually running fully recurrent neural networks. Neural Comput 1:270–280

Copyright information

© Springer-Verlag 2012

Authors and Affiliations

  • Mantas Lukoševičius (1), corresponding author
  • Herbert Jaeger (1)
  • Benjamin Schrauwen (2)

  1. Jacobs University Bremen, Bremen, Germany
  2. Ghent University, Ghent, Belgium
