Practical Fractional-Order Neuron Dynamics for Reservoir Computing

  • Conference paper
Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11141)

Abstract

This paper proposes a practical reservoir computing framework built on fractional-order leaky integrator neurons, which yield a longer memory capacity than normal leaky integrators. In general, a fractional-order derivative requires the entire history of states from the initial state to the current one. Although this property is attractive from the viewpoint of memory capacity, storing the entire history is intractable, particularly for reservoir computing with many neurons. A reasonable approximation to the fractional-order neuron dynamics is therefore introduced, yielding a model in which past memories decay exponentially before being truncated at a threshold. This derivation can be regarded as a natural extension of reservoir computing with leaky integrators, the most commonly used variant. The proposed method is compared with reservoir computing using normal neurons and leaky integrator neurons on four regression and classification problems over time-series data, and it shows superior results on all of them.
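
To make the approximation concrete, here is a minimal sketch of a fractional-order leaky integrator reservoir. It uses a hard-truncated Grünwald–Letnikov (GL) discretization of the fractional derivative rather than the paper's exponentially decaying approximation, and every class name, variable name, and parameter value below is an illustrative assumption, not taken from the paper.

```python
# Minimal sketch, NOT the paper's exact formulation: a reservoir whose
# neurons follow fractional-order leaky-integrator dynamics
#   D^alpha x = -leak * x + tanh(W x + W_in u),
# discretized with a Grunwald-Letnikov (GL) sum over a truncated window.
import numpy as np

def gl_coefficients(alpha, length):
    """GL weights w_k = (-1)^k * C(alpha, k) via the standard recursion
    w_k = w_{k-1} * (1 - (alpha + 1) / k), with w_0 = 1."""
    w = np.empty(length)
    w[0] = 1.0
    for k in range(1, length):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    return w

class FractionalLeakyReservoir:
    def __init__(self, n_in, n_res, alpha=0.8, leak=1.0, h=1.0,
                 window=100, spectral_radius=0.9, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
        W = rng.uniform(-0.5, 0.5, (n_res, n_res))
        # Rescale the recurrent weights to the desired spectral radius.
        W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
        self.W, self.alpha, self.leak, self.h = W, alpha, leak, h
        self.w = gl_coefficients(alpha, window + 1)  # w[0] == 1
        self.history = [np.zeros(n_res)]             # truncated past states

    def step(self, u):
        drive = np.tanh(self.W @ self.history[-1] + self.W_in @ u)
        # GL discretization: sum_k w_k x_{t-k} / h^alpha = -leak*x_t + drive
        # => x_t = (h^alpha*drive - sum_{k>=1} w_k x_{t-k}) / (1 + leak*h^alpha)
        mem = sum(wk * xk for wk, xk in
                  zip(self.w[1:], reversed(self.history)))
        ha = self.h ** self.alpha
        x = (ha * drive - mem) / (1.0 + self.leak * ha)
        self.history.append(x)
        if len(self.history) > len(self.w) - 1:
            self.history.pop(0)  # drop states older than the window
        return x

# Drive the reservoir with a sine wave; a linear readout would be
# trained on the collected states, as in any echo state network.
res = FractionalLeakyReservoir(n_in=1, n_res=50)
states = [res.step(np.array([np.sin(0.1 * t)])) for t in range(200)]
print("final state norm:", np.linalg.norm(states[-1]))
```

For alpha = 1 the GL weights reduce to w_0 = 1, w_1 = -1 with all later weights vanishing, so the update collapses to an ordinary leaky integrator; this is the sense in which the fractional model can be read as a natural extension of standard leaky-integrator reservoir computing.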

Author information

Corresponding author

Correspondence to Taisuke Kobayashi.

Copyright information

© 2018 Springer Nature Switzerland AG

About this paper

Cite this paper

Kobayashi, T. (2018). Practical Fractional-Order Neuron Dynamics for Reservoir Computing. In: Kůrková, V., Manolopoulos, Y., Hammer, B., Iliadis, L., Maglogiannis, I. (eds) Artificial Neural Networks and Machine Learning – ICANN 2018. Lecture Notes in Computer Science, vol 11141. Springer, Cham. https://doi.org/10.1007/978-3-030-01424-7_12

  • DOI: https://doi.org/10.1007/978-3-030-01424-7_12

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-01423-0

  • Online ISBN: 978-3-030-01424-7

  • eBook Packages: Computer Science (R0)
