
Efficient Cross-Validation of Echo State Networks

  • Mantas Lukoševičius
  • Arnas Uselis
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11731)

Abstract

Echo State Networks (ESNs) are known for their fast and precise one-shot learning of time series, but they often need careful hyper-parameter tuning to reach their best performance. Good validation is key to such tuning, yet usually a single validation split is used. In this rather practical contribution we suggest several schemes for cross-validating ESNs and introduce an efficient algorithm for implementing them. In our proposed method of k-fold cross-validation, the component that dominates the time complexity of the already quite fast ESN training remains constant, i.e., does not scale with k. The component that does scale linearly with k starts to dominate only in rather uncommon situations. Thus, in many situations, k-fold cross-validation of ESNs can be done at virtually the same time complexity as a simple single-split validation. Space complexity can also remain the same. We also discuss when the proposed validation schemes for ESNs could be beneficial and empirically investigate them on several different real-world datasets.
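
To make the complexity claim concrete, here is a minimal sketch of fold-wise readout training consistent with it, assuming a standard ridge-regression readout computed from collected reservoir states; the function name kfold_esn_readouts and all variable names are illustrative, not taken from the paper. The dominant O(n_x² T) accumulation of the matrices X Xᵀ and Y Xᵀ is done once over the whole sequence; each fold merely subtracts its own contribution, so that term does not grow with k, while the O(n_x³) ridge solve is the part that scales linearly with k.

```python
import numpy as np

def kfold_esn_readouts(X, Y, k=10, beta=1e-6):
    """Train k ESN readouts, one per cross-validation fold.

    X : (n_x, T) collected reservoir states (bias/input rows included, if used)
    Y : (n_y, T) training targets
    k : number of folds
    beta : ridge regularization strength

    Returns a list of k output weight matrices, each trained with one
    contiguous fold of time steps held out.
    """
    n_x, T = X.shape
    # Dominant O(n_x^2 T) accumulation, done once for all folds:
    XXT_all = X @ X.T
    YXT_all = Y @ X.T
    readouts = []
    for idx in np.array_split(np.arange(T), k):
        Xv, Yv = X[:, idx], Y[:, idx]
        # Subtract the held-out fold's contribution; summed over all k
        # folds this again costs only O(n_x^2 T):
        XXT = XXT_all - Xv @ Xv.T
        YXT = YXT_all - Yv @ Xv.T
        # The O(n_x^3) ridge solve is the only part that scales with k:
        W = YXT @ np.linalg.inv(XXT + beta * np.eye(n_x))
        readouts.append(W)
    return readouts
```

Each returned readout would then be evaluated on its held-out fold, and the hyper-parameters (reservoir size, spectral radius, beta, etc.) with the best average validation error selected.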

Keywords

Echo State Networks · Reservoir computing · Recurrent neural networks · Cross-validation · Time complexity

Acknowledgments

This research was supported by the Research, Development and Innovation Fund of Kaunas University of Technology (grant No. PP-91K/19).


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Kaunas University of Technology, Kaunas, Lithuania
