Abstract
For many years, Evolutionary Algorithms (EAs) have been applied to improve Neural Network (NN) architectures. They have been used to solve different problems, such as training the networks (adjusting the weights), designing the network topology, optimizing global parameters, and selecting features. Here, we provide a brief systematic survey of applications of EAs to the specific family of recurrent NNs known as Reservoir Computing (RC). In the early 2000s, the RC paradigm emerged as an attractive option for employing recurrent NNs without the drawbacks of their training algorithms. An RC model is a nonlinear dynamical system built around a fixed recurrent neural network, called the reservoir, and learning is restricted to adjusting a linear parametric readout function. However, an RC model has several hyper-parameters; therefore, EAs are helpful tools for finding optimal RC architectures. We provide an overview of results in the area, discuss recent advances, and present our vision of new trends and still-open questions.
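To make the setting concrete, the following is a minimal sketch (in Python/NumPy) of the scheme summarized above: an Echo State Network with a fixed random reservoir, a linear readout trained by ridge regression, and a simple (1+4) evolution strategy that searches over two hyper-parameters (spectral radius and leak rate). This is an illustrative assumption on our part, not the method of any particular surveyed work; all function names, parameter values, and the toy sine-wave task are invented for the example, and practical details such as washout/warm-up are omitted.

import numpy as np

# Minimal Echo State Network (ESN) sketch: a fixed random reservoir with a
# trainable linear readout, plus a tiny evolution strategy over two
# hyper-parameters. All names and values are illustrative assumptions.

rng = np.random.default_rng(0)

def make_reservoir(n_inputs, reservoir_size=100, spectral_radius=0.9, input_scale=0.5):
    """Create the fixed (untrained) input and recurrent weight matrices."""
    W_in = input_scale * rng.uniform(-1.0, 1.0, size=(reservoir_size, n_inputs))
    W = rng.uniform(-1.0, 1.0, size=(reservoir_size, reservoir_size))
    # Rescale so the largest absolute eigenvalue equals the desired spectral
    # radius, a common heuristic related to the echo state property.
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    return W_in, W

def run_reservoir(W_in, W, inputs, leak_rate=0.3):
    """Drive the reservoir with an input sequence and collect its states."""
    x = np.zeros(W.shape[0])
    states = np.zeros((inputs.shape[0], W.shape[0]))
    for t in range(inputs.shape[0]):
        x = (1.0 - leak_rate) * x + leak_rate * np.tanh(W_in @ inputs[t] + W @ x)
        states[t] = x
    return states

def train_readout(states, targets, ridge=1e-6):
    """Only the linear readout is learned, here by ridge regression."""
    A = states.T @ states + ridge * np.eye(states.shape[1])
    return np.linalg.solve(A, states.T @ targets)

def evaluate(params, u_train, u_val):
    """Fitness of a hyper-parameter pair = validation MSE of the resulting ESN."""
    sr, lr = params
    W_in, W = make_reservoir(n_inputs=1, spectral_radius=sr)
    W_out = train_readout(run_reservoir(W_in, W, u_train[:-1], lr), u_train[1:])
    pred = run_reservoir(W_in, W, u_val[:-1], lr) @ W_out
    return np.mean((pred - u_val[1:]) ** 2)

# Toy task: one-step-ahead prediction of a sine wave.
u = np.sin(np.linspace(0.0, 20.0 * np.pi, 2000)).reshape(-1, 1)
u_train, u_val = u[:1500], u[1500:]

# (1+4) evolution strategy over (spectral_radius, leak_rate): keep the best
# individual, perturb it with Gaussian noise, clip to a plausible range.
best = np.array([0.9, 0.3])
best_fit = evaluate(best, u_train, u_val)
for generation in range(20):
    children = np.clip(best + rng.normal(0.0, 0.05, size=(4, 2)), [0.05, 0.05], [1.5, 1.0])
    fitness = [evaluate(child, u_train, u_val) for child in children]
    i = int(np.argmin(fitness))
    if fitness[i] < best_fit:
        best, best_fit = children[i], fitness[i]
print("best (spectral_radius, leak_rate):", best, "validation MSE:", best_fit)

In the surveyed literature the searched hyper-parameters and the evolutionary operators vary widely (reservoir size, input scaling, sparsity, topology; crossover, particle swarms, and so on), but the division of labor is typically the same: the EA proposes a reservoir configuration, the linear readout is trained cheaply in closed form, and validation error serves as the fitness.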
Acknowledgements
This work was supported by the GACR (Czech Science Foundation) project No. 21-33574K "Lifelong Machine Learning on Data Streams".
Cite this paper
Basterrech, S., Sharma, T.K. (2023). Re-visiting Reservoir Computing Architectures Optimized by Evolutionary Algorithms. In: Abraham, A., Hanne, T., Gandhi, N., Manghirmalani Mishra, P., Bajaj, A., Siarry, P. (eds) Proceedings of the 14th International Conference on Soft Computing and Pattern Recognition (SoCPaR 2022). SoCPaR 2022. Lecture Notes in Networks and Systems, vol 648. Springer, Cham. https://doi.org/10.1007/978-3-031-27524-1_81