
Orthogonal Echo State Networks and Stochastic Evaluations of Likelihoods

Published in: Cognitive Computation

Abstract

We report on probabilistic likelihood estimates performed on time series by an echo state network with orthogonal recurrent connectivity. Tests with synthetic stochastic input time series requiring temporal inference indicate that the network's capability to infer depends on the balance between input strength and recurrent activity. This balance determines how well the network infers from the short-term input history versus from influences that date back a long time. We investigate the sensitivity of such networks to noise and to the finite accuracy of network states in the recurrent layer. In addition, we introduce a measure based on the mutual information between the output time series and the reservoir. Finally, we evaluate different types of recurrent connectivity. Orthogonal matrices show the best results of all investigated connectivity types, both overall and in how network performance scales with the size of the recurrent layer.
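The orthogonal recurrent connectivity discussed in the abstract can be sketched in a few lines; the following is a minimal illustration only (reservoir size, input scaling, and the tanh transfer function are assumptions, not the authors' exact setup):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100  # reservoir size (illustrative choice)

# Draw a random Gaussian matrix and orthogonalise it via QR decomposition.
# The Q factor is an orthogonal recurrent weight matrix: W.T @ W = I.
W, _ = np.linalg.qr(rng.standard_normal((n, n)))

# Orthogonal matrices preserve the norm of the state vector, one reason
# they are attractive for retaining input history in the reservoir.
v = rng.standard_normal(n)
norm_preserved = np.isclose(np.linalg.norm(W @ v), np.linalg.norm(v))

# One reservoir update with a tanh transfer function (a common ESN choice);
# the input-weight scaling 0.5 is an arbitrary illustrative value.
w_in = 0.5 * rng.standard_normal(n)
x = np.zeros(n)
u = 1.0  # one scalar input sample
x = np.tanh(W @ x + w_in * u)
```

QR of a Gaussian matrix is one standard way to obtain an orthogonal matrix; any other orthogonalisation would serve the same illustrative purpose here.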


Notes

  1. One possible explanation is that the parameter β affects the total activity |x_t|² of the network. At low activity, the non-linear component of the sigmoid transfer function is negligible, so the resulting network is nearly a linear ESN, for which values of memory capacity (MC) near the theoretical limit have already been found earlier [8]. On the other hand, the non-linear components are necessary to distinguish different vectors in the Hilbert space, and thus to approximate the probabilities sufficiently well.
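The near-linear regime described in this note can be checked numerically; a small sketch (the amplitude ranges are illustrative assumptions):

```python
import numpy as np

# Low-activity regime: for small |x|, tanh(x) = x - x**3/3 + O(x**5),
# so a tanh reservoir behaves almost like a linear ESN.
x = np.linspace(-0.05, 0.05, 101)
lin_err_small = np.max(np.abs(np.tanh(x) - x))  # O(|x|**3), tiny here

# High-activity regime: the non-linear component of tanh is substantial,
# which is what allows the network to separate state vectors.
y = np.linspace(-2.0, 2.0, 101)
lin_err_large = np.max(np.abs(np.tanh(y) - y))
```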

References

  1. Boedecker J, Obst O, Mayer N M, Asada M. Studies on reservoir initialization and dynamics shaping in echo state networks. In: ESANN'2009 proceedings: 17th European symposium on artificial neural networks: advances in computational intelligence and learning, Bruges, Belgium, 22–24 April; 2009. p. 227–232.

  2. Boedecker J, Obst O, Lizier J, Mayer N M, Asada M. Information processing in echo state networks at the edge of chaos. Theory Biosci 2012;131:205–13.


  3. Bulsara A, Jacobs E W, Zhou T, Moss F, Kiss L. Stochastic resonance in a single neuron model: theory and analog simulation. J Theor Biol 1991;152(4):531–55.


  4. Chen X-W, Anantha G, Lin X. Improving Bayesian network structure learning with mutual information-based node ordering in the k2 algorithm. IEEE Trans Knowl Data Eng 2008;20(5):628–40.


  5. Granger C W J. Investigating causal relations by econometric models and cross-spectral methods. Econometrica: J Econ Soc. 1969;37(3):424–438.

  6. Hammer B, Schrauwen B, Steil J J. Recent advances in efficient learning of recurrent networks. In: ESANN’2009 proceedings, European symposium on artificial neural networks—advances in computational intelligence and learning. 2009. p. 213–226.

  7. Hinton GE, Osindero S, Teh Y-W. A fast learning algorithm for deep belief nets. Neural Comput 2006; 18(7):1527–54.


  8. Jaeger H. The ‘echo state’ approach to analysing and training recurrent neural networks. In: GMD Report 148, GMD German National Research Institute for Computer Science. 2001.

  9. Jaeger H. Adaptive nonlinear system identification with echo state networks. In: Advances in neural information processing systems, NIPS 2002. Cambridge: MIT Press; 2003. p. 593–600.

  10. Jaeger H. Adaptive nonlinear system identification with echo state networks. In: Advances in neural information processing systems; Proceedings of the NIPS 15. 2003. p. 609–615.

  11. Jang J-S R, Sun C-T, Mizutani E. Neuro-fuzzy and soft computing, a computational approach to learning and machine intelligence. Englewood Cliffs: Prentice Hall; 1997 (cf. p. 104 ff).

  12. Kingma DP, Welling M. 2013. Auto-encoding variational Bayes. arXiv:1312.6114.

  13. Løkse S, Bianchi F M, Jenssen R. Training echo state networks with regularization through dimensionality reduction. Cogn Comput. 2017;1–15.

  14. Manjunath G, Jaeger H. Echo state property linked to an input: exploring a fundamental characteristic of recurrent neural networks. Neural Comput 2013;25(3):671–96.


  15. Mayer N M. Adaptive critical reservoirs with power law forgetting of unexpected input events. Neural Comput. 2015;27:1102–1119.

  16. Mayer N M. Echo state condition at the critical point. Entropy. 2017;19(1).

  17. Mayer N M, Obst O, Chang Y-C. Time series causality inference using echo state networks. In: Vigneron V, et al, editors. Latent variable analysis and signal separation, LNCS 6365, pp 279–286. Berlin: Springer; 2010.


  18. Pereda E, Quiroga RQ, Bhattacharya J. Nonlinear multivariate analysis of neurophysiological signals. Prog Neurobiol. 2005;77(1):1–37.

  19. Plesser H E, Gerstner W. Noise in integrate-and-fire neurons: from stochastic input to escape rates. Neural Comput 2000;12(2):367–84.


  20. Scardapane S, Uncini A. Semi-supervised echo state networks for audio classification. Cogn Comput 2017;9(1):125–35.


  21. Schreiber T. Measuring information transfer. Phys Rev Lett 2000;85(2):461–4.


  22. Shibuya T, Harada T, Kuniyoshi Y. Causality quantification and its applications: structuring and modeling of multivariate time series. In: KDD ’09: Proceedings of the 15th ACM SIGKDD international conference on Knowledge discovery and data mining. New York: ACM; 2009. p. 787–796.

  23. Tiňo P, Hammer B, Bodén M. Markovian bias of neural-based architectures with feedback connections. In: Perspectives of neural-symbolic integration. Springer; 2007. p. 95–133.

  24. Verstraeten D, Dambre J, Dutoit X, Schrauwen B. Memory versus non-linearity in reservoirs. In: The 2010 international joint conference on neural networks (IJCNN). IEEE; 2010. p. 1–8.

  25. White O L, Lee DD, Sompolinsky H. Short-term memory in orthogonal neural networks. Phys Rev Lett 2004;92:148102.


  26. Yildiz I B, Jaeger H, Kiebel SJ. Re-visiting the echo state property. Neural Netw 2012;35:1–20.



Acknowledgements

N.M.M. thanks Oliver Obst for his previous work and the Doctoral Program in Cognitive Sciences at National Chung Cheng University for providing a stimulating forum for discussion. Earlier preparations for this paper were funded by the National Science Council of Taiwan and the Ministry of Science and Technology of Taiwan.

Author information

Corresponding author

Correspondence to N. Michael Mayer.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Ethical approval

This article does not contain any studies with human participants or animals performed by the author.


About this article


Cite this article

Mayer, N.M., Yu, YH. Orthogonal Echo State Networks and Stochastic Evaluations of Likelihoods. Cogn Comput 9, 379–390 (2017). https://doi.org/10.1007/s12559-017-9466-4

