Solar Irradiance Estimation Using the Echo State Network and the Flexible Neural Tree

Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 297)

Abstract

Two popular models for solving temporal learning problems are the Flexible Neural Tree (FNT) and the Echo State Network (ESN). Both models belong to the Neural Network family. The ESN projects the input into a large, fixed recurrent neural network (the reservoir) to model the temporal dependencies of the data, and trains only a linear readout. The FNT uses heuristic search techniques to find a tree topology and its parameters. Several examples in the Machine Learning literature show the success of both techniques at solving learning tasks. In this paper, we study the performance of these methods on a specific data set from the renewable-energy domain.
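The ESN approach summarized above (a fixed random recurrent reservoir driven by the input, with only a linear readout trained) can be sketched as follows. This is a minimal illustration, not the configuration used in the paper; the function name and all parameter values (reservoir size, spectral radius, ridge penalty) are illustrative assumptions:

```python
import numpy as np

def esn_fit_predict(u_train, y_train, u_test, n_res=100, rho=0.9, ridge=1e-6, seed=0):
    """Minimal Echo State Network sketch: random fixed reservoir, ridge readout."""
    rng = np.random.default_rng(seed)
    n_in = u_train.shape[1]
    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))     # fixed input weights
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))       # fixed recurrent weights
    W *= rho / max(abs(np.linalg.eigvals(W)))        # rescale to spectral radius rho

    def run(u):
        # Drive the reservoir with the input sequence and collect its states.
        x = np.zeros(n_res)
        states = []
        for t in range(len(u)):
            x = np.tanh(W_in @ u[t] + W @ x)
            states.append(x)
        return np.array(states)

    X = run(u_train)
    # Ridge-regression readout: solve (X^T X + ridge*I) W_out^T = X^T y
    W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y_train).T
    return run(u_test) @ W_out.T
```

Because the input and recurrent weights stay fixed, training reduces to a single linear least-squares solve for the readout, which is the main computational advantage of the reservoir-computing approach.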

Keywords

Echo State Network · Flexible Neural Tree · Time-series forecasting · Reservoir Computing · Renewable energy



Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  1. IT4Innovations, VŠB–Technical University of Ostrava, Ostrava, Czech Republic
