
A Radial Basis Function Neural Network-Based Coevolutionary Algorithm for Short-Term to Long-Term Time Series Forecasting

  • E. Parras-Gutierrez
  • V. M. Rivas
  • J. J. Merelo
Chapter
Part of the Studies in Computational Intelligence book series (SCI, volume 613)

Abstract

This work analyzes the behavior and effectiveness of the L-Co-R method as the forecast horizon grows. The algorithm pursues a dual goal: on the one hand, it builds the network architecture from a set of RBFNs; on the other hand, it selects a set of time lags used to forecast future values of a given time series. The evaluation relies on 20 time series, 6 methods from the literature, 4 different forecast horizons, and 3 quality measures. In addition, a statistical study confirms the good results obtained by L-Co-R.
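
The abstract describes two coupled tasks: fitting an RBF network and choosing which time lags feed it. The sketch below is a minimal, illustrative approximation of that idea only, not the authors' L-Co-R algorithm (which coevolves RBFN architectures and lag sets with a CHC-based evolutionary scheme). The function names, the Gaussian width heuristic, and the toy random search over lag sets are all assumptions made for illustration.

```python
# Minimal sketch (not the L-Co-R implementation): one RBF network trained on
# lagged inputs, plus a toy random search over candidate lag sets.
import numpy as np

def make_lagged(series, lags):
    """Build a design matrix of past values at the given lags and the targets."""
    max_lag = max(lags)
    X = np.array([[series[t - l] for l in lags] for t in range(max_lag, len(series))])
    return X, series[max_lag:]

def fit_rbfn(X, y, n_centers=10, rng=None):
    """Fit an RBF net: random centers, one shared width, linear output weights."""
    rng = np.random.default_rng(rng)
    centers = X[rng.choice(len(X), size=min(n_centers, len(X)), replace=False)]
    width = np.mean(np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)) + 1e-8
    def phi(Z):
        d = np.linalg.norm(Z[:, None, :] - centers[None, :, :], axis=2)
        return np.exp(-(d / width) ** 2)
    w, *_ = np.linalg.lstsq(phi(X), y, rcond=None)  # output weights by least squares
    return lambda Z: phi(Z) @ w

def evaluate_lags(series, lags, train_frac=0.8):
    """Train on the first part of the series and return the test RMSE for one lag set."""
    X, y = make_lagged(series, lags)
    split = int(train_frac * len(X))
    model = fit_rbfn(X[:split], y[:split], rng=0)
    pred = model(X[split:])
    return np.sqrt(np.mean((pred - y[split:]) ** 2))

# Toy usage: pick the best of a few random lag sets on a synthetic seasonal series.
rng = np.random.default_rng(1)
t = np.arange(400)
series = np.sin(2 * np.pi * t / 12) + 0.1 * rng.standard_normal(len(t))
candidates = [sorted(rng.choice(np.arange(1, 25), size=4, replace=False)) for _ in range(20)]
best = min(candidates, key=lambda lags: evaluate_lags(series, lags))
print("selected lags:", best, "test RMSE:", round(evaluate_lags(series, best), 4))
```

In the actual method, both the lag set and the RBFN population evolve cooperatively and are evaluated jointly; the random search above only stands in for that selection pressure.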

Keywords

Time series forecasting · Co-evolutionary algorithms · Neural networks · Significant lags

Acknowledgments

This work has been supported by the regional projects TIC-3928 and TIC-03903 (FEDER funds) and by the Spanish projects TIN 2012-33856 and TIN 2011-28627-C04-02 (FEDER funds).


Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • E. Parras-Gutierrez (1)
  • V. M. Rivas (1)
  • J. J. Merelo (2)
  1. Department of Computer Sciences, University of Jaen, Jaen, Spain
  2. Department of Computers, Architecture and Technology, University of Granada, Granada, Spain