Time Series Forecasting Through a Dynamic Weighted Ensemble Approach

Conference paper
Part of the Smart Innovation, Systems and Technologies book series (SIST, volume 43)

Abstract

Time series forecasting is of crucial significance in almost every practical domain. Over the past few decades, there has been ever-increasing research interest in fruitfully combining forecasts from multiple models. Existing combination methods are mostly based on time-invariant combining weights. This paper proposes a dynamic ensemble approach that updates the weights after each new forecast: the weight of each component model is adjusted on the basis of its past and current forecasting performance. Empirical analysis on real time series shows that the proposed method substantially improves forecasting accuracy. It also outperforms each component model as well as various existing static weighted ensemble schemes.
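The core idea, combining weights that are revised after every new observation according to each component model's past and current errors, can be illustrated with a minimal sketch. This is not the paper's exact updating rule; the exponential smoothing factor `alpha`, the inverse-error weighting, and the function name are illustrative assumptions.

```python
import numpy as np

def dynamic_ensemble_forecast(model_forecasts, actuals, alpha=0.5):
    """Combine forecasts from several models, updating the combining
    weights after each new observation.

    model_forecasts : array of shape (n_steps, n_models)
    actuals         : array of shape (n_steps,)
    alpha           : smoothing factor blending past and current error
                      (illustrative parameter, not from the paper)
    Returns the combined forecast series of shape (n_steps,).
    """
    model_forecasts = np.asarray(model_forecasts, dtype=float)
    actuals = np.asarray(actuals, dtype=float)
    n_steps, n_models = model_forecasts.shape

    weights = np.full(n_models, 1.0 / n_models)  # start from equal weights
    err = np.zeros(n_models)                     # smoothed per-model error
    combined = np.empty(n_steps)

    for t in range(n_steps):
        # forecast for step t uses the weights learned so far
        combined[t] = weights @ model_forecasts[t]
        # current absolute error of each component model
        cur = np.abs(model_forecasts[t] - actuals[t])
        # blend past and current performance
        err = alpha * err + (1.0 - alpha) * cur
        # lower accumulated error -> higher weight; normalise to sum to 1
        inv = 1.0 / (err + 1e-12)
        weights = inv / inv.sum()

    return combined
```

In this sketch a model whose recent errors shrink gains weight at the next step, which is the time-varying behaviour the abstract contrasts with static combination schemes.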

Keywords

Time series forecasting · Forecasts combination · Changing weights · Forecasting accuracy


Copyright information

© Springer India 2016

Authors and Affiliations

  1. Department of Computer Science & Engineering, The LNM Institute of Information Technology, Jaipur, India