Arbitrated Ensemble for Time Series Forecasting

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10535)


This paper proposes an ensemble method for time series forecasting tasks. Combining different forecasting models is a common approach to these problems. State-of-the-art methods track the loss of the available models and adapt their weights accordingly. Metalearning strategies such as stacking are also used in these tasks. We propose a metalearning approach for adaptively combining forecasting models that specializes them across the time series. Our assumption is that different forecasting models have different areas of expertise and varying relative performance. Moreover, many time series show recurring structures due to factors such as seasonality. Therefore, the ability of a method to handle changes in the relative performance of models, as well as recurrent changes in the data distribution, can be very useful in dynamic environments. Our approach is based on an ensemble of heterogeneous forecasters, arbitrated by a metalearning model. This strategy is designed to cope with the different dynamics of time series and to quickly adapt the ensemble to regime changes. We validate our proposal on time series from several real-world domains. Empirical results show that the method is competitive with state-of-the-art approaches for combining forecasters.
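The arbitration scheme described in the abstract can be sketched in code. The sketch below is an illustrative assumption, not the paper's implementation (which is in R): each base forecaster is paired with a meta-model that learns to predict that forecaster's absolute error, and forecasts are combined with weights given by a softmax of the negated predicted errors. The class names, the softmax weighting, and the use of different autoregressive lag orders as a stand-in for heterogeneous forecasters are all hypothetical choices for the sketch.

```python
import numpy as np

def embed(series, k):
    """Time-delay embedding: row t holds [y_{t-k}, ..., y_{t-1}], target y_t."""
    X = np.array([series[i:i + k] for i in range(len(series) - k)])
    return X, np.asarray(series[k:])

class LinearModel:
    """Ordinary least-squares model, reused at both base and meta level."""
    def fit(self, X, y):
        Xb = np.c_[X, np.ones(len(X))]            # append intercept column
        self.w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
        return self
    def predict(self, X):
        X = np.atleast_2d(X)
        return np.c_[X, np.ones(len(X))] @ self.w

class ArbitratedEnsemble:
    """One meta-model per base forecaster predicts that forecaster's
    absolute error; combination weights are a softmax of the negated
    predicted errors, so forecasters expected to do well on the current
    input get more weight."""
    def __init__(self, lags=(2, 4, 6)):
        self.lags = lags                          # heterogeneity via AR order

    def fit(self, series):
        series = np.asarray(series, dtype=float)
        half = len(series) // 2
        self.base, self.meta = [], []
        for k in self.lags:
            X_in, y_in = embed(series[:half], k)          # train base model
            base = LinearModel().fit(X_in, y_in)
            X_out, y_out = embed(series[half - k:], k)    # held-out window
            err = np.abs(base.predict(X_out) - y_out)
            meta = LinearModel().fit(X_out, err)          # learn base's loss
            self.base.append(base)
            self.meta.append(meta)
        self.tail = series
        return self

    def predict_next(self):
        preds, losses = [], []
        for k, base, meta in zip(self.lags, self.base, self.meta):
            x = self.tail[-k:]                    # most recent k observations
            preds.append(base.predict(x)[0])
            losses.append(meta.predict(x)[0])
        w = np.exp(-np.asarray(losses))
        w /= w.sum()                              # convex combination weights
        return float(np.dot(w, preds)), w
```

Because the weights are recomputed from the meta-models at every prediction step, the combination can shift toward whichever forecaster is expected to perform best in the current regime, which is the adaptive behaviour the paper targets.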


Dynamic ensembles · Metalearning · Time series · Numerical prediction · Reproducible research



This work is financed by the ERDF - European Regional Development Fund through the Operational Programme for Competitiveness and Internationalisation - COMPETE 2020 Programme within project POCI-01-0145-FEDER-006961, and by National Funds through the FCT - Fundação para a Ciência e a Tecnologia (Portuguese Foundation for Science and Technology) as part of project UID/EEA/50014/2013. Project “NORTE-01-0145-FEDER-000036” is financed by the North Portugal Regional Operational Programme (NORTE 2020), under the PORTUGAL 2020 Partnership Agreement, and through the European Regional Development Fund (ERDF). This work was partly funded by the ECSEL Joint Undertaking, under the Horizon 2020 research and innovation programme (2014-2020), grant agreement number 662189-MANTIS-2014-1.



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. University of Porto, Porto, Portugal
  2. INESC TEC, Porto, Portugal
