Abstract
Time series forecasting methods are widely used across industry and academia. For decision-makers, forecasting underpins processes at several levels, from facility planning to optimal day-to-day operation, and these decisions span widely different time horizons and aspects of the system, which makes them difficult to model. The advent of deep learning in forecasting removed the need for expensive hand-crafted features and deep domain knowledge. This chapter aims to give structure to the existing literature on deep learning for time series forecasting. We categorize deep learning-based forecasting techniques by their underlying architecture, such as RNN, CNN, and transformer, and provide a consolidated report. Additionally, we compare these techniques experimentally on four publicly available datasets and, based on these experiments, offer intuitive reasoning for their relative performance. We believe this chapter will help researchers choose relevant techniques for future research.
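For readers new to the area, all of the architectures the chapter categorizes (RNN-, CNN-, and transformer-based) share the same supervised framing: a sliding window of past values is mapped to the next value. The sketch below illustrates that framing with a simple linear autoregressive baseline in plain NumPy; it is an illustrative assumption on our part, not a model from the chapter, and the function names (`make_windows`, `fit_linear_ar`, `forecast`) are ours.

```python
import numpy as np

def make_windows(series, lookback):
    """Slice a 1-D series into (input window, next value) pairs --
    the supervised framing shared by deep forecasters."""
    X = np.array([series[i:i + lookback] for i in range(len(series) - lookback)])
    y = series[lookback:]
    return X, y

def fit_linear_ar(series, lookback):
    """Least-squares linear autoregression with an intercept: the
    simplest stand-in for a learned forecasting model."""
    X, y = make_windows(series, lookback)
    coef, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)
    return coef

def forecast(series, coef, lookback, steps):
    """Roll the model forward, feeding each prediction back in
    (recursive multi-step forecasting)."""
    history = list(series[-lookback:])
    out = []
    for _ in range(steps):
        x = np.r_[history[-lookback:], 1.0]  # window + intercept term
        yhat = float(x @ coef)
        out.append(yhat)
        history.append(yhat)
    return out

# A sinusoid satisfies a linear recurrence, so the baseline forecasts it well.
t = np.arange(200, dtype=float)
series = np.sin(0.1 * t)
coef = fit_linear_ar(series, lookback=4)
preds = forecast(series, coef, lookback=4, steps=5)
```

Deep models replace the linear map with an RNN, a stack of convolutions, or attention layers, but the windowing and recursive (or direct multi-horizon) decoding above carry over unchanged.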
© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this chapter
Agarwal, K., Dheekollu, L., Dhama, G., Arora, A., Asthana, S., Bhowmik, T. (2022). Deep Learning-Based Time Series Forecasting. In: Wani, M.A., Raj, B., Luo, F., Dou, D. (eds) Deep Learning Applications, Volume 3. Advances in Intelligent Systems and Computing, vol 1395. Springer, Singapore. https://doi.org/10.1007/978-981-16-3357-7_6
Publisher Name: Springer, Singapore
Print ISBN: 978-981-16-3356-0
Online ISBN: 978-981-16-3357-7
eBook Packages: Intelligent Technologies and Robotics (R0)