Abstract
The problem of forecasting a time series with a neural network is well defined for a single step-ahead prediction. The situation becomes more tangled when predicting over a multi-step horizon, and the task can consequently be framed in different ways. For example, one can develop a single-step predictor and apply it recursively along the forecasting horizon (recursive approach), or develop a multi-output model that directly forecasts the entire sequence of output values (multi-output approach). Additionally, the internal structure of each predictor may be a classical feed-forward (FF) network or a recurrent architecture, such as a long short-term memory (LSTM) network. The latter is traditionally trained with the teacher forcing algorithm (LSTM-TF) to speed up the convergence of the optimization, or without it (LSTM-no-TF) to avoid the issue of exposure bias. Time series forecasting requires organizing the available data into input-output sequences for parameter training, hyperparameter tuning, and performance testing. An additional developer's choice explored in the chapter is the definition of the similarity index (error metric) that the training procedure must optimize, and of the other performance indicators that may be used to examine how well the prediction replicates the test data.
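As a concrete illustration of the ideas summarized above, the sketch below organizes a scalar series into input-output windows and applies a single-step predictor recursively over the horizon. This is a minimal sketch, not code from the chapter: `make_windows` and `recursive_forecast` are hypothetical helper names, and the predictor is a placeholder. A multi-output model would instead map each input window directly to a vector of `n_out` future values in a single call.

```python
import numpy as np

def make_windows(series, n_in, n_out):
    """Slice a 1-D series into (input, output) pairs for supervised training."""
    X, Y = [], []
    for t in range(len(series) - n_in - n_out + 1):
        X.append(series[t:t + n_in])
        Y.append(series[t + n_in:t + n_in + n_out])
    return np.array(X), np.array(Y)

def recursive_forecast(step_model, history, horizon):
    """Recursive approach: a single-step predictor is applied repeatedly,
    feeding each prediction back as the newest input for the next step."""
    window = list(history)
    preds = []
    for _ in range(horizon):
        y = step_model(np.array(window))
        preds.append(y)
        window = window[1:] + [y]  # slide the window over the prediction
    return preds

# Example usage with a trivial persistence predictor standing in for a
# trained FF or LSTM network.
X, Y = make_windows(np.arange(10.0), n_in=3, n_out=2)
persistence = lambda w: w[-1]
forecast = recursive_forecast(persistence, history=[1.0, 2.0, 3.0], horizon=4)
```

Note that with the recursive approach, any single-step error is fed back into the input window and can accumulate along the horizon; this is the forecasting-time counterpart of the exposure bias mentioned for teacher-forced training.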
Copyright information
© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this chapter
Sangiorgio, M., Dercole, F., Guariso, G. (2021). Neural Approaches for Time Series Forecasting. In: Deep Learning in Multi-step Prediction of Chaotic Dynamics. SpringerBriefs in Applied Sciences and Technology(). Springer, Cham. https://doi.org/10.1007/978-3-030-94482-7_4
Print ISBN: 978-3-030-94481-0
Online ISBN: 978-3-030-94482-7
eBook Packages: Mathematics and Statistics (R0)