Abstract
Time series forecasting techniques range from ARIMA and exponential smoothing to neural approaches such as convolutional neural networks. However, most of them were designed to work with regularly sampled and complete time series, i.e., time series that can be represented as a sequence of numbers without missing values. In contrast, in this paper we consider the task of forecasting irregularly sampled time series. We argue that, compared with "usual" convolution, sparsity-invariant convolution is better suited to irregularly sampled time series; therefore, we propose to use neural networks with sparsity-invariant convolution. We perform experiments on 30 publicly available real-world time series datasets and show that sparsity-invariant convolution significantly improves the performance of convolutional neural networks when forecasting irregularly sampled time series. To support reproduction, independent validation, and follow-up work, we made our implementation (software code) publicly available at https://github.com/kr7/timeseriesforecast-siconv.
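The core idea of sparsity-invariant convolution (Uhrig et al., 2017, reference below) is to carry a binary observation mask alongside the values: the weighted sum is taken only over observed entries and normalised by how many entries were observed under the kernel, so the output magnitude does not depend on the sparsity pattern. The following is a minimal 1-D sketch, not the paper's actual implementation (function and argument names are ours; see the linked repository for the authors' code):

```python
import numpy as np

def sparse_conv1d(x, mask, w, b=0.0, eps=1e-8):
    """Sparsity-invariant 1-D convolution (sketch, after Uhrig et al. 2017).

    x    : input values (entries where mask == 0 are ignored)
    mask : 1.0 where x is observed, 0.0 where it is missing
    w    : convolution kernel of length k
    Returns the convolved values and the propagated mask.
    """
    k = len(w)
    n = len(x) - k + 1
    out = np.empty(n)
    out_mask = np.empty(n)
    for i in range(n):
        xs, ms = x[i:i + k], mask[i:i + k]
        # Sum over observed entries only, normalised by their count,
        # so sparse and dense windows produce comparable magnitudes.
        out[i] = np.sum(w * xs * ms) / (np.sum(ms) + eps) + b
        # An output position counts as observed if any input under
        # the kernel was observed (max-pooling of the mask).
        out_mask[i] = ms.max()
    return out, out_mask
```

For example, with kernel `w = [1, 1]` a window containing one observed value and one missing value is normalised by 1, while a fully observed window is normalised by 2, keeping the outputs on the same scale.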
Notes
- 2. In our case, each time series dataset contains several time series. For example, the ECG5000 dataset contains 5000 time series in total, and each of these 5000 time series has a length of 140. In order to avoid data leakage [7], when partitioning the data, an entire time series is assigned to one of the splits. For each time series belonging to the test split, we aim to predict its last h values. The segment we aim to predict is unknown to the model.
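The partitioning described in the note above can be sketched as follows. This is a simplified illustration, not the paper's code: whole series are assigned to either split (never split across both), and for each test series the last h values are held out as the forecasting target. Function names are ours.

```python
import random

def split_series(series_ids, test_frac=0.2, seed=0):
    """Assign each entire time series to train or test, so that no
    single series contributes values to both splits (avoids leakage)."""
    ids = list(series_ids)
    random.Random(seed).shuffle(ids)
    n_test = int(len(ids) * test_frac)
    return ids[n_test:], ids[:n_test]  # train_ids, test_ids

def make_forecast_task(series, h):
    """For a test series, the model sees everything except the last
    h values, which form the forecasting target."""
    return series[:-h], series[-h:]
```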
References
Borovykh, A., Bohte, S., Oosterlee, C.W.: Dilated convolutional neural networks for time series forecasting. J. Comput. Finan. Forthcoming (2018)
Box, G.E., Jenkins, G.M., Reinsel, G.C., Ljung, G.M.: Time Series Analysis: Forecasting and Control. Wiley, Hoboken (2015)
Buza, K., Antal, M.: Convolutional neural networks with dynamic convolution for time series classification. In: Wojtkiewicz, K., Treur, J., Pimenidis, E., Maleszka, M. (eds.) ICCCI 2021. CCIS, vol. 1463, pp. 304–312. Springer, Cham (2021). https://doi.org/10.1007/978-3-030-88113-9_24
Chatfield, C.: Time-Series Forecasting. Chapman and Hall/CRC, Boca Raton (2000)
Che, Z., Purushotham, S., Cho, K., Sontag, D., Liu, Y.: Recurrent neural networks for multivariate time series with missing values. Sci. Rep. 8(1), 6085 (2018)
Chen, Y., Kang, Y., Chen, Y., Wang, Z.: Probabilistic forecasting with temporal convolutional neural network. Neurocomputing 399, 491–501 (2020)
David, Z.: Information leakage in financial machine learning research. Algorithmic Finan. 8(1–2), 1–4 (2019)
Gardner, E.S., Jr.: Exponential smoothing: the state of the art – Part II. Int. J. Forecast. 22(4), 637–666 (2006)
Kaushik, S., et al.: AI in healthcare: time-series forecasting using statistical, neural, and ensemble architectures. Front. Big Data 3, 4 (2020)
Kim, K.J.: Financial time series forecasting using support vector machines. Neurocomputing 55(1–2), 307–319 (2003)
Kingma, D.P., Ba, J.: Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980 (2014)
Lim, B., Zohren, S.: Time series forecasting with deep learning: a survey. Phil. Trans. R. Soc. A 379(2194), 20200209 (2021)
Ramesh, A.N., Giovanneschi, F., González-Huici, M.A.: SIUNet: sparsity invariant u-net for edge-aware depth completion. In: Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, pp. 5818–5827 (2023)
Ravuri, S., et al.: Skilful precipitation nowcasting using deep generative models of radar. Nature 597(7878), 672–677 (2021)
Seeger, M.W., Salinas, D., Flunkert, V.: Bayesian intermittent demand forecasting for large inventories. In: Advances in Neural Information Processing Systems, vol. 29 (2016)
Sen, R., Yu, H.F., Dhillon, I.S.: Think globally, act locally: a deep neural network approach to high-dimensional time series forecasting. In: Advances in Neural Information Processing Systems, vol. 32 (2019)
Sezer, O.B., Gudelek, M.U., Ozbayoglu, A.M.: Financial time series forecasting with deep learning: a systematic literature review: 2005–2019. Appl. Soft Comput. 90, 106181 (2020)
Srivastava, N., Hinton, G., Krizhevsky, A., Sutskever, I., Salakhutdinov, R.: Dropout: a simple way to prevent neural networks from overfitting. J. Mach. Learn. Res. 15(1), 1929–1958 (2014)
Torres, J.F., Hadjout, D., Sebaa, A., Martínez-Álvarez, F., Troncoso, A.: Deep learning for time series forecasting: a survey. Big Data 9(1), 3–21 (2021)
Uhrig, J., Schneider, N., Schneider, L., Franke, U., Brox, T., Geiger, A.: Sparsity invariant CNNs. In: 2017 International Conference on 3D Vision (3DV), pp. 11–20. IEEE (2017)
Wang, Z., Yan, W., Oates, T.: Time series classification from scratch with deep neural networks: A strong baseline. In: 2017 International Joint Conference on Neural Networks (IJCNN), pp. 1578–1585. IEEE (2017)
Yadav, P., Steinbach, M., Kumar, V., Simon, G.: Mining electronic health records (EHRs) a survey. ACM Comput. Surv. (CSUR) 50(6), 1–40 (2018)
Yan, L., Liu, K., Belyaev, E.: Revisiting sparsity invariant convolution: a network for image guided depth completion. IEEE Access 8, 126323–126332 (2020)
Acknowledgement
This work was supported by the European Union through the GraphMassivizer EU Horizon Europe project under grant agreement No. 101093202.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Buza, K. (2023). Sparsity-Invariant Convolution for Forecasting Irregularly Sampled Time Series. In: Nguyen, N.T., et al. Computational Collective Intelligence. ICCCI 2023. Lecture Notes in Computer Science, vol. 14162. Springer, Cham. https://doi.org/10.1007/978-3-031-41456-5_12
DOI: https://doi.org/10.1007/978-3-031-41456-5_12
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-41455-8
Online ISBN: 978-3-031-41456-5
eBook Packages: Computer Science (R0)