Abstract
Multivariate time series (MTS) prediction has long been an important part of sequence prediction, and researchers have recently proposed many deep learning models for it. Transformer-based models show great potential here because they capture long-term dependencies between sequences, a significant advantage in sequence prediction tasks. However, many existing models focus on encoding and embedding temporal positions and ignore the dependencies between different dimensions at different times. An MTS is correlated not only along the temporal dimension but also along the spatial (cross-dimension) axis, so we propose TSEGformer, a Transformer-based model whose sequence-embedding stage considers not only temporal and positional information but also information shared across dimensions. We propose Dimension Segment Mean Fusion (DSMF), a module that embeds the input MTS into a new 2D vector matrix carrying both temporal and spatial information, and a Two-Part Attention (TPA) layer that effectively captures relationships between sequences across the time and space dimensions. We build an encoder-decoder framework on these components and evaluate it on five real-world datasets, obtaining impressive results.
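The abstract outlines the two components, DSMF and TPA, without giving their equations. Below is a minimal PyTorch sketch of one plausible reading, assuming DSMF splits each variable's series into fixed-length segments and fuses a per-segment mean into a linear segment embedding, and TPA applies self-attention first along the time (segment) axis and then along the dimension axis. The class names, the fusion rule, and the attention ordering are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn


class DSMF(nn.Module):
    """Dimension Segment Mean Fusion (sketch, not the paper's formula).

    Splits each variable's series into non-overlapping segments, embeds
    every segment linearly, and fuses in an embedding of the segment mean,
    yielding a 2D grid of vectors: (dimensions x segments).
    """

    def __init__(self, seg_len: int, d_model: int):
        super().__init__()
        self.seg_len = seg_len
        self.embed = nn.Linear(seg_len, d_model)  # per-segment embedding
        self.fuse = nn.Linear(1, d_model)         # embedding of the segment mean

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, dims); time must be divisible by seg_len
        b, t, d = x.shape
        x = x.permute(0, 2, 1).reshape(b, d, t // self.seg_len, self.seg_len)
        mean = x.mean(dim=-1, keepdim=True)       # per-segment mean
        return self.embed(x) + self.fuse(mean)    # (b, dims, segments, d_model)


class TwoPartAttention(nn.Module):
    """Two-Part Attention (sketch): self-attention along the time axis
    within each dimension, then along the dimension axis within each
    time segment."""

    def __init__(self, d_model: int, n_heads: int = 4):
        super().__init__()
        self.time_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.dim_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # z: (batch, dims, segments, d_model)
        b, d, s, m = z.shape
        zt = z.reshape(b * d, s, m)               # attend across segments (time)
        zt = zt + self.time_attn(zt, zt, zt)[0]
        zd = zt.reshape(b, d, s, m).permute(0, 2, 1, 3).reshape(b * s, d, m)
        zd = zd + self.dim_attn(zd, zd, zd)[0]    # attend across dimensions (space)
        return zd.reshape(b, s, d, m).permute(0, 2, 1, 3)


if __name__ == "__main__":
    x = torch.randn(8, 96, 7)                    # (batch, time, dims)
    z = DSMF(seg_len=12, d_model=64)(x)          # (8, 7, 8, 64): dims x segments grid
    y = TwoPartAttention(d_model=64)(z)          # same shape, mixed along both axes
    print(y.shape)
```

Under these assumptions, a length-96 series with 7 variables and segment length 12 becomes a 7 × 8 grid of 64-dimensional vectors, and TPA then attends over that grid along both the time and space axes.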
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Feng, Y., Yu, Q. (2023). TSEGformer: Time-Space Dimension Dependency Transformer for Use in Multivariate Time Series Prediction. In: Zhang, F., Wang, H., Barhamgi, M., Chen, L., Zhou, R. (eds) Web Information Systems Engineering – WISE 2023. WISE 2023. Lecture Notes in Computer Science, vol 14306. Springer, Singapore. https://doi.org/10.1007/978-981-99-7254-8_38
Publisher Name: Springer, Singapore
Print ISBN: 978-981-99-7253-1
Online ISBN: 978-981-99-7254-8
eBook Packages: Computer Science; Computer Science (R0)