Position-Based Content Attention for Time Series Forecasting with Sequence-to-Sequence RNNs

  • Yagmur Gizem Cinar
  • Hamid Mirisaee
  • Parantapa Goswami
  • Eric Gaussier
  • Ali Aït-Bachir
  • Vadim Strijov
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10638)

Abstract

We propose an extended attention model for sequence-to-sequence recurrent neural networks (RNNs) designed to capture (pseudo-)periods in time series. The model can be deployed on top of any RNN and is shown to yield state-of-the-art forecasting performance on several univariate and multivariate time series.
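To make the idea concrete, the following is a minimal sketch (in NumPy) of a content-based attention step extended with an additive per-position term over the input sequence, in the spirit of the abstract above. It is illustrative only: the function name and the parameters W, U, v and the per-lag bias p are hypothetical stand-ins, not the authors' exact formulation.

    import numpy as np

    def position_based_attention(s_t, H, W, U, v, p):
        """Content attention plus an additive position (lag) term.

        s_t : (d,)   decoder state at the current forecast step
        H   : (T, d) encoder hidden states over the T past inputs
        W, U: (d, d) learned projections (content part, Bahdanau-style)
        v   : (d,)   learned scoring vector
        p   : (T,)   learned per-position bias, meant to favour inputs
                     lying a (pseudo-)period behind the forecast step
        """
        # Content scores: e_i = v . tanh(W s_t + U h_i)
        e = np.tanh(s_t @ W + H @ U) @ v
        # Position extension: bias each input position by its learned weight
        e = e + p
        # Numerically stable softmax over input positions
        a = np.exp(e - e.max())
        a = a / a.sum()
        # Context vector fed to the decoder, plus the attention weights
        return a @ H, a

    # Toy usage with random parameters (24 past steps, hidden size 8)
    rng = np.random.default_rng(0)
    T, d = 24, 8
    ctx, weights = position_based_attention(
        rng.standard_normal(d), rng.standard_normal((T, d)),
        0.1 * rng.standard_normal((d, d)), 0.1 * rng.standard_normal((d, d)),
        rng.standard_normal(d), rng.standard_normal(T))
    print(ctx.shape, weights.sum())  # (8,) 1.0

In a trained model, W, U, v and p would be learned jointly with the encoder-decoder RNN; here they are random so the snippet runs standalone.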

Keywords

Recurrent neural networks · Attention model · Time series

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Yagmur Gizem Cinar (1)
  • Hamid Mirisaee (1)
  • Parantapa Goswami (2)
  • Eric Gaussier (1)
  • Ali Aït-Bachir (3)
  • Vadim Strijov (4)
  1. Univ. Grenoble Alpes, CNRS, Grenoble INP, LIG, Grenoble, France
  2. Viseo R&D, Grenoble, France
  3. Coservit, Grenoble, France
  4. Moscow Institute of Physics and Technology, Moscow, Russia
