Transfer Learning for Financial Time Series Forecasting

  • Qi-Qiao He
  • Patrick Cheong-Iao Pang
  • Yain-Whar Si
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11671)

Abstract

Time series are widely used for representing non-stationary data such as weather information, health-related data, and economic and stock market indexes. Many statistical methods and traditional machine learning techniques are commonly used for forecasting time series. With the development of deep learning in artificial intelligence, many researchers have adopted models from artificial neural networks for time series forecasting. However, deep learning models perform poorly on short time series, which limits forecasting accuracy. In this paper, we propose a novel approach based on transfer learning to alleviate this problem. Existing work on transfer learning uses features extracted from a source dataset for the prediction task on a target dataset. In this paper, we propose a new training strategy for time-series transfer learning that uses two source datasets and outperforms existing approaches. The effectiveness of our approach is evaluated on financial time series extracted from stock markets. Experimental results show that transfer learning based on two datasets is superior to the other baseline methods.
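To make the general workflow concrete, the following is a minimal sketch of time-series transfer learning using the Keras API: a small feed-forward network is pre-trained on windowed samples from two source series and then fine-tuned on a short target series. The window length, network architecture, synthetic data, and helper names are illustrative assumptions, not the authors' exact training strategy or datasets.

```python
# Sketch: pre-train on two source series, then fine-tune on a short target series.
# All names, window sizes, and the synthetic data are assumptions for illustration.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

WINDOW = 20  # assumed look-back window length


def make_windows(series, window=WINDOW):
    """Turn a 1-D series into (samples, window) inputs and next-step targets."""
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X, y


def build_model(window=WINDOW):
    """A small fully connected regressor for next-value prediction."""
    model = keras.Sequential([
        layers.Input(shape=(window,)),
        layers.Dense(64, activation="relu"),
        layers.Dense(32, activation="relu"),
        layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model


# Synthetic stand-ins for two source markets and one short target market.
rng = np.random.default_rng(0)
source_a = np.cumsum(rng.normal(size=2000))
source_b = np.cumsum(rng.normal(size=2000))
target = np.cumsum(rng.normal(size=200))  # short target series

# Pre-train on both source series (here simply stacked together).
Xs = np.concatenate([make_windows(s)[0] for s in (source_a, source_b)])
ys = np.concatenate([make_windows(s)[1] for s in (source_a, source_b)])
model = build_model()
model.fit(Xs, ys, epochs=5, batch_size=64, verbose=0)

# Fine-tune the pre-trained weights on the target series with a lower learning rate.
Xt, yt = make_windows(target)
model.compile(optimizer=keras.optimizers.Adam(1e-4), loss="mse")
model.fit(Xt, yt, epochs=20, batch_size=16, verbose=0)
```

The key transfer-learning step is that the target fine-tuning starts from the weights learned on the source data rather than from a random initialization, which is what helps when the target series is short.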

Keywords

Transfer learning · Financial time series · Forecasting · Artificial neural networks

Notes

Acknowledgement

The research was funded by the Research Committee of University of Macau, Grant MYRG2018-00246-FST.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Qi-Qiao He (1)
  • Patrick Cheong-Iao Pang (2)
  • Yain-Whar Si (1)

  1. Department of Computer and Information Science, University of Macau, Taipa, Macau
  2. School of Computing and Information Systems, The University of Melbourne, Parkville, Australia