
Extreme Market Prediction for Trading Signal with Deep Recurrent Neural Network

  • Zhichen Lu
  • Wen Long
  • Ying Guo
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10861)

Abstract

Recurrent neural networks are a class of deep learning units that are well studied for extracting features from sequential samples. They have been extensively applied to forecasting univariate financial time series, yet their application to high-frequency multivariate sequences has rarely been considered. This paper addresses a classification problem in which recurrent units are extended to a deep architecture to extract features from multivariate market data at 1-minute frequency, and extreme market movements are subsequently predicted to generate trading signals. Our results demonstrate the ability of a deep recurrent architecture to capture the relationship between the historical behavior and the future movement of high-frequency samples. The deep RNN is compared with other models, including SVM, random forest, and logistic regression, on CSI300 1-minute data over the test period. The results demonstrate that the capability of the deep RNN to generate trading signals based on extreme-movement prediction supports more efficient market decision making and enhances profitability.
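To make the setup described in the abstract concrete, below is a minimal sketch (not the authors' implementation) of a stacked recurrent classifier that maps a window of 1-minute multivariate market features to three classes (extreme down / neutral / extreme up), from which trading signals can be derived. The choice of LSTM units, the layer sizes, the 60-bar look-back window, the feature count, and the quantile-based definition of an "extreme" move are illustrative assumptions; the abstract does not specify them.

```python
# Hedged sketch of a deep recurrent classifier for extreme-movement prediction.
# All architectural details below are assumptions for illustration only.
import numpy as np
import tensorflow as tf

WINDOW = 60        # look-back of 60 one-minute bars (assumption)
N_FEATURES = 8     # e.g. open, high, low, close, volume, returns, ... (assumption)
N_CLASSES = 3      # 0 = extreme down, 1 = neutral, 2 = extreme up


def build_deep_rnn(window=WINDOW, n_features=N_FEATURES, n_classes=N_CLASSES):
    """Stacked recurrent network that classifies the next move of a price window."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(window, n_features)),
        tf.keras.layers.LSTM(64, return_sequences=True),  # lower layer keeps the full sequence
        tf.keras.layers.Dropout(0.2),                      # dropout for regularization
        tf.keras.layers.LSTM(32),                          # upper layer summarizes the window
        tf.keras.layers.BatchNormalization(),
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(1e-3),          # Adam optimizer
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model


def label_extremes(forward_returns, quantile=0.95):
    """Label each bar's forward return as extreme up (2), extreme down (0) or neutral (1).

    Defining 'extreme' via return quantiles is an assumption, not the paper's definition.
    """
    hi = np.quantile(forward_returns, quantile)
    lo = np.quantile(forward_returns, 1.0 - quantile)
    return np.where(forward_returns >= hi, 2,
                    np.where(forward_returns <= lo, 0, 1))


if __name__ == "__main__":
    build_deep_rnn().summary()
```

A long (short) signal would then be triggered when the predicted probability of the extreme-up (extreme-down) class exceeds a chosen threshold; the threshold and any position-sizing rules are likewise left unspecified in the abstract.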

Keywords

Recurrent neural networks, Deep learning, High frequency trading, Financial time series

Notes

Acknowledgement

This research was partly supported by grants from the National Natural Science Foundation of China (Nos. 71771204, 71331005, 91546201).


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. School of Economics and Management, University of Chinese Academy of Sciences, Beijing, People's Republic of China
  2. Research Center on Fictitious Economy and Data Science, Chinese Academy of Sciences, Beijing, People's Republic of China
  3. Key Laboratory of Big Data Mining and Knowledge Management, Chinese Academy of Sciences, Beijing, People's Republic of China
