
Deep Heuristic Evolutionary Regression Model Based on the Fusion of BiGRU and BiLSTM

Published in: Cognitive Computation

Abstract

The input for stock market prediction is usually a period of stock price data with time-series characteristics: the data change continuously over time and carry complex background relationships. Effectively mining and fusing the stock market's multiple heterogeneous data sources is difficult for traditional recurrent neural networks (RNNs). To solve this problem, we divide the regression model into an encoder-decoder structure. We first use an RNN technique to impute missing values, then use a fusion of a bidirectional gated recurrent unit (BiGRU) and a bidirectional long short-term memory network (BiLSTM) as the encoder to extract hidden temporal features. Finally, the group method of data handling (GMDH) model serves as the decoder, producing stock market predictions from the extracted time-series features. This process yields a deep heuristic evolutionary regression model based on the fusion of BiGRU and BiLSTM (BBGMDH). Extensive experiments on four real-world stock datasets show that BBGMDH significantly outperforms existing methods, verifying the effectiveness of the encoding-decoding stepwise regression model for stock price prediction. The reason is that the encoding layer exploits the powerful time-series processing capability of RNNs to extract the hidden features of stock data, while the decoding layer uses the GMDH heuristic evolutionary computation method, which simulates an organism's "genetic-mutation-selection-evolution" process, for the regression task; the two components are thus fully complementary. This provides a new solution to the regression prediction problem.
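The paper's implementation is not included on this page, but the GMDH decoder's "genetic-mutation-selection-evolution" idea can be illustrated with a minimal, self-contained NumPy sketch. This is an assumption-laden illustration, not the authors' code: GMDH classically builds layers of quadratic polynomial neurons over feature pairs, fits each by least squares, and keeps only the candidates that best survive an external (validation) criterion. Function names (`fit_neuron`, `gmdh_regress`) and the stopping rule are hypothetical choices for this sketch.

```python
import numpy as np
from itertools import combinations

def _design(xi, xj):
    # Quadratic GMDH neuron basis: 1, xi, xj, xi*xj, xi^2, xj^2
    return np.column_stack([np.ones_like(xi), xi, xj, xi * xj, xi**2, xj**2])

def fit_neuron(xi, xj, y):
    """Least-squares fit of one polynomial neuron on a feature pair."""
    w, *_ = np.linalg.lstsq(_design(xi, xj), y, rcond=None)
    return w

def neuron_out(xi, xj, w):
    return _design(xi, xj) @ w

def gmdh_regress(X_train, y_train, X_val, y_val, max_layers=3, keep=4):
    """Layer-wise GMDH: generate candidate neurons from all feature pairs
    ("mutation"), keep the `keep` best by validation RMSE ("selection"),
    feed survivors' outputs into the next layer ("evolution"), and stop
    when the external criterion stops improving."""
    best_err, best_pred = np.inf, None
    for _ in range(max_layers):
        cands = []
        for i, j in combinations(range(X_train.shape[1]), 2):
            w = fit_neuron(X_train[:, i], X_train[:, j], y_train)
            err = np.sqrt(np.mean(
                (neuron_out(X_val[:, i], X_val[:, j], w) - y_val) ** 2))
            cands.append((err, i, j, w))
        cands.sort(key=lambda c: c[0])
        if cands[0][0] >= best_err:  # external criterion stopped improving
            break
        err, i, j, w = cands[0]
        best_err = err
        best_pred = neuron_out(X_val[:, i], X_val[:, j], w)
        # Survivors' outputs become the next layer's inputs.
        X_train = np.column_stack([neuron_out(X_train[:, a], X_train[:, b], w)
                                   for _, a, b, w in cands[:keep]])
        X_val = np.column_stack([neuron_out(X_val[:, a], X_val[:, b], w)
                                 for _, a, b, w in cands[:keep]])
    return best_pred, best_err
```

On a synthetic target such as `y = x0*x1 + 0.5*x2`, no single quadratic pair-neuron can fit both terms, but a second layer that recombines first-layer survivors recovers the target, which is exactly the stepwise behavior the abstract attributes to the decoder.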


(Figures 1–8 appear in the full article.)


Data Availability

The datasets generated during and analyzed during the current study are available from the corresponding author on reasonable request.


Funding

This work was financially supported by the Key International Cooperation Project of the National Natural Science Foundation of China (61860206004), the National Natural Science Foundation of China (62176085, 62172458, 61672114), the Natural Science Research Project of Anhui Province (1908085MF185), the Talent Fund of Hefei University (20RC25), and the Industry-University-Research Cooperation Projects of Zhuhai City, Guangdong Province, China (GP/026/2020 and HF-010-2021).

Author information


Corresponding author

Correspondence to Bin Luo.

Ethics declarations

Ethical Approval

This article does not contain any studies with human or animal subjects performed by any of the authors.

Informed Consent

Informed consent was obtained from all individual participants included in the study.

Conflict of Interest

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Xu, L., Xu, W., Cui, Q. et al. Deep Heuristic Evolutionary Regression Model Based on the Fusion of BiGRU and BiLSTM. Cogn Comput 15, 1672–1686 (2023). https://doi.org/10.1007/s12559-023-10135-6



  • DOI: https://doi.org/10.1007/s12559-023-10135-6
