Multistage Neural Network Metalearning with Application to Foreign Exchange Rates Forecasting

  • Kin Keung Lai
  • Lean Yu
  • Wei Huang
  • Shouyang Wang
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4293)


In this study, we propose a multistage neural network metalearning technique for financial time series prediction. First, an interval sampling technique is used to generate different training subsets. Different neural network models are then trained on these subsets to formulate different base models. Subsequently, to improve the efficiency of metalearning, principal component analysis (PCA) is used as a pruning tool to select an optimal set of base models. Finally, a neural-network-based metamodel is produced by learning from the selected base models. For illustration, the proposed metalearning technique is applied to foreign exchange rate prediction. A minimal sketch of this pipeline is given below.
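
The abstract outlines a four-stage pipeline: interval sampling of training subsets, training of neural network base models, PCA-based pruning, and a neural-network metamodel learned from the selected base models. The following is a minimal Python sketch of such a pipeline, assuming scikit-learn's MLPRegressor as the neural network and its PCA implementation; the subset count, network sizes, and the PCA-based selection rule are illustrative assumptions, not the authors' exact method.

```python
# Illustrative sketch only: library choices and parameter values are assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.decomposition import PCA

def interval_sample(X, y, n_subsets):
    """Stage 1: interval sampling -- subset i takes every n_subsets-th point, offset by i."""
    return [(X[i::n_subsets], y[i::n_subsets]) for i in range(n_subsets)]

def fit_metamodel(X_train, y_train, n_subsets=10, n_keep=5, seed=0):
    # Stage 2: train one neural-network base model per training subset.
    base_models = []
    for i, (Xs, ys) in enumerate(interval_sample(X_train, y_train, n_subsets)):
        m = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=seed + i)
        base_models.append(m.fit(Xs, ys))

    # Collect each base model's forecasts on the full training set.
    preds = np.column_stack([m.predict(X_train) for m in base_models])

    # Stage 3: PCA-based pruning -- keep the base models that load most heavily on the
    # leading principal components of the prediction matrix (one heuristic reading of
    # "PCA as a pruning tool"; the paper's exact selection rule may differ).
    pca = PCA(n_components=n_keep).fit(preds)
    keep = np.argsort(-np.abs(pca.components_).sum(axis=0))[:n_keep]
    selected = [base_models[i] for i in keep]

    # Stage 4: the metamodel is another neural network trained on the selected
    # base models' outputs (stacked generalization / metalearning).
    meta_inputs = np.column_stack([m.predict(X_train) for m in selected])
    metamodel = MLPRegressor(hidden_layer_sizes=(4,), max_iter=2000, random_state=seed)
    metamodel.fit(meta_inputs, y_train)
    return selected, metamodel

def predict(selected, metamodel, X_new):
    # Final forecast: feed the selected base models' predictions into the metamodel.
    meta_inputs = np.column_stack([m.predict(X_new) for m in selected])
    return metamodel.predict(meta_inputs)
```

Here the metamodel plays the role of the stacked learner: it maps the selected base models' forecasts to the final prediction, mirroring the "learning from the selected base models" step described above.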


Keywords: Root Mean Square Error · Neural Network Model · ARIMA Model · Financial Time Series · Foreign Exchange Rate





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Kin Keung Lai (1, 2)
  • Lean Yu (2, 3)
  • Wei Huang (4)
  • Shouyang Wang (1, 3)
  1. College of Business Administration, Hunan University, Changsha, China
  2. Department of Management Sciences, City University of Hong Kong, Kowloon, Hong Kong
  3. Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing, China
  4. School of Management, Huazhong University of Science and Technology, Wuhan, China
