
Statistics and Computing, Volume 24, Issue 3, pp 365–375

A novel Hybrid RBF Neural Networks model as a forecaster

  • Oguz Akbilgic
  • Hamparsum Bozdogan
  • M. Erdal Balaban

Abstract

We introduce a novel predictive statistical modeling technique called Hybrid Radial Basis Function Neural Networks (HRBF-NN) as a forecaster. HRBF-NN is a flexible forecasting technique that integrates regression trees and ridge regression with radial basis function (RBF) neural networks (NN). We develop a new computational procedure that uses information-theoretic model selection as the fitness function of the genetic algorithm (GA) to carry out subset selection of the best predictors. As is well known, the dynamic and chaotic nature of the underlying stock market process makes the task of generating economically useful stock market forecasts difficult, if not impossible. HRBF-NN is well suited for modeling complex nonlinear relationships and dependencies between stock indices. We propose HRBF-NN as our forecaster and predictive modeling tool to study the daily movements of stock indices. We present numerical examples that determine a predictive relationship between the Istanbul Stock Exchange National 100 Index (ISE100) and seven other international stock market indices. We select the best subset of predictors by minimizing the information complexity (ICOMP) criterion as the fitness function within the GA. Using the best subset of variables, we construct out-of-sample forecasts of the daily directional movements of the ISE100 index. Our results demonstrate the utility and flexibility of HRBF-NN as a predictive modeling tool for highly dependent and nonlinear data.
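To make the abstract concrete, the following is a minimal sketch of the HRBF-NN idea, not the authors' implementation: regression-tree leaves place the RBF centers, ridge regression fits the output weights, and a bit-string genetic algorithm selects predictor subsets. The use of scikit-learn's DecisionTreeRegressor, a single global RBF width, and a simple AIC-style penalty standing in for the ICOMP criterion are all assumptions made for illustration, not details taken from the paper.

    # Hedged sketch of an HRBF-NN-style forecaster; NOT the authors' code.
    # Assumptions: DecisionTreeRegressor stands in for the regression-tree step,
    # an AIC-style penalty stands in for ICOMP, and a basic bit-string GA
    # performs predictor subset selection.
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(0)

    def rbf_design(X, centers, width):
        """Gaussian RBF design matrix: one column per center."""
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * width ** 2))

    def fit_hrbf(X, y, ridge=1e-2, max_leaves=8):
        """Place RBF centers at regression-tree leaf centroids, then ridge-solve the weights."""
        tree = DecisionTreeRegressor(max_leaf_nodes=max_leaves).fit(X, y)
        leaves = tree.apply(X)
        centers = np.array([X[leaves == l].mean(axis=0) for l in np.unique(leaves)])
        width = np.median(np.linalg.norm(X - X.mean(axis=0), axis=1)) + 1e-8
        Phi = rbf_design(X, centers, width)
        w = np.linalg.solve(Phi.T @ Phi + ridge * np.eye(Phi.shape[1]), Phi.T @ y)
        return centers, width, w

    def fitness(X, y, mask):
        """Penalized lack of fit used as a stand-in for ICOMP (smaller is better)."""
        if mask.sum() == 0:
            return np.inf
        Xs = X[:, mask.astype(bool)]
        centers, width, w = fit_hrbf(Xs, y)
        resid = y - rbf_design(Xs, centers, width) @ w
        n, k = len(y), len(w)
        return n * np.log(resid @ resid / n) + 2.0 * k  # AIC-like penalty

    def ga_subset_select(X, y, pop=20, gens=30, pmut=0.1):
        """Bit-string GA over predictor subsets, minimizing the fitness above."""
        p = X.shape[1]
        popu = rng.integers(0, 2, size=(pop, p))
        for _ in range(gens):
            scores = np.array([fitness(X, y, ind) for ind in popu])
            parents = popu[np.argsort(scores)[: pop // 2]]   # truncation selection
            cut = rng.integers(1, p, size=pop // 2)          # one-point crossover
            kids = np.array([np.r_[parents[i, :c], parents[-i - 1, c:]]
                             for i, c in enumerate(cut)])
            kids ^= (rng.random(kids.shape) < pmut)          # bit-flip mutation
            popu = np.vstack([parents, kids])
        scores = np.array([fitness(X, y, ind) for ind in popu])
        return popu[np.argmin(scores)]

    # Toy usage on synthetic "index returns": 7 candidate predictors, 2 informative.
    X = rng.standard_normal((300, 7))
    y = np.tanh(X[:, 0]) + 0.5 * X[:, 3] ** 2 + 0.1 * rng.standard_normal(300)
    best = ga_subset_select(X, y)
    print("selected predictors:", np.flatnonzero(best))

In the paper the GA fitness is the ICOMP criterion rather than the AIC-style stand-in used here, and the RBF centers, widths, and regularization are derived from the regression-tree and ridge machinery described in the article.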

Keywords

Forecasting · Stock markets · Neural networks · Variable selection · Radial basis functions

Notes

Acknowledgements

This work was supported by the Scientific Research Projects Coordination Unit of Istanbul University under project number 17708. We further acknowledge the valuable comments of the three anonymous referees and the Associate Editor, which resulted in a much-improved paper.


Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

  • Oguz Akbilgic (1), corresponding author
  • Hamparsum Bozdogan (2)
  • M. Erdal Balaban (1)
  1. Istanbul University School of Business Administration, Istanbul, Turkey
  2. Statistics, Operations, and Management Science, and Center for Intelligent Systems and Machine Learning (CISML), The University of Tennessee, Knoxville, USA
