Applied Intelligence, Volume 49, Issue 2, pp 532–554

A meta extreme learning machine method for forecasting financial time series

  • César Fernández
  • Luis Salinas
  • Claudio E. Torres

Article

Abstract

In the last decade, the problem of forecasting time series in very different fields has received increasing attention due to its many real-world applications. In the particularly challenging case of financial time series, the underlying phenomenon of stock time series exhibits complex behaviors, including non-stationarity, non-linearity and non-trivial scaling properties. In the literature, a widely used strategy to improve forecasting capability is the combination of several models. However, most of the published research in the field of financial time series uses machine learning models in which only one type of predictor, either linear or nonlinear, is considered. In this paper we first measure relevant features of the underlying process in order to propose a forecasting method. We select the Sample Entropy and the Hurst Exponent to characterize the behavior of stock time series. The characterization reveals the presence of moderate randomness, long-term memory and scaling properties. Based on the measured properties, this paper proposes a novel one-step-ahead off-line meta-learning model, called μ-XNW, for the prediction of the next value \(x_{t+1}\) of a financial time series \(x_{t}\), \(t = 1, 2, 3, \dots\), that integrates a naive or linear predictor (LP), whose prediction of \(x_{t+1}\) is simply the repetition of the last value \(x_{t}\), with an extreme learning machine (ELM) and a discrete wavelet transform (DWT), both based on the \(n\) previous values of \(x_{t+1}\). LP, ELM and DWT are the constituents of the proposed model μ-XNW. We evaluate the proposed model using four well-known performance measures and validate its usefulness on six high-frequency stock time series belonging to the technology sector. The experimental results show that including internal estimators able to capture the measured features (randomness, long-term memory and scaling properties) improves forecasting accuracy over methods that do not include them.
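To make the constituents of the model concrete, the following is a minimal sketch of the three building blocks named above: a naive/linear predictor (LP), a basic single-hidden-layer ELM trained in closed form via the Moore-Penrose pseudoinverse, and a DWT of the last n observations. It is not the authors' μ-XNW implementation; it assumes the PyWavelets package for the DWT, and names such as `dwt_features`, `n` and `n_hidden` are illustrative choices only.

```python
# A minimal, self-contained sketch (not the authors' μ-XNW implementation) of the
# three constituents named in the abstract: a naive/linear predictor (LP), a basic
# extreme learning machine (ELM), and a discrete wavelet transform (DWT) of the
# last n observations. Function and parameter names here are illustrative only.
import numpy as np
import pywt  # PyWavelets, assumed available for the DWT


def naive_predictor(x):
    """LP: the forecast of x[t+1] is simply the last observed value x[t]."""
    return x[-1]


def dwt_features(window, wavelet="db4", level=2):
    """DWT: concatenate the multi-resolution coefficients of an input window."""
    coeffs = pywt.wavedec(window, wavelet, level=level)
    return np.concatenate(coeffs)


class ELM:
    """Single-hidden-layer ELM: random input weights, least-squares output weights."""

    def __init__(self, n_hidden=64, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        self.beta = np.linalg.pinv(H) @ y  # Moore-Penrose pseudoinverse
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta


# Usage: build sliding windows of length n, train the ELM on DWT features of each
# window to predict the next value, and compare with the naive (LP) forecast.
x = np.cumsum(np.random.default_rng(1).normal(size=500))  # synthetic "price" series
n = 32
X = np.array([dwt_features(x[i:i + n]) for i in range(len(x) - n)])
y = x[n:]
elm = ELM().fit(X[:-1], y[:-1])
print("ELM forecast:", elm.predict(X[-1:])[0], "| naive forecast:", naive_predictor(x[:-1]))
```

The appeal of the ELM in this setting is that only the output weights are learned, in closed form via the pseudoinverse, which keeps training fast even for high-frequency data; how the three estimators are weighted and combined in μ-XNW is described in the paper itself.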

Keywords

Financial time series · Forecasting · Extreme learning machine · Discrete wavelet transform

Notes

Acknowledgments

This work has been partially funded by the Centro Científico Tecnológico de Valparaíso – CCTVal, CONICYT PIA/Basal Funding FB0821, FONDECYT 1150810, FONDECYT 11160744 and UTFSM PIIC2015. The authors gratefully thank Alejandro Cañete from IFITEC S.A. – Financial Technology for providing the stock time series for this study.

Compliance with Ethical Standards

Conflict of interest

The authors declare that they have no conflicts of interest.


Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  • César Fernández 1, 2 (corresponding author)
  • Luis Salinas 1, 2
  • Claudio E. Torres 1, 2
  1. Departamento de Informática, Universidad Técnica Federico Santa María, Valparaíso, Chile
  2. CCTVal – Centro Científico Tecnológico de Valparaíso, Universidad Técnica Federico Santa María, Valparaíso, Chile
