
A novel error-output recurrent neural network model for time series forecasting

  • Original Article
  • Published in Neural Computing and Applications

Abstract

Improving forecasting accuracy is an important yet often challenging problem, and extensive research has applied neural networks (NNs) to it. In general, the inputs to NNs are auto-regressive terms (i.e. lagged variables) of one or more time series; in addition, either the network outputs or the network errors have been used as extra inputs. In this paper, we propose a novel recurrent neural network forecasting model, the ridge polynomial neural network with error-output feedbacks (RPNN-EOF). RPNN-EOF has two main types of inputs: auto-regressive and moving-average. The former are the lagged values of the time series, while the latter are obtained by feeding the network error back to the input layer; the network output is also fed back to the input layer. The proposed recurrent model can produce more accurate forecasts because it learns temporal dependence and models the moving-average component directly. A comparative analysis of RPNN-EOF against five neural network models was carried out on ten time series. Simulation results show that RPNN-EOF is the most accurate of the compared models on the time series used, indicating that combining auto-regressive and moving-average inputs helps to produce more accurate forecasts.
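To make the input structure described above concrete, the following minimal Python sketch shows a one-step-ahead forecaster whose input at each time step combines auto-regressive lags with the previous network output and the previous forecast error, both fed back to the input layer. It is an illustration under stated assumptions only: a plain linear neuron with an LMS-style update stands in for the ridge polynomial network, and the synthetic series, lag count and learning rate are hypothetical choices, not the authors' RPNN-EOF implementation.

```python
import numpy as np

# Sketch of error- and output-feedback inputs (not the authors' code).
rng = np.random.default_rng(0)
series = np.sin(0.1 * np.arange(500)) + 0.05 * rng.standard_normal(500)

n_lags = 4                      # auto-regressive inputs y(t-1)..y(t-n_lags)
w = np.zeros(n_lags + 2 + 1)    # weights: lags + output feedback + error feedback + bias
lr = 0.01

prev_output, prev_error = 0.0, 0.0
for t in range(n_lags, len(series)):
    lags = series[t - n_lags:t][::-1]                    # most recent lag first
    x = np.concatenate([lags, [prev_output, prev_error, 1.0]])
    y_hat = np.dot(w, x)                                 # forecast for y(t)
    error = series[t] - y_hat                            # forecast error at time t
    w += lr * error * x                                  # simple online (LMS) update
    prev_output, prev_error = y_hat, error               # fed back at the next step
```

In this reading, the error-feedback input plays the role of the moving-average component, while the output feedback supplies the recurrent, temporal dependence that the abstract attributes to RPNN-EOF.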





Acknowledgements

The authors would like to thank Universiti Tun Hussein Onn Malaysia and the Office for Research, Innovation, Commercialization and Consultancy Management (ORICC) for funding this research under the Postgraduate Research Grant (GPPS), VOT # U612.

Author information


Corresponding author

Correspondence to Rozaida Ghazali.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Waheeb, W., Ghazali, R. A novel error-output recurrent neural network model for time series forecasting. Neural Comput & Applic 32, 9621–9647 (2020). https://doi.org/10.1007/s00521-019-04474-5

