Journal of Computer Science and Technology, Volume 34, Issue 2, pp. 318–338

A 2-Stage Strategy for Non-Stationary Signal Prediction and Recovery Using Iterative Filtering and Neural Network

  • Feng Zhou
  • Hao-Min Zhou
  • Zhi-Hua Yang
  • Li-Hua Yang
Regular Paper


Abstract

Predicting future values and recovering missing data are two vital tasks for time series arising in many application fields. Both pose significant challenges, especially when the signal is nonlinear and non-stationary, as is common in practice. In this paper, we propose a hybrid two-stage approach, named IF2FNN, to predict (including short-term and long-term predictions) and recover general types of time series. In the first stage, we decompose the original non-stationary series into several "quasi-stationary" intrinsic mode functions (IMFs) using the iterative filtering (IF) method. In the second stage, all of the IMFs are fed as inputs to a factorization-machine-based neural network model, which performs the prediction or recovery. We test the strategy on five datasets: an artificially constructed signal (ACS) and four real-world signals, namely the length of day (LOD), the northern hemisphere land-ocean temperature index (NHLTI), the troposphere monthly mean temperature (TMMT), and the National Association of Securities Dealers Automated Quotations index (NASDAQ). The results are compared with those obtained from other prevailing methods. Our experiments indicate that, under the same conditions, the proposed method outperforms the others in both prediction and recovery according to metrics such as mean absolute error (MAE), root mean square error (RMSE), and mean absolute percentage error (MAPE).
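The first-stage decomposition and the reported error metrics can be sketched in a few lines of Python. This is a minimal illustration under simplifying assumptions, not the authors' implementation: a fixed centered moving average stands in for the IF low-pass filter (the IF method derives its filter masks from the signal itself), the window-doubling schedule and sifting count are arbitrary choices, and the factorization-machine neural network of the second stage is omitted entirely.

```python
import numpy as np

def moving_average(x, w):
    """Centered moving average with odd window w, reflection-padded so the output length matches the input."""
    pad = w // 2
    xp = np.pad(x, pad, mode="reflect")
    return np.convolve(xp, np.ones(w) / w, mode="valid")

def iterative_filtering(s, n_imfs=3, window=11, n_sift=20):
    """Peel off IMFs by repeatedly subtracting a low-pass moving average (a crude stand-in for IF)."""
    imfs, residual = [], np.asarray(s, dtype=float).copy()
    for _ in range(n_imfs):
        h = residual.copy()
        for _ in range(n_sift):
            h = h - moving_average(h, window)  # sifting: remove the local mean
        imfs.append(h)
        residual = residual - h
        window = 2 * window + 1  # widen the filter (stays odd) to capture slower modes
    return imfs, residual

def mae(y, yhat):
    return float(np.mean(np.abs(y - yhat)))

def rmse(y, yhat):
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

def mape(y, yhat):
    # assumes y has no zeros
    return float(np.mean(np.abs((y - yhat) / y)) * 100)

# toy non-stationary signal: a fast oscillation riding on a slow wave plus a trend
t = np.linspace(0, 1, 512)
s = np.sin(2 * np.pi * 40 * t) + np.sin(2 * np.pi * 3 * t) + 0.5 * t
imfs, res = iterative_filtering(s, n_imfs=2)

# the decomposition is exact by construction: sum of IMFs plus residual equals the signal
assert np.allclose(np.sum(imfs, axis=0) + res, s)
```

In the full method, each of these IMFs (and the residual) would then be fed as an input feature to the second-stage network rather than modeled directly.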


Keywords: iterative filtering; factorization machine; neural network; time series; data recovery



Supplementary material

ESM 1 (PDF, 69 kb)



Copyright information

© Springer Science+Business Media, LLC & Science Press, China 2019

Authors and Affiliations

  • Feng Zhou (1)
  • Hao-Min Zhou (2)
  • Zhi-Hua Yang (1)
  • Li-Hua Yang (3, 4)

  1. School of Information Science, Guangdong University of Finance and Economics, Guangzhou, China
  2. School of Mathematics, Georgia Institute of Technology, Atlanta, U.S.A.
  3. Guangdong Province Key Laboratory of Computational Science, Guangzhou, China
  4. School of Mathematics, Sun Yat-sen University, Guangzhou, China
