
Multi-step Time Series Forecasting of Electric Load Using Machine Learning Models

  • Shamsul Masum
  • Ying Liu
  • John Chiverton
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10841)

Abstract

Multi-step forecasting is challenging, and there is a lack of studies that apply machine learning algorithms and methodologies to multi-step forecasting. It has also been found that a lack of collaboration between these fields is a barrier to further development. In this paper, multi-step time series forecasting is performed on three nonlinear electric load datasets extracted from Open-Power-System-Data.org using two machine learning models. The multi-step forecasting performance of Auto-Regressive Integrated Moving Average (ARIMA) and Long Short-Term Memory (LSTM) based Recurrent Neural Network (RNN) models is compared. Comparative analysis of the forecasting performance of the two models reveals that the LSTM model outperforms the ARIMA model for multi-step electric load forecasting.
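The implementation itself is not reproduced on this page, but the setup the abstract describes, ARIMA versus an LSTM-based RNN for multi-step electric load forecasting, can be sketched as follows using statsmodels and Keras. This is a minimal illustration only: the synthetic load-like series, the ARIMA order, the lag window, the network size and the direct multi-output forecasting strategy are assumptions for demonstration, not the authors' settings or data.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, LSTM, Dense

# Synthetic hourly-load-like series: daily cycle plus noise (a placeholder
# for the Open-Power-System-Data.org series used in the paper).
rng = np.random.default_rng(0)
t = np.arange(2000)
series = 100 + 10 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 1, t.size)

horizon = 24                               # forecast 24 steps ahead
train, test = series[:-horizon], series[-horizon:]

# ARIMA: fit once on the training data and forecast the whole horizon.
arima_pred = ARIMA(train, order=(2, 1, 2)).fit().forecast(steps=horizon)

# LSTM: direct multi-output strategy, mapping a lag window to all horizon steps.
window = 48
X, y = [], []
for i in range(len(train) - window - horizon + 1):
    X.append(train[i:i + window])
    y.append(train[i + window:i + window + horizon])
X = np.asarray(X)[..., np.newaxis]         # shape (samples, timesteps, features=1)
y = np.asarray(y)

model = Sequential([Input(shape=(window, 1)), LSTM(32), Dense(horizon)])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, batch_size=32, verbose=0)
lstm_pred = model.predict(train[-window:].reshape(1, window, 1), verbose=0).ravel()

def rmse(pred):
    return float(np.sqrt(np.mean((pred - test) ** 2)))

print(f"ARIMA RMSE: {rmse(arima_pred):.3f}, LSTM RMSE: {rmse(lstm_pred):.3f}")
```

Comparing the two models on a held-out horizon with an error measure such as RMSE, as in the last lines above, mirrors the kind of comparative analysis the abstract reports.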

Keywords

Time series analysis · Multi-step forecasting · ARIMA · LSTM


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. School of Engineering, University of Portsmouth, Portsmouth, UK