An integrated model based on deep kernel extreme learning machine and variational mode decomposition for day-ahead electricity load forecasting

  • Original Article
  • Published:
Neural Computing and Applications

Abstract

Accurate short-term electricity load forecasts are critical for the secure and economic operation of power systems. This paper presents a computationally efficient yet powerful three-stage model for accurate short-term electricity load forecasting. In the first stage, variational mode decomposition (VMD) extracts features from the historical load signal. In the second stage, stacked kernel extreme learning machine (KELM)-based auto-encoders perform unsupervised feature learning. In the third stage, the learned high-order features serve as inputs to a KELM-based regression model. The resulting deep KELM architecture thus combines stacked KELM-based auto-encoders with a KELM-based regressor to forecast short-term electricity load effectively. To assess the performance improvement offered by the proposed model, several comparison tests were carried out on publicly available electricity load and day-ahead forecast data from the Turkish transmission system operator (TSO). The results were compared with state-of-the-art deep extreme learning machine (ELM) architectures as well as benchmark forecasting models based on the original ELM, KELM, artificial neural network (ANN), support vector machine (SVM), and regression tree (RT). The comparisons indicated that the proposed model outperformed the state-of-the-art architectures and was significantly more accurate than the TSO forecast.
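
As a concrete illustration of the regression stage, the sketch below implements a kernel extreme learning machine with a radial basis function (RBF) kernel in Python, using the closed-form regularized least-squares solution of Huang et al. [22]. It is a minimal sketch rather than the author's implementation: the regularization constant C, the kernel width gamma, and the lagged-load toy inputs are illustrative assumptions; in the proposed model the regressor would instead receive the high-order features produced by VMD and the stacked KELM-based auto-encoders.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.05):
    """Pairwise RBF kernel matrix between the rows of A and B."""
    sq_dist = (np.sum(A ** 2, axis=1)[:, None]
               + np.sum(B ** 2, axis=1)[None, :]
               - 2.0 * A @ B.T)
    return np.exp(-gamma * sq_dist)

class KELMRegressor:
    """Kernel ELM regression (Huang et al. [22]):
    beta = (I / C + K)^(-1) y,  y_hat(x) = K(x, X_train) @ beta."""

    def __init__(self, C=100.0, gamma=0.05):
        self.C = C          # regularization constant (assumed value)
        self.gamma = gamma  # RBF kernel width (assumed value)

    def fit(self, X, y):
        self.X_train = X
        K = rbf_kernel(X, X, self.gamma)
        self.beta = np.linalg.solve(np.eye(len(X)) / self.C + K, y)
        return self

    def predict(self, X):
        return rbf_kernel(X, self.X_train, self.gamma) @ self.beta

# Toy usage with hypothetical lagged-load features; in the paper the inputs
# would come from the VMD modes and the stacked KELM auto-encoders.
rng = np.random.default_rng(0)
load = np.sin(np.linspace(0, 20 * np.pi, 600)) + 0.1 * rng.standard_normal(600)
lags = 24
X = np.array([load[t - lags:t] for t in range(lags, len(load) - 24)])
y = load[lags + 24:]                      # day-ahead (24-step-ahead) target
model = KELMRegressor().fit(X[:500], y[:500])
forecast = model.predict(X[500:])
```

Training such a model reduces to solving a single n-by-n linear system, which is what keeps the kernel ELM variant computationally inexpensive compared with iterative gradient-based network training.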

Data availability

The author confirms that the datasets analyzed during the current study are available in public repositories (see reference [30]).

Abbreviations

AE: Auto-encoder
ANN: Artificial neural network
ARMA: Auto-regressive moving average
BP: Backpropagation
CSO: Chicken swarm optimizer
DBM: Deep Boltzmann machines
DBN: Deep belief networks
EKF: Extended Kalman filter
ELM: Extreme learning machine
ELMAE: ELM-based AE
EMD: Empirical mode decomposition
FISTA: Fast iterative shrinkage-thresholding algorithm
GOA: Grasshopper optimization algorithm
HELM: Hierarchical ELM
HW: Holt–Winters
IMF: Intrinsic mode function
KELM: Kernel ELM
KELMAE: KELM-based AE
LT: Long term
MA: Moving average
MABC: Modified artificial bee colony
MAE: Mean absolute error
MAPE: Mean absolute percentage error (see the metric sketch after this list)
MATLAB: Matrix laboratory
MLELM: Multilayer ELM
MLKELM: Multilayer KELM
MP: Moore–Penrose
MT: Medium term
PSO: Particle swarm optimization
R: Correlation coefficient
RBF: Radial basis function
RBM: Restricted Boltzmann machines
RMSE: Root mean squared error
RT: Regression tree
SDPSO: Switching delayed PSO
SLFN: Single-hidden-layer feed-forward NN
ST: Short term
SVM: Support vector machine
TSO: Transmission system operator
UKF: Unscented Kalman filter
VMD: Variational mode decomposition
WT: Wavelet transform
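
The forecast accuracy metrics abbreviated above (MAE, MAPE, RMSE, and R) follow their standard definitions (cf. Hyndman and Koehler [43] for a discussion of forecast accuracy measures). The short Python sketch below is provided for convenience only and is not code from the paper.

```python
import numpy as np

def forecast_metrics(actual, forecast):
    """MAE, MAPE (%), RMSE, and correlation coefficient R for a forecast series."""
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    err = actual - forecast
    return {
        "MAE": np.mean(np.abs(err)),
        "MAPE": 100.0 * np.mean(np.abs(err / actual)),  # assumes no zero loads
        "RMSE": np.sqrt(np.mean(err ** 2)),
        "R": np.corrcoef(actual, forecast)[0, 1],
    }
```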

References

  1. Nti IK, Teimeh M, Nyarko-Boateng O, Adekoya AF (2020) Electricity load forecasting: a systematic review. J Electr Syst Inf Technol. https://doi.org/10.1186/s43067-020-00021-8

  2. Cortes C, Vapnik V (1995) Support-vector networks. Mach Learn 20(3):273–297

  3. Kuster C, Rezgui Y, Mourshed M (2017) Electrical load forecasting models: a critical systematic review. Sustain Cities Soc 35(August):257–270. https://doi.org/10.1016/j.scs.2017.08.009

  4. Huang GB, Zhu QY, Siew CK (2006) Extreme learning machine: theory and applications. Neurocomputing 70(1–3):489–501. https://doi.org/10.1016/j.neucom.2005.12.126

  5. Huang G, Huang GB, Song S, You K (2015) Trends in extreme learning machines: a review. Neural Netw 61:32–48. https://doi.org/10.1016/j.neunet.2014.10.001

  6. Huang G, Song S, Gupta JND, Wu C (2014) Semi-supervised and unsupervised extreme learning machines. IEEE Trans Cybern 44(12):2405–2417. https://doi.org/10.1109/TCYB.2014.2307349

  7. Lekamalage CKL, Liu T, Yang Y, Lin Z, Huang G-B (2015) Extreme learning machine for multilayer perceptron. 27(4): 435–444. https://doi.org/10.1007/978-3-319-14063-6_36

  8. Vong CM, Chen C, Wong PK (2018) Kernel-based multilayer extreme learning machines for representation learning. Neurocomputing 310(3):265–276. https://doi.org/10.1016/j.neucom.2018.05.032

  9. Li S, Wang P, Goel L (2015) Short-term load forecasting by wavelet transform and evolutionary extreme learning machine. Electr Power Syst Res 122:96–103. https://doi.org/10.1016/j.epsr.2015.01.002

  10. Daubechies I (1990) The wavelet transform, time-frequency localization and signal analysis. IEEE Trans Inf Theory 36(5):961–1005

  11. Karaboga D (2010) Artificial bee colony algorithm. Scholarpedia 5(3):6915

  12. Sulaiman SM, Jeyanthy PA, Devaraj D, Shihabudheen KV (2022) A novel hybrid short-term electricity forecasting technique for residential loads using empirical mode decomposition and extreme learning machines. Comput Electr Eng 98:107663. https://doi.org/10.1016/j.compeleceng.2021.107663

  13. Huang NE et al (1998) The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis. Proc R Soc Lond Ser A Math Phys Eng Sci 454(1971):903–995

  14. Ertugrul ÖF (2016) Forecasting electricity load by a novel recurrent extreme learning machines approach. Int J Electr Power Energy Syst 78:429–435. https://doi.org/10.1016/j.ijepes.2015.12.006

  15. Li S, Goel L, Wang P (2016) An ensemble approach for short-term load forecasting by extreme learning machine. Appl Energy 170:22–29. https://doi.org/10.1016/j.apenergy.2016.02.114

  16. Chen Y, Kloft M, Yang Y, Li C, Li L (2018) Mixed kernel based extreme learning machine for electric load forecasting. Neurocomputing 312:90–106. https://doi.org/10.1016/j.neucom.2018.05.068

  17. Liu C, Sun B, Zhang C, Li F (2020) A hybrid prediction model for residential electricity consumption using Holt–Winters and extreme learning machine. Appl Energy 275:115383. https://doi.org/10.1016/j.apenergy.2020.115383

  18. Li W, Kong D, Wu J (2017) A novel hybrid model based on extreme learning machine, k-nearest neighbor regression and wavelet denoising applied to short-term electric load forecasting. Energies 10(5):694. https://doi.org/10.3390/en10050694

  19. Cover T, Hart P (1967) Nearest neighbor pattern classification. IEEE Trans Inf Theory 13(1):21–27

  20. Wu J, Cui Z, Chen Y, Kong D, Wang Y (2019) A new hybrid model to predict the electrical load in five states of Australia. Energy 166:598–609. https://doi.org/10.1016/j.energy.2018.10.076

  21. Saremi S, Mirjalili S, Lewis A (2017) Grasshopper optimisation algorithm: theory and application. Adv Eng Softw 105:30–47

  22. Huang G-B, Zhou H, Ding X, Zhang R (2011) Extreme learning machine for regression and multiclass classification. IEEE Trans Syst Man, Cybern Part B 42(2):513–529

  23. Kennedy J, Eberhart R (1995) Particle swarm optimization. In: Proceedings of ICNN’95-international conference on neural networks. 4: 1942–1948

  24. Liu N, Tang Q, Zhang J, Fan W, Liu J (2014) A hybrid forecasting model with parameter optimization for short-term load forecasting of micro-grids. Appl Energy 129:336–345. https://doi.org/10.1016/j.apenergy.2014.05.023

  25. Zeng N, Zhang H, Liu W, Liang J, Alsaadi FE (2017) A switching delayed PSO optimized extreme learning machine for short-term load forecasting. Neurocomputing 240:175–182. https://doi.org/10.1016/j.neucom.2017.01.090

  26. Chen X-D, Hai-Yue Y, Wun J-S, Wu C-H, Wang C-H, Li L-L (2020) Power load forecasting in energy system based on improved extreme learning machine. Energy Explor Exploit 38(4):1194–1211. https://doi.org/10.1177/0144598720903797

  27. Meng X, Liu Y, Gao X, Zhang H (2014) A new bio-inspired algorithm: chicken swarm optimization. In: International conference in swarm intelligence. pp. 86–94

  28. Peng W, Xu L, Li C, Xie X, Zhang G (2019) Stacked autoencoders and extreme learning machine based hybrid model for electrical load prediction. J Intell Fuzzy Syst 37(4):5403–5416. https://doi.org/10.3233/JIFS-190548

  29. Dragomiretskiy K, Zosso D (2013) Variational mode decomposition. IEEE Trans Signal Process 62(3):531–544

  30. Energy Exchange Istanbul (EPIAS) transparency platform. https://seffaflik.epias.com.tr/transparency/

  31. Loh W (2011) Classification and regression trees. Wiley Interdiscip Rev Data Min Knowl Discov 1(1):14–23

  32. Vincent P, Larochelle H, Lajoie I, Bengio Y, Manzagol P-A, Bottou L (2010) Stacked denoising autoencoders: learning useful representations in a deep network with a local denoising criterion. J Mach Learn Res 11(12):3371–3408

  33. Hinton GE, Salakhutdinov RR (2006) Reducing the dimensionality of data with neural networks. Science 313(5786):504–507

  34. Cambria E et al (2013) Representational learning with ELMs for big data. IEEE Intell Syst 28(6):31–34. https://doi.org/10.1109/MIS.2013.140

  35. Tang J, Deng C, Huang G-B (2015) Extreme learning machine for multilayer perceptron. IEEE Trans Neural Netw Learn Syst 27(4):809–821

  36. Bengio Y, Courville A, Vincent P (2013) Representation learning: a review and new perspectives. IEEE Trans Pattern Anal Mach Intell 35(8):1798–1828

  37. Hinton GE, Osindero S, Teh Y-W (2006) A fast learning algorithm for deep belief nets. Neural Comput 18(7):1527–1554

  38. Beck A, Teboulle M (2009) A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM J Imaging Sci 2(1):183–202

  39. Tihonov AN (1963) Solution of incorrectly formulated problems and the regularization method. Sov Math 4:1035–1038

  40. Morozov VA (1975) Linear and nonlinear ill-posed problems. J Sov Math 4(6):706–736

  41. Hong M, Luo Z-Q (2017) On the linear convergence of the alternating direction method of multipliers. Math Program 162(1):165–199

  42. Extreme learning machines. http://extreme-learning-machines.org/

  43. Hyndman RJ, Koehler AB (2006) Another look at measures of forecast accuracy. Int J Forecast 22(4):679–688. https://doi.org/10.1016/j.ijforecast.2006.03.001

  44. Madrid EA, Antonio N (2021) Short-term electricity load forecasting with machine learning. Information 12(2):1–21. https://doi.org/10.3390/info12020050

  45. Eren S et al (2017) A ubiquitous Web-based dispatcher information system for effective monitoring and analysis of the electricity transmission grid. Int J Electr Power Energy Syst 86:93–103. https://doi.org/10.1016/j.ijepes.2016.10.006

Author information

Corresponding author

Correspondence to Ceyhun Yıldız.

Ethics declarations

Conflict of interest

The author declares no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Yıldız, C. An integrated model based on deep kernel extreme learning machine and variational mode decomposition for day-ahead electricity load forecasting. Neural Comput & Applic 35, 18763–18781 (2023). https://doi.org/10.1007/s00521-023-08702-x
