A study of a Monte Carlo-optimized neural network for electricity load forecasting

  • Binbin Yong
  • Liang Huang
  • Fucun Li
  • Jun Shen
  • Xin Wang
  • Qingguo Zhou


In this paper, we apply the Monte Carlo neural network (MCNN), a neural network optimized by a Monte Carlo algorithm, to electricity load forecasting. Deep MCNNs with one, two and three hidden layers are also designed. Results demonstrate that, compared with a traditional neural network, the three-layer MCNN improves accuracy by 70.35% for 7-week electricity load forecasting, and the five-layer MCNN improves accuracy by 17.24% for the same 7-week forecast. This shows that the MCNN has great potential for electricity load forecasting.
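The paper itself does not reproduce the MCNN training procedure here, so the following is only an illustrative sketch of the general idea: using a Metropolis-style Monte Carlo search, rather than gradient descent, to optimize the weights of a small feed-forward network on a synthetic daily load curve. All names and hyper-parameters (`monte_carlo_train`, the 4-unit hidden layer, `sigma`, `temp`) are assumptions for illustration, not details taken from the paper.

```python
import math
import random

H = 4  # hidden units (illustrative choice, not from the paper)

def forecast(weights, x):
    # Tiny one-hidden-layer network: 1 input, H tanh hidden units, 1 output.
    w1 = weights[:H]           # input -> hidden weights
    b1 = weights[H:2 * H]      # hidden biases
    w2 = weights[2 * H:3 * H]  # hidden -> output weights
    b2 = weights[3 * H]        # output bias
    hidden = [math.tanh(w1[i] * x + b1[i]) for i in range(H)]
    return sum(w2[i] * hidden[i] for i in range(H)) + b2

def mse(weights, data):
    return sum((forecast(weights, x) - y) ** 2 for x, y in data) / len(data)

def monte_carlo_train(data, steps=20000, sigma=0.1, temp=0.01, seed=0):
    # Metropolis-style Monte Carlo search over the weight vector:
    # perturb one randomly chosen weight, always accept improvements,
    # and accept worsening moves with probability exp(-delta / temp).
    n_weights = 3 * H + 1
    rng = random.Random(seed)
    w = [rng.uniform(-1, 1) for _ in range(n_weights)]
    loss = mse(w, data)
    for _ in range(steps):
        cand = list(w)
        cand[rng.randrange(n_weights)] += rng.gauss(0, sigma)
        cand_loss = mse(cand, data)
        delta = cand_loss - loss
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            w, loss = cand, cand_loss
    return w, loss

if __name__ == "__main__":
    # Synthetic daily load curve: one sinusoidal cycle over normalized time.
    data = [(t / 24, 0.5 + 0.4 * math.sin(2 * math.pi * t / 24))
            for t in range(24)]
    w, loss = monte_carlo_train(data)
    print(f"final MSE: {loss:.4f}")
```

Because only function evaluations are needed, this kind of search sidesteps gradient computation entirely; the trade-off is many more loss evaluations than backpropagation would require.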


Keywords: Monte Carlo neural network · Electricity load forecast · Deep MCNNs



This work was supported by the National Natural Science Foundation of China under Grant Nos. 61402210 and 60973137, the Ministry of Education—China Mobile Research Foundation under Grant No. MCM20170206, a State Grid Corporation Science and Technology Project under Grant No. SGGSKY00FJJS1700302, the Program for New Century Excellent Talents in University under Grant No. NCET-12-0250, the Major National Project of High Resolution Earth Observation System under Grant No. 30-Y20A34-9010-15/17, the Strategic Priority Research Program of the Chinese Academy of Sciences under Grant No. XDA03030100, a Google Research Award, and a Google Faculty Award.

Compliance with ethical standards

Conflict of interest

All authors declare that they have no conflicts of interest regarding the publication of this manuscript.



Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. School of Information Science and Engineering, Lanzhou University, Lanzhou, China
  2. School of Physical Science and Technology, Lanzhou University, Lanzhou, China
  3. School of Computing and Information Technology, University of Wollongong, Wollongong, Australia
