
Boosting Based Multiple Kernel Learning and Transfer Regression for Electricity Load Forecasting

  • Di Wu
  • Boyu Wang
  • Doina Precup
  • Benoit Boulet
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10536)

Abstract

Accurate electricity load forecasting is of crucial importance for power system operation and smart grid energy management. Different factors, such as weather conditions, lagged values, and day types, may affect electricity load consumption. We propose to use multiple kernel learning (MKL) for electricity load forecasting, as it provides greater flexibility than traditional kernel methods. Computation time is an important issue for short-term load forecasting, especially for energy scheduling demand. However, conventional MKL methods usually lead to complicated optimization problems. Another practical aspect of this application is that there may be very little data available to train a reliable forecasting model for a new building, while at the same time we may have prior knowledge learned from other buildings. In this paper, we propose a boosting-based framework for MKL regression to deal with the aforementioned issues for short-term load forecasting. In particular, we first adopt boosting to learn an ensemble of multiple kernel regressors, and then extend this framework to the context of transfer learning. Experimental results on residential data sets show the effectiveness of the proposed algorithms.
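The core idea of the abstract, boosting an ensemble of kernel regressors drawn from a pool of candidate kernels, can be illustrated with a minimal sketch. This is not the authors' exact algorithm; it assumes a greedy gradient-boosting loop in which, at each round, a kernel ridge regressor is fitted to the current residuals for each candidate RBF bandwidth, the best-fitting one is kept, and its prediction is added with a shrinkage factor. The function names (`boosted_mkl_regression`, `predict_ensemble`) and the candidate `gammas` are illustrative choices, not from the paper.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge


def boosted_mkl_regression(X, y, n_rounds=20, shrinkage=0.1,
                           gammas=(0.1, 1.0, 10.0)):
    """Sketch of boosting-based multiple kernel regression.

    At each round, fit one weak kernel ridge regressor per candidate
    RBF bandwidth to the current residuals, keep the one with the
    lowest training error, and add it to the ensemble with shrinkage.
    """
    residual = y.astype(float).copy()
    ensemble = []  # list of (fitted weak learner, weight)
    for _ in range(n_rounds):
        best_model, best_err = None, np.inf
        for gamma in gammas:
            model = KernelRidge(kernel="rbf", gamma=gamma, alpha=1.0)
            model.fit(X, residual)
            err = np.mean((residual - model.predict(X)) ** 2)
            if err < best_err:
                best_model, best_err = model, err
        ensemble.append((best_model, shrinkage))
        # Subtract the shrunken prediction, as in gradient boosting.
        residual -= shrinkage * best_model.predict(X)
    return ensemble


def predict_ensemble(ensemble, X):
    """Sum the weighted predictions of all weak learners."""
    return sum(w * m.predict(X) for m, w in ensemble)
```

A usage example on synthetic daily-cycle data: generate hour-of-day features with a sinusoidal load pattern, fit the ensemble, and verify that the fit explains most of the variance. The transfer-learning extension mentioned in the abstract would additionally reuse weak learners trained on source buildings, which this sketch omits.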

Keywords

Electricity load forecasting · Boosting · Multiple kernel learning · Transfer learning


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. McGill University, Montreal, Canada
  2. Princeton University, Princeton, USA
