
A Systematic Analysis for Energy Performance Predictions in Residential Buildings Using Ensemble Learning

  • Research Article-Computer Engineering and Computer Science
  • Published:
Arabian Journal for Science and Engineering

Abstract

Energy is a precious resource that must be used mindfully so that efficiency is achieved and wastage is curbed. Globally, multi-storeyed buildings are the biggest energy consumers. A large portion of the energy used within a building goes towards maintaining the temperature desired for occupant comfort, which requires meeting the building's heating load and cooling load. Minimizing these loads reduces energy consumption and optimizes energy usage, and certain characteristics of a building strongly influence them. This paper presents a systematic approach for analysing the building factors that play a vital role in energy consumption, followed by traditional machine learning and modern ensemble learning approaches for predicting energy consumption in residential buildings. The results reveal that ensemble techniques outperform traditional machine learning techniques by an appreciable margin. The accuracy of predicting heating load and cooling load, respectively, was 88.59% and 85.26% with multiple linear regression, 82.38% and 89.32% with support vector regression, and 91.91% and 94.47% with K-nearest neighbours. The accuracy achieved with ensemble techniques was comparatively better: 99.74% and 94.79% with random forests, 99.73% and 96.22% with gradient boosting machines, and 99.75% and 95.94% with extreme gradient boosting.
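Below is a minimal sketch of the kind of model comparison the abstract describes, assuming the UCI Energy Efficiency dataset (Y1 = heating load, Y2 = cooling load) and off-the-shelf scikit-learn and xgboost estimators. The file name, column labels, train/test split and hyperparameters are illustrative assumptions, not the authors' exact experimental setup.

```python
# Sketch: compare traditional regressors with ensemble regressors on the
# UCI Energy Efficiency dataset (X1..X8 building characteristics, Y1/Y2 loads).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.neighbors import KNeighborsRegressor
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.metrics import r2_score
from xgboost import XGBRegressor

# Hypothetical local copy of the UCI "Energy efficiency" data file.
data = pd.read_excel("ENB2012_data.xlsx")
X = data[[f"X{i}" for i in range(1, 9)]]

models = {
    "Multiple linear regression": LinearRegression(),
    "Support vector regression": SVR(),
    "K-nearest neighbours": KNeighborsRegressor(n_neighbors=5),
    "Random forest": RandomForestRegressor(n_estimators=200, random_state=0),
    "Gradient boosting": GradientBoostingRegressor(random_state=0),
    "XGBoost": XGBRegressor(n_estimators=200, random_state=0),
}

for target in ("Y1", "Y2"):  # heating load, cooling load
    y = data[target]
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        print(f"{target} {name}: R^2 = {r2_score(y_te, model.predict(X_te)):.3f}")
```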


Data Availability

Yes, data are available.

Code Availability

Yes, code is available.


Author information

Contributions

Mrinal Pandey and Monika Goyal conducted the research and analysed the data. Monika Goyal performed the literature survey and the experiments. Statistical analysis was carried out by Mrinal Pandey. The article was written by Mrinal Pandey and Monika Goyal.

Corresponding author

Correspondence to Mrinal Pandey.

Appendix

1.1 Sample Calculations for Model Evaluation

The sample calculations using the formulae in Eqs. 9–13 are described here. Table 6 contains the observed values of the response variables Y1 and Y2 from the dataset together with the values predicted by the KNN algorithm. The model evaluation based on the values in Table 6 has been performed manually on sample sizes of 20 and 100, taken from records 1–20 and 1–100, respectively.

Table 6 Sample dataset showing all predictor values and predicted values using KNN

Referring to Eqs. 9–13 and applying the formulae to the observed values and the values predicted using KNN, the calculated results for Y1 on the first 20 records are

$$ \begin{aligned} & {\text{RMSE}} = 12.01 \\ & {\text{MSE}} = 144.29 \\ & {\text{MAE}} = 10.2 \\ & R\;{\text{Squared}} = - 5.2 \\ & {\text{Accuracy}} = 43.82\% \\ \end{aligned} $$

Referring to Eqs. 9–13 and applying the formulae to the observed values and the values predicted using KNN, the calculated results for Y1 on the first 100 records are

$$ \begin{aligned} & {\text{RMSE}} = 14.02 \\ & {\text{MSE}} = 196.6 \\ & {\text{MAE}} = 10.9 \\ & R\;{\text{Squared}} = - 1.55 \\ & {\text{Accuracy}} = 48.46\% \\ \end{aligned} $$
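The same quantities can be reproduced programmatically. The sketch below assumes Eqs. 9–13 correspond to RMSE, MSE, MAE, R-squared and a percentage accuracy; since the equations are not reproduced in this appendix, the accuracy definition used here (based on the mean absolute error relative to the mean observed value) is an assumption for illustration only.

```python
# Sketch: evaluation metrics for observed vs. predicted load values.
import numpy as np

def evaluate(observed, predicted):
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    err = observed - predicted
    mse = np.mean(err ** 2)                 # mean squared error
    rmse = np.sqrt(mse)                     # root mean squared error
    mae = np.mean(np.abs(err))              # mean absolute error
    ss_res = np.sum(err ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot              # can be negative on small samples
    # Assumed accuracy definition; the paper's Eq. for accuracy may differ.
    accuracy = 100.0 * (1.0 - mae / observed.mean())
    return {"RMSE": rmse, "MSE": mse, "MAE": mae, "R2": r2, "Accuracy": accuracy}

# Example usage with the first 20 Y1 rows of Table 6 (values not shown here):
# print(evaluate(y1_observed[:20], y1_knn_predicted[:20]))
```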

The sample calculations using the formulae in Eqs. 9–13 are likewise described for XGBoost. Table 7 contains the observed values of the response variables Y1 and Y2 from the dataset together with the values predicted by the XGBoost algorithm. The model evaluation based on the values in Table 7 has been performed manually on sample sizes of 20 and 100, taken from records 1–20 and 1–100, respectively.

Table 7 Sample dataset showing all predictor values and predicted values using XGBoost

Applying the formulae to the values predicted using XGBoost, the calculated results for Y1 on the first 20 records are

$$ \begin{aligned} & {\text{RMSE}} = 11.98 \\ & {\text{MSE}} = 143.59 \\ & {\text{MAE}} = 9.82 \\ & R\;{\text{Squared}} = - 5.17 \\ & {\text{Accuracy}} = 40.68\% \\ \end{aligned} $$

Applying the formulae to the values predicted using XGBoost, the calculated results for Y1 on the first 100 records are

$$ \begin{aligned} & {\text{RMSE}} = 14.25 \\ & {\text{MSE}} = 203.1 \\ & {\text{MAE}} = 11.13 \\ & R\;{\text{Squared}} = - 1.63 \\ & {\text{Accuracy}} = 49.57\% \\ \end{aligned} $$

Cite this article

Goyal, M., Pandey, M. A Systematic Analysis for Energy Performance Predictions in Residential Buildings Using Ensemble Learning. Arab J Sci Eng 46, 3155–3168 (2021). https://doi.org/10.1007/s13369-020-05069-2
