
Method for Improving Gradient Boosting Learning Efficiency Based on Modified Loss Functions

Thematic Issue · Automation and Remote Control

Abstract

We consider a new method for improving the training quality of gradient boosting and increasing its generalization performance based on the use of modified loss functions. Computational experiments show that the method can improve the quality of gradient boosting on various classification and regression problems with real data.
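The full text is paywalled, so the authors' specific loss modification is not reproduced here. As a general illustration of the underlying mechanism (training gradient boosting against a modified loss by supplying its first and second derivatives to the booster), the following sketch plugs a pseudo-Huber loss, a smoothed variant of the squared loss, into XGBoost's custom-objective hook. The smoothing parameter delta and the synthetic data are illustrative assumptions, not taken from the paper.

import numpy as np
import xgboost as xgb
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split

def pseudo_huber_obj(preds, dtrain, delta=1.0):
    # Gradient and Hessian of the pseudo-Huber loss w.r.t. the predictions.
    # delta (a hypothetical parameter, not from the paper) controls where the
    # loss transitions from quadratic to roughly linear behavior.
    residual = preds - dtrain.get_label()
    scale = 1.0 + (residual / delta) ** 2
    grad = residual / np.sqrt(scale)   # first derivative of the loss
    hess = scale ** (-1.5)             # second derivative of the loss
    return grad, hess

# Illustrative synthetic regression data (not the datasets used in the paper).
X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
dtrain = xgb.DMatrix(X_tr, label=y_tr)
dtest = xgb.DMatrix(X_te, label=y_te)

# The modified loss enters training through the custom-objective hook.
booster = xgb.train({"max_depth": 3, "eta": 0.1}, dtrain,
                    num_boost_round=200, obj=pseudo_huber_obj)

rmse = np.sqrt(np.mean((booster.predict(dtest) - y_te) ** 2))
print(f"test RMSE with the modified (pseudo-Huber) loss: {rmse:.3f}")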



Notes

  1. https://drive.google.com/file/d/1EyiNNQ_u0CzQ7qYEdZEeEwFTwjkkv2xL/view.

  2. https://archive.ics.uci.edu/ml/datasets/Arrhythmia.

  3. https://drive.google.com/file/d/1ADa975pas6WPm5SDmPCRF4oPrAyoBkx4/view?usp=sharing.


Author information


Correspondence to N. S. Korolev or O. V. Senko.

Additional information

Translated by V. Potapchouck


About this article


Cite this article

Korolev, N.S., Senko, O.V. Method for Improving Gradient Boosting Learning Efficiency Based on Modified Loss Functions. Autom Remote Control 83, 1935–1943 (2022). https://doi.org/10.1134/S00051179220120074

