
One-against-all-based Hellinger distance decision tree for multiclass imbalanced learning


  • Research Article
  • Published:
Frontiers of Information Technology & Electronic Engineering

Abstract

The skewed class distribution of multiclass data poses a major challenge to machine learning algorithms, since traditional methods are sensitive to skewed distributions and do not account for the characteristics of multiclass imbalance problems. To tackle these issues, we propose a new splitting criterion for the decision tree based on the one-against-all-based Hellinger distance (OAHD). OAHD comprises two crucial elements. First, the one-against-all scheme is integrated into the computation of the Hellinger distance, thereby extending the Hellinger distance decision tree to cope with the multiclass imbalance problem. Second, a modified Gini index is designed for the multiclass imbalance problem, taking into account the distribution and the number of distinct classes. Moreover, we give theoretical proofs of the properties of OAHD, including skew insensitivity and the ability to seek purer nodes in the decision tree. Finally, we collect 20 public real-world imbalanced data sets from the Knowledge Extraction based on Evolutionary Learning (KEEL) repository and the University of California, Irvine (UCI) repository. Experimental and statistical results show that OAHD significantly improves performance compared with five other well-known decision trees in terms of Precision, F-measure, and multiclass area under the receiver operating characteristic curve (MAUC). In the statistical analysis, the Friedman and Nemenyi tests confirm the advantage of OAHD over the five other decision trees.
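As a rough illustration of the one-against-all idea behind the criterion, the sketch below scores a candidate binary split by computing, for each class, the binary Hellinger distance between that class and the union of the remaining classes, and then averaging the per-class scores. This is a minimal sketch under stated assumptions: the averaging step, the function name `hellinger_distance_oaa`, and the treatment of absent classes are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def hellinger_distance_oaa(left_counts, right_counts):
    """One-against-all Hellinger distance for a candidate binary split.

    left_counts / right_counts: per-class sample counts falling into the
    left and right child nodes. For each class c, the split is scored by
    the binary Hellinger distance between the branch distribution of c
    and that of all remaining classes; the per-class scores are averaged
    (an illustrative aggregation, not necessarily the paper's).
    """
    left = np.asarray(left_counts, dtype=float)
    right = np.asarray(right_counts, dtype=float)
    totals = left + right  # per-class totals in the parent node
    scores = []
    for c in range(len(totals)):
        pos_total = totals[c]
        neg_total = totals.sum() - pos_total
        if pos_total == 0 or neg_total == 0:
            continue  # class absent from the node: no one-vs-rest contrast
        # P(branch | class c) over the two branches
        p_left, p_right = left[c] / pos_total, right[c] / pos_total
        # P(branch | rest) over the two branches
        q_left = (left.sum() - left[c]) / neg_total
        q_right = (right.sum() - right[c]) / neg_total
        hd = np.sqrt((np.sqrt(p_left) - np.sqrt(q_left)) ** 2
                     + (np.sqrt(p_right) - np.sqrt(q_right)) ** 2)
        scores.append(hd)
    return float(np.mean(scores)) if scores else 0.0
```

Because the score depends only on the class-conditional branch probabilities, not on the class priors, it is insensitive to how skewed the class counts are, which is the property the Hellinger distance criterion is chosen for; a split that perfectly separates one class from the rest attains the maximum value of sqrt(2).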




Author information


Contributions

Minggang DONG guided the research. Ming LIU designed the research and drafted the paper. Chao JING helped organize the paper. Minggang DONG and Chao JING revised and finalized the paper.

Corresponding author

Correspondence to Chao Jing (敬超).

Ethics declarations

Minggang DONG, Ming LIU, and Chao JING declare that they have no conflict of interest.

Additional information

Project supported by the National Natural Science Foundation of China (Nos. 61802085 and 61563012), the Guangxi Provincial Natural Science Foundation, China (Nos. 2021GXNSFAA220074 and 2020GXNSFAA159038), the Guangxi Key Laboratory of Embedded Technology and Intelligent System Foundation, China (No. 2018A-04), and the Guangxi Key Laboratory of Trusted Software Foundation, China (No. kx202011)


About this article


Cite this article

Dong, M., Liu, M. & Jing, C. One-against-all-based Hellinger distance decision tree for multiclass imbalanced learning. Front Inform Technol Electron Eng 23, 278–290 (2022). https://doi.org/10.1631/FITEE.2000417

