Machine Learning, Volume 4, Issue 2, pp 227–243

An Empirical Comparison of Pruning Methods for Decision Tree Induction

  • John Mingers

DOI: 10.1023/A:1022604100933

Cite this article as:
Mingers, J. Machine Learning (1989) 4: 227. doi:10.1023/A:1022604100933

Abstract

This paper compares five methods for pruning decision trees that have been developed from sets of examples. When used with uncertain rather than deterministic data, decision-tree induction involves three main stages—creating a complete tree able to classify all the training examples, pruning this tree to give statistical reliability, and processing the pruned tree to improve understandability. This paper concerns the second stage—pruning. It presents empirical comparisons of the five methods across several domains. The results show that three methods—critical value, error complexity and reduced error—perform well, while the other two may cause problems. They also show that there is no significant interaction between the creation and pruning methods.
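For readers unfamiliar with the techniques under comparison, the sketch below illustrates reduced error pruning, one of the three methods the abstract reports as performing well. It is a generic illustration rather than code from the paper; the Node representation, field names, and helper functions are assumptions made for this example. The pruning set is a portion of the data held back from tree construction.

```python
# A minimal sketch of reduced error pruning, one of the five pruning methods
# compared in the paper. The tree representation and helper names below are
# illustrative assumptions, not the paper's own notation or code.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class Node:
    feature: Optional[int] = None                 # index of the attribute tested here
    children: dict = field(default_factory=dict)  # attribute value -> child Node
    label: Optional[str] = None                   # class label if this is a leaf
    majority: Optional[str] = None                # majority class of training examples at the node


def classify(node: Node, example) -> Optional[str]:
    """Follow attribute tests until a leaf is reached; fall back to the
    node's majority class when an unseen attribute value is encountered."""
    while node.label is None:
        child = node.children.get(example[node.feature])
        if child is None:
            return node.majority
        node = child
    return node.label


def reduced_error_prune(node: Node, pruning_set) -> Node:
    """Bottom-up pruning: replace a subtree by a leaf labelled with its
    majority class whenever that does not increase the number of errors
    on the held-out pruning examples that reach this node."""
    if node.label is not None or not pruning_set:
        return node
    # Prune each child first, passing down only the examples that reach it.
    for value, child in list(node.children.items()):
        subset = [(x, y) for x, y in pruning_set if x[node.feature] == value]
        node.children[value] = reduced_error_prune(child, subset)
    errors_as_subtree = sum(1 for x, y in pruning_set if classify(node, x) != y)
    errors_as_leaf = sum(1 for _, y in pruning_set if y != node.majority)
    if errors_as_leaf <= errors_as_subtree:
        return Node(label=node.majority, majority=node.majority)
    return node
```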

Keywords: Decision trees; Knowledge acquisition; Uncertain data; Pruning

Copyright information

© Kluwer Academic Publishers 1989

Authors and Affiliations

  • John Mingers
  1. School of Industrial and Business Studies, University of Warwick, Coventry, England
