Class-Oriented Reduction of Decision Tree Complexity
In some classification problems, apart from a good model, we may be interested in obtaining succinct explanations for particular classes. Our goal is to provide simpler classification models for these classes without a significant loss of accuracy. In this paper, we propose modifications to the splitting criteria and the pruning heuristics used by standard top-down decision tree induction algorithms. These modifications take the importance of each particular class into account and lead to simpler models for the most important classes while, at the same time, preserving the overall accuracy of the classifier.
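As a rough illustration of the kind of class-weighted splitting criterion described above, the sketch below scales each class's contribution to the entropy by an importance weight, so that splits which purify highly weighted classes score a larger gain. This is a minimal sketch under assumed names (`weighted_entropy`, `weighted_information_gain`, a `class_weights` dictionary), not the authors' exact formulation.

```python
from collections import Counter
import math

def weighted_entropy(labels, class_weights):
    """Entropy in which each class's frequency is scaled by its
    importance weight before normalizing (assumed formulation)."""
    counts = Counter(labels)
    total = sum(class_weights.get(c, 1.0) * n for c, n in counts.items())
    h = 0.0
    for c, n in counts.items():
        p = class_weights.get(c, 1.0) * n / total
        if p > 0:
            h -= p * math.log2(p)
    return h

def weighted_information_gain(parent, splits, class_weights):
    """Gain of a candidate split under the class-weighted entropy:
    parent impurity minus the size-weighted impurity of the children."""
    n = len(parent)
    children = sum(len(s) / n * weighted_entropy(s, class_weights)
                   for s in splits)
    return weighted_entropy(parent, class_weights) - children
```

With uniform weights this reduces to the standard information gain; raising the weight of an important class skews the criterion toward splits that isolate that class, which is the effect the paper aims for.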
Keywords: Class Weight · Tree Pruning · Pruning Strategy · Split Criterion · Accuracy Loss