Cost-Sensitive Decision Trees with Multiple Cost Scales

  • Zhenxing Qin
  • Shichao Zhang
  • Chengqi Zhang
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3339)


Minimizing misclassification errors has been the main focus of inductive learning techniques such as CART and C4.5. However, misclassification error is not the only cost in a classification problem. Recently, researchers have begun to consider both test costs and misclassification costs. Previous work assumes that the test cost and the misclassification cost are defined on the same cost scale. In practice, however, it can be difficult to define multiple costs on a single scale. In this paper, we address this problem by building a cost-sensitive decision tree that involves two kinds of cost scales: it minimizes one kind of cost while keeping the other within a given budget. Our work is useful for many diagnostic tasks that involve minimizing a target cost while controlling the resources consumed to obtain missing information.
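The idea of minimizing one cost scale while capping the other can be sketched as a greedy, budget-constrained attribute selection step. This is only an illustrative sketch, not the authors' actual tree-building algorithm: the function name, the gain criterion, and the example costs are all assumptions.

```python
# Hypothetical sketch: pick the attribute (test) that most reduces expected
# misclassification cost, considering only tests whose resource cost still
# fits within the remaining budget. Not the paper's exact criterion.

def choose_attribute(attributes, misclass_cost_reduction, resource_cost,
                     remaining_budget):
    best_attr, best_gain = None, 0.0
    for a in attributes:
        if resource_cost[a] > remaining_budget:
            continue  # performing this test would exceed the resource budget
        gain = misclass_cost_reduction[a]
        if gain > best_gain:
            best_attr, best_gain = a, gain
    return best_attr

# Example with two cost scales: misclassification cost in dollars,
# resource cost in minutes of lab time (all values invented).
attrs = ["blood_test", "x_ray", "biopsy"]
reduction = {"blood_test": 40.0, "x_ray": 55.0, "biopsy": 90.0}
minutes = {"blood_test": 10, "x_ray": 20, "biopsy": 120}

print(choose_attribute(attrs, reduction, minutes, remaining_budget=30))
```

Here the biopsy would reduce misclassification cost the most, but its resource cost exceeds the 30-minute budget, so the x-ray is chosen instead. This mirrors the paper's setting, where the two cost scales are never merged into one number.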


Keywords: Decision Tree, Resource Consumption, Resource Cost, Test Cost, Tree Building





Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Zhenxing Qin (1)
  • Shichao Zhang (1)
  • Chengqi Zhang (1)

  1. Faculty of Information Technology, University of Technology, Sydney, Sydney, Australia
