
Bloomy Decision Tree for Multi-objective Classification

  • Einoshin Suzuki
  • Masafumi Gotoh
  • Yuta Choki
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2168)

Abstract

This paper presents a novel decision-tree induction method for a multi-objective data set, i.e. a data set with a multi-dimensional class. Inductive decision-tree learning is one of the most frequently used methods for a single-objective data set, i.e. a data set with a single-dimensional class. In real data analysis, however, we usually have multiple objectives, and a classifier that explains them simultaneously would be useful and would exhibit higher readability. A conventional decision-tree inducer requires the multi-dimensional class to be transformed into a single-dimensional class, but such a transformation can considerably worsen both accuracy and readability. To circumvent this problem, we propose the bloomy decision tree, which deals with a multi-dimensional class without such transformations. A bloomy decision tree has a set of split nodes, each of which splits examples according to their attribute values, and a set of flower nodes, each of which predicts one class dimension of the examples. A flower node appears not only at the fringe of the tree but also inside it. Our pruning is executed during tree construction and evaluates each class dimension based on Cramér’s V. The proposed method has been implemented as D3-B (Decision tree in Bloom) and tested on eleven data sets. The experiments showed that D3-B achieves higher accuracy than C4.5 on nine data sets and ties with it on the other two. In terms of readability, D3-B produces fewer split nodes on all data sets and thus outperforms C4.5.
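
To make the tree structure and the pruning criterion concrete, the following is a minimal sketch, not the authors' D3-B implementation: hypothetical SplitNode and FlowerNode classes mirroring the description above, together with a cramers_v function of the kind the pruning step evaluates for each class dimension. The names and the node representation are illustrative assumptions, since the abstract does not give implementation details.

from collections import Counter
import math


def cramers_v(xs, ys):
    """Cramér's V between two categorical sequences of equal length."""
    n = len(xs)
    x_counts = Counter(xs)
    y_counts = Counter(ys)
    joint = Counter(zip(xs, ys))
    chi2 = 0.0
    for x, cx in x_counts.items():
        for y, cy in y_counts.items():
            expected = cx * cy / n
            observed = joint.get((x, y), 0)
            chi2 += (observed - expected) ** 2 / expected
    k = min(len(x_counts), len(y_counts))
    return 0.0 if k <= 1 else math.sqrt(chi2 / (n * (k - 1)))


class FlowerNode:
    """Predicts a single class dimension (hypothetical representation)."""
    def __init__(self, dimension, prediction):
        self.dimension = dimension    # which class dimension this flower covers
        self.prediction = prediction  # e.g. the majority value among the local examples


class SplitNode:
    """Splits examples on one attribute; flowers may bloom at internal nodes, not only at the fringe."""
    def __init__(self, attribute):
        self.attribute = attribute
        self.flowers = []   # FlowerNode objects attached to this internal node
        self.children = {}  # attribute value -> child SplitNode or FlowerNode


# Toy usage: no association gives V = 0, a perfect association gives V = 1.
print(cramers_v(["a", "a", "b", "b"], ["x", "y", "x", "y"]))  # 0.0
print(cramers_v(["a", "a", "b", "b"], ["x", "x", "y", "y"]))  # 1.0

Cramér’s V ranges from 0 (no association) to 1 (perfect association), which makes it a scale-free score for judging how strongly the examples at a node determine each class dimension.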

Keywords

Decision Tree, Leaf Node, Child Node, Class Dimension, Gain Ratio

References

  1. Blake, C., Keogh, E., and Merz, C. J.: UCI Repository of Machine Learning Databases, http://www.ics.uci.edu/~mlearn/MLRepository.html, Univ. California, Irvine, Dept. Information and Computer Science (1998).
  2. Breiman, L., Friedman, J., Olshen, R., and Stone, C. A.: Classification and Regression Trees, Chapman & Hall, New York (1984).
  3. Caruana, R.: “Multitask Learning”, Machine Learning, Vol. 28, No. 1, pp. 41–75 (1997).
  4. Caruana, R.: “Multitask Learning”, Ph. D. Thesis, CMU-CS-97-203, School of Computer Science, Carnegie Mellon Univ., Pittsburgh, Pa. (1997).
  5. Dougherty, J., Kohavi, R., and Sahami, M.: “Supervised and Unsupervised Discretization of Continuous Features”, Proc. Twelfth Int’l Conf. on Machine Learning (ICML), pp. 194–202 (1995).
  6. Forouraghi, B., Schmerr, L. W., and Prabhu, G. M.: “Induction of Multivariate Regression Trees for Design Optimization”, Proc. Twelfth Nat’l Conf. on Artificial Intelligence (AAAI), pp. 607–612 (1994).
  7. Kendall, M. G.: Multivariate Analysis, second edition, Charles Griffin, High Wycombe, England (1980).
  8. Mingers, J.: “An Empirical Comparison of Pruning Methods for Decision-Tree Induction”, Machine Learning, Vol. 4, No. 2, pp. 227–243 (1989).
  9. Murthy, S. K.: “Automatic Construction of Decision Trees from Data: A Multi-Disciplinary Survey”, Data Mining and Knowledge Discovery, Vol. 2, No. 4, pp. 345–389 (1998).
  10. Quinlan, J. R.: “Induction of Decision Trees”, Machine Learning, Vol. 1, No. 1, pp. 81–106 (1986).
  11. Quinlan, J. R.: C4.5: Programs for Machine Learning, Morgan Kaufmann, San Mateo, Calif. (1993).

Copyright information

© Springer-Verlag Berlin Heidelberg 2001

Authors and Affiliations

  • Einoshin Suzuki¹
  • Masafumi Gotoh¹
  • Yuta Choki¹
  1. Division of Electrical and Computer Engineering, Yokohama National University, Japan
