
Data Mining and Knowledge Discovery, Volume 3, Issue 2, pp. 197–217

Partitioning Nominal Attributes in Decision Trees

  • Don Coppersmith
  • Se June Hong
  • Jonathan R.M. Hosking

Abstract

To find the optimal branching of a nominal attribute at a node in an L-ary decision tree, one is often forced to search over all possible L-ary partitions for the one that yields the minimum impurity measure. For binary trees (L = 2) with just two classes, a short-cut search is possible that is linear in n, the number of distinct values of the attribute. For the general case, in which the number of classes, k, may be greater than two, Burshtein et al. have shown that the optimal partition satisfies a condition involving the existence of L(L−1)/2 hyperplanes in the class probability space. We derive a property of the optimal partition for concave impurity measures (including, in particular, the Gini and entropy impurity measures) in terms of the existence of L vectors in the dual of the class probability space, which implies the earlier condition.
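The two-class shortcut deserves a concrete illustration. By the theorem of Breiman et al. (1984), once the attribute's values are sorted by their class-1 probability, some optimal binary split is contiguous in that ordering, so only n − 1 candidate splits need to be checked rather than all 2^(n−1) − 1 binary partitions. The sketch below implements this for the Gini impurity; the function name and the count-matrix input format are our own illustrative choices, not the paper's notation.

```python
import numpy as np

def best_binary_split_two_class(counts):
    """Minimum-Gini binary split of a nominal attribute when k = 2.

    counts: (n, 2) array; counts[i, j] = number of training records
    with attribute value i and class j. Every value is assumed to
    occur at least once.

    Returns (weighted_impurity, left_values), where left_values is the
    set of attribute-value indices sent to one branch.
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.shape[0]
    # Sort values by class-1 probability; an optimal split is
    # contiguous in this order (Breiman et al., 1984).
    order = np.argsort(counts[:, 1] / counts.sum(axis=1))

    def gini(c):
        # Gini impurity of a branch, weighted by its record count.
        t = c.sum()
        return t * (1.0 - ((c / t) ** 2).sum()) if t > 0 else 0.0

    best, best_left = np.inf, None
    left, right = np.zeros(2), counts.sum(axis=0)
    for m in range(n - 1):  # only the n - 1 contiguous splits
        left = left + counts[order[m]]
        right = right - counts[order[m]]
        imp = gini(left) + gini(right)
        if imp < best:
            best, best_left = imp, set(order[: m + 1].tolist())
    return best, best_left
```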

Unfortunately, these insights still do not offer a practical search method when n and k are large, even for binary trees. We therefore present a new heuristic search algorithm to find a good partition. It is based on ordering the attribute's values according to their principal component scores in the class probability space, and is linear in n. We demonstrate the effectiveness of the new method through Monte Carlo simulation experiments and compare its performance against other heuristic methods.
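As one concrete reading of that heuristic (the exact weighting and impurity choices here are our assumptions, not necessarily the authors' precise recipe), the sketch below forms each attribute value's class-probability vector, takes the leading principal component of the case-weighted covariance of those vectors, sorts the values by their scores along it, and then scans the n − 1 contiguous binary splits under the Gini criterion.

```python
import numpy as np

def pca_ordered_split(counts):
    """Heuristic binary split for k classes via a principal-component
    ordering of the attribute's values in class-probability space.
    A sketch of the idea only; the weighting of the covariance matrix
    and the Gini criterion are illustrative assumptions.

    counts: (n, k) array of value-by-class record counts.
    """
    counts = np.asarray(counts, dtype=float)
    n, k = counts.shape
    weights = counts.sum(axis=1)               # records per attribute value
    probs = counts / weights[:, None]          # class-probability vectors
    centered = probs - np.average(probs, axis=0, weights=weights)
    # Case-weighted covariance of the class-probability vectors.
    cov = (centered * weights[:, None]).T @ centered / weights.sum()
    pc1 = np.linalg.eigh(cov)[1][:, -1]        # leading principal component
    order = np.argsort(centered @ pc1)         # sort by PC scores

    def gini(c):
        t = c.sum()
        return t * (1.0 - ((c / t) ** 2).sum()) if t > 0 else 0.0

    best, best_left = np.inf, None
    left, right = np.zeros(k), counts.sum(axis=0)
    for m in range(n - 1):                     # scan contiguous splits
        left = left + counts[order[m]]
        right = right - counts[order[m]]
        imp = gini(left) + gini(right)
        if imp < best:
            best, best_left = imp, set(order[: m + 1].tolist())
    return best, best_left
```

Because the scan over contiguous splits mirrors the two-class shortcut, the procedure stays linear in n apart from the k-by-k eigendecomposition.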

Keywords: binary decision tree; classification; data mining; entropy; Gini index; impurity; optimal splitting


References

  1. Breiman, L., Friedman, J.H., Olshen, R.A., and Stone, C.J. 1984. Classification and Regression Trees. Monterey, CA: Wadsworth International Group.
  2. Burshtein, D., Della Pietra, V., Kanevsky, D., and Nadas, A. 1989. A splitting theorem for tree construction. Research Report RC 14754, IBM Research Division, Yorktown Heights, NY.
  3. Burshtein, D., Della Pietra, V., Kanevsky, D., and Nadas, A. 1992. Minimum impurity partitions. Ann. Statist., 20:1637–1646.
  4. Chou, P.A. 1988. Application of information theory to pattern recognition and the design of decision trees and trellises. Ph.D. dissertation, Stanford Univ., Stanford, CA.
  5. Chou, P.A. 1991. Optimal partitioning for classification and regression trees. IEEE Trans. PAMI, 13:340–354.
  6. Cover, T.M. 1965. Geometrical and statistical properties of systems of linear inequalities with applications in pattern recognition. IEEE Trans. Elec. Comput., EC-14:326–334.
  7. Mehta, M., Agrawal, R., and Rissanen, J. 1996. SLIQ: A fast scalable classifier for data mining. Proc. 5th Int. Conf. on Extending Database Technology, Avignon, France.
  8. Nadas, A., Nahamoo, D., Picheny, M.A., and Powell, J. 1991. An iterative flip-flop approximation of the most informative split in the construction of decision trees. Proc. ICASSP-91, pp. 565–568.
  9. NASA. 1992. Introduction to IND Version 2.1, GA23-2475-02 edition. NASA Ames Research Center.
  10. Palmer, W.C. 1965. Meteorological drought. Research Paper 45, Weather Bureau, Washington, DC.
  11. Quinlan, J.R. 1993. C4.5: Programs for Machine Learning. San Francisco, CA: Morgan Kaufmann.
  12. Teqnical Services Inc. 1996. National Electronic Drought Atlas (CD-ROM). New London, CT: Teqnical Services Inc.
  13. Willeke, G.E., Hosking, J.R.M., Wallis, J.R., and Guttman, N.B. 1995. The National Drought Atlas (draft). IWR Report 94-NDS-4, US Army Corps of Engineers, Fort Belvoir, VA.

Copyright information

© Kluwer Academic Publishers 1999

Authors and Affiliations

  • Don Coppersmith (1)
  • Se June Hong (1)
  • Jonathan R.M. Hosking (1)

  1. IBM Research Division, T.J. Watson Research Center, Yorktown Heights, NY, USA
