Minimising Decision Tree Size as Combinatorial Optimisation

  • Christian Bessiere
  • Emmanuel Hebrard
  • Barry O’Sullivan
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5732)

Abstract

Decision tree induction techniques attempt to find small trees that fit a training set of data. This preference for smaller trees, which provides a learning bias, is often justified as being consistent with the principle of Occam’s Razor. Informally, this principle states that one should prefer the simpler hypothesis. In this paper we take this principle to the extreme. Specifically, we formulate decision tree induction as a combinatorial optimisation problem in which the objective is to minimise the number of nodes in the tree. We study alternative formulations based on satisfiability, constraint programming, and hybrids with integer linear programming. We empirically compare our approaches against standard induction algorithms, showing that the decision trees we obtain can sometimes be less than half the size of those found by greedy methods. Furthermore, our decision trees are competitive in terms of accuracy on a variety of well-known benchmarks, often being the most accurate. Even when post-pruning of greedy trees is used, our constraint-based approach is never dominated by any of the existing techniques.
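
As a concrete illustration of the formulation, the sketch below treats induction as pure combinatorial search: it iteratively deepens on the node count and enumerates candidate trees over binary features until one fits the training set exactly. This is not the paper's SAT, CP, or ILP encoding, just a minimal brute-force rendering of the same minimisation objective; the toy XOR data set and all function names are illustrative assumptions.

from itertools import product

def classify(tree, example):
    # A tree is ('leaf', label) or ('node', feature, left, right);
    # examples are tuples of binary feature values.
    while tree[0] == 'node':
        _, f, left, right = tree
        tree = left if example[f] == 0 else right
    return tree[1]

def enumerate_trees(size, features):
    # Yield every tree with exactly `size` nodes (leaves count as nodes).
    if size == 1:
        yield ('leaf', 0)
        yield ('leaf', 1)
        return
    for f in features:
        # Distribute the remaining size - 1 nodes over the two subtrees.
        for left_size in range(1, size - 1):
            for left in enumerate_trees(left_size, features):
                for right in enumerate_trees(size - 1 - left_size, features):
                    yield ('node', f, left, right)

def smallest_consistent_tree(data, n_features, max_size=15):
    # Iterative deepening on tree size: the first tree found that
    # classifies every (example, label) pair correctly is minimal.
    for size in range(1, max_size + 1):
        for tree in enumerate_trees(size, range(n_features)):
            if all(classify(tree, x) == y for x, y in data):
                return size, tree
    return None

if __name__ == '__main__':
    # XOR of two binary features: the smallest consistent tree
    # needs 7 nodes (3 internal decisions plus 4 leaves).
    data = [((a, b), a ^ b) for a, b in product((0, 1), repeat=2)]
    print(smallest_consistent_tree(data, n_features=2))

Enumeration of this kind scales only to toy instances; the point of the paper is to hand the same minimisation objective to satisfiability, constraint programming, and hybrid integer linear programming solvers, which prune the search far more effectively.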

Keywords

Decision Tree · Small Tree · Constraint Programming · Decision Tree Induction · Constraint Programming Model

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Christian Bessiere (1)
  • Emmanuel Hebrard (2)
  • Barry O’Sullivan (2)
  1. LIRMM, Montpellier, France
  2. Cork Constraint Computation Centre, Department of Computer Science, University College Cork, Ireland
