Finding Optimal Decision Trees

  • Petr Máša
  • Tomáš Kočka
Part of the Advances in Soft Computing book series (AINSC, volume 35)

Abstract

This paper presents a new algorithm that finds the generative model of a decision tree from data. We show that, for infinite data and a finite number of attributes, the algorithm always recovers the generative model (i.e. the decision-tree model from which the data were generated), except for a measure-zero set of distributions. The algorithm returns reasonable results even when these assumptions are not satisfied. It is polynomial in the number of leaves of the generative model, compared to the exponential complexity of the trivial exhaustive-search algorithm. A similar result was recently obtained for learning Bayesian networks from data [1], [2]. An experimental comparison of the new algorithm with the standard CART algorithm on both simulated and real data is presented; the new algorithm shows significant improvements over CART in both cases. For simplicity, the whole paper is restricted to binary variables, but the results can easily be generalized.
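The abstract does not describe the new algorithm itself, so the sketch below only illustrates the baseline it is compared against: greedy, impurity-driven splitting of the kind CART performs on binary variables. The function names (gini, best_split, grow) are illustrative assumptions, not the authors' code or the proposed method.

```python
# Minimal sketch of CART-style greedy induction on binary attributes.
# This is the baseline compared against in the paper, NOT the paper's algorithm.
from collections import Counter

def gini(labels):
    """Gini impurity of a list of 0/1 class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2.0 * p * (1.0 - p)

def best_split(rows, labels, used):
    """Pick the unused binary attribute whose split most reduces impurity."""
    best_attr, best_score = None, gini(labels)
    for a in range(len(rows[0])):
        if a in used:
            continue
        left  = [y for x, y in zip(rows, labels) if x[a] == 0]
        right = [y for x, y in zip(rows, labels) if x[a] == 1]
        if not left or not right:
            continue
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if score < best_score:
            best_attr, best_score = a, score
    return best_attr

def grow(rows, labels, used=frozenset()):
    """Recursively grow a tree; leaves store the majority class."""
    attr = best_split(rows, labels, used)
    if attr is None:
        return Counter(labels).most_common(1)[0][0]
    branches = {}
    for v in (0, 1):
        sub = [(x, y) for x, y in zip(rows, labels) if x[attr] == v]
        branches[v] = grow([x for x, _ in sub], [y for _, y in sub], used | {attr})
    return (attr, branches)
```

Such a greedy learner can miss generative structure in which no single attribute reduces impurity on its own (e.g. a class that is the XOR of two attributes), which is one well-known reason a locally optimal tree need not match the generative model.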

References

  1. Chickering, M.: Learning Equivalence Classes of Bayesian-Network Structures. Journal of Machine Learning Research 2 (2002), pp. 445–498.
  2. Chickering, M., Meek, Ch.: Finding Optimal Bayesian Networks. In: Proceedings of the Eighteenth Conference on Uncertainty in Artificial Intelligence, Edmonton, AB (2002), pp. 94–102.
  3. Breiman, L. et al.: Classification and Regression Trees. Wadsworth International Group (1984).
  4. Utgoff, P.E., Berkman, N.C., Clouse, J.A.: Decision Tree Induction Based on Efficient Tree Restructuring. Machine Learning 29 (1997), pp. 5–44.
  5. Utgoff, P.E.: Decision Tree Induction Based on Efficient Tree Restructuring. Technical Report 95-18, University of Massachusetts, Department of Computer Science, Amherst, MA (1996).
  6. Quinlan, J.R.: Simplifying Decision Trees. International Journal of Man-Machine Studies 27 (1987), pp. 221–234.
  7. Wikipedia contributors: Occam's Razor. Retrieved from http://en.wikipedia.org/wiki/Occam's_razor on January 8, 2006.
  8. Pfahringer, B.: Inducing Small and Accurate Decision Trees. Technical Report, Oesterreichisches Forschungsinstitut fuer Artificial Intelligence, Wien (1998).
  9. Esposito, F., Malerba, D., Semeraro, G.: A Comparative Analysis of Methods for Pruning Decision Trees. IEEE Transactions on Pattern Analysis and Machine Intelligence 19(5) (1997), pp. 476–491.

Copyright information

© Springer 2006

Authors and Affiliations

  • Petr Máša (1)
  • Tomáš Kočka (1)

  1. Faculty of Informatics and Statistics, University of Economics, Prague
