A “Top-Down and Prune” Induction Scheme for Constrained Decision Committees

  • Richard Nock
  • Pascal Jappy
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1642)

Abstract

It was previously argued that Decision Tree learning algorithms such as CART or C4.5 can also be used to build small and accurate Decision Lists. In this paper, we investigate the possibility of using a similar “top-down and prune” scheme to induce formulae from a quite different class: Decision Committees. A decision committee contains rules, each of which is a pair (monomial, vector), where the vector’s components are highly constrained with respect to classical polynomials. Each monomial is a condition that, when matched by an instance, returns its vector. Once every monomial has been tested, the sum of the returned vectors is used to make the decision. Decision Trees, Lists, and Committees are complementary formalisms for the user: trees are based on literal ordering, lists are based on monomial ordering, and committees remove any ordering over the tests. Our contribution is a new algorithm, WIDC, which learns with the same “top-down and prune” scheme but builds Decision Committees. Experimental results on twenty-two domains tend to show that WIDC is able to produce small, accurate, and interpretable decision committees.
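
For readers unfamiliar with the formalism, the following is a minimal sketch of how a decision committee classifies an instance, assuming a simple attribute-value encoding. The class names, attribute names, and helper types are illustrative only, and the constraints the paper places on the vectors are not enforced here; this sketches the prediction rule, not the WIDC induction algorithm itself.

    # A decision committee: an unordered set of (monomial, vector) rules.
    # Illustrative sketch only; names and encoding are assumptions, not from the paper.

    from dataclasses import dataclass
    from typing import Dict, List, Tuple

    @dataclass
    class Rule:
        monomial: Dict[str, object]   # conjunction of (attribute, value) literals
        vector: Tuple[float, ...]     # one component per class

        def matches(self, instance: Dict[str, object]) -> bool:
            # The monomial is satisfied iff every literal holds in the instance.
            return all(instance.get(attr) == val for attr, val in self.monomial.items())

    class DecisionCommittee:
        def __init__(self, rules: List[Rule], n_classes: int):
            self.rules = rules
            self.n_classes = n_classes

        def predict(self, instance: Dict[str, object]) -> int:
            # Rules are unordered: every matching rule adds its vector.
            scores = [0.0] * self.n_classes
            for rule in self.rules:
                if rule.matches(instance):
                    scores = [s + c for s, c in zip(scores, rule.vector)]
            # Decide in favour of the class with the largest summed component.
            return max(range(self.n_classes), key=scores.__getitem__)

    # Tiny usage example with two classes and two rules.
    dc = DecisionCommittee(
        rules=[Rule({"outlook": "sunny"}, (1.0, 0.0)),
               Rule({"windy": True}, (0.0, 1.0))],
        n_classes=2,
    )
    print(dc.predict({"outlook": "sunny", "windy": True}))   # both rules fire: tie, class 0
    print(dc.predict({"outlook": "rain", "windy": True}))    # only the second rule fires: class 1

Because no ordering is imposed on the rules, every matching monomial contributes to the decision, in contrast with a decision list, where only the first matching rule fires.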

Keywords

Decision Tree · Decision Table · Induction Scheme · Induction Algorithm · Boost Decision Tree

References

  [1] L. Breiman, J. H. Friedman, R. A. Olshen, and C. J. Stone. Classification and Regression Trees. Wadsworth, 1984.
  [2] C. Blake, E. Keogh, and C. J. Merz. UCI repository of machine learning databases, 1998. http://www.ics.uci.edu/~mlearn/MLRepository.html.
  [3] W. Buntine and T. Niblett. A further comparison of splitting rules for Decision Tree induction. Machine Learning, pages 75–85, 1992.
  [4] P. Clark and T. Niblett. The CN2 induction algorithm. Machine Learning, 3:261–283, 1989.
  [5] P. Domingos. A Process-oriented Heuristic for Model Selection. In Proc. of the 15th ICML, pages 127–135, 1998.
  [6] E. Frank and I. Witten. Using a Permutation Test for Attribute Selection in Decision Trees. In Proc. of the 15th ICML, pages 152–160, 1998.
  [7] R. C. Holte. Very simple classification rules perform well on most commonly used datasets. Machine Learning, pages 63–91, 1993.
  [8] L. Hyafil and R. Rivest. Constructing optimal decision trees is NP-complete. Information Processing Letters, pages 15–17, 1976.
  [9] G. H. John, R. Kohavi, and K. Pfleger. Irrelevant features and the subset selection problem. In Proc. of the 11th ICML, pages 121–129, 1994.
  [10] M. J. Kearns and Y. Mansour. A Fast, Bottom-up Decision Tree Pruning Algorithm with Near-Optimal Generalization. In Proc. of the 15th ICML, 1998.
  [12] R. Kohavi and D. Sommerfield. Targeting Business Users with Decision Table Classifiers. In Proc. of the 4th Intl. Conf. on KDD, 1998.
  [13] T. M. Mitchell. Machine Learning. McGraw-Hill, 1997.
  [14] R. S. Michalski, I. Mozetic, J. Hong, and N. Lavrac. The AQ15 inductive learning system: An overview and experiments. In Proc. of AAAI’86, pages 1041–1045, 1986.
  [15] R. Nock and O. Gascuel. On learning decision committees. In Proc. of the 12th ICML, pages 413–420, 1995.
  [16] R. Nock and P. Jappy. On the power of decision lists. In Proc. of the 15th ICML, pages 413–420, 1998.
  [17] R. Nock and P. Jappy. Decision Tree based induction of Decision Lists. International Journal of Intelligent Data Analysis (accepted), 1999.
  [18] J. R. Quinlan. C4.5: Programs for Machine Learning. Morgan Kaufmann, 1994.
  [19] J. R. Quinlan. Bagging, Boosting and C4.5. In Proc. of AAAI-96, pages 725–730, 1996.
  [20] R. L. Rivest. Learning decision lists. Machine Learning, pages 229–246, 1987.
  [21] R. E. Schapire. The strength of weak learnability. Machine Learning, pages 197–227, 1990.
  [22] R. E. Schapire and Y. Singer. Improved boosting algorithms using confidence-rated predictions. In Proceedings of COLT’98, pages 80–91, 1998.

Copyright information

© Springer-Verlag Berlin Heidelberg 1999

Authors and Affiliations

  • Richard Nock (1)
  • Pascal Jappy (2)
  1. Dept. of Maths and CS, Campus de Fouillole, Univ. des Antilles-Guyane, Pointe-à-Pitre, France
  2. Léonard’s Logic, Paris, France
