
Possibilistic Induction in Decision-Tree Learning

  • Eyke Hüllermeier
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2430)

Abstract

We propose a generalization of Ockham's razor, a widely applied principle of inductive inference. The generalization aims to capture the uncertainty involved in inductive reasoning. To this end, Ockham's razor is formalized within the framework of possibility theory: it is not used merely to identify a single, apparently optimal model, but rather to conclude on the degree of possibility of various candidate models. This possibilistic version of Ockham's razor is applied to (lazy) decision tree learning.
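
The following is a minimal, purely illustrative sketch (in Python, not taken from the paper) of what a possibilistic reading of Ockham's razor could look like for decision-tree candidates: rather than selecting one simplest consistent tree, each candidate receives a degree of possibility in [0, 1], with simpler and better-fitting trees rated as more possible, and the distribution normalized so that the best candidate is fully possible. The data structure, the scoring formula, and the parameter alpha are assumptions made for this example only.

```python
# Illustrative sketch only -- not the paper's algorithm.
# Assigns a (normalized) degree of possibility to candidate decision trees,
# favouring simpler and more accurate trees, in the spirit of a
# possibilistic Ockham's razor.

from dataclasses import dataclass


@dataclass
class CandidateTree:
    name: str
    num_leaves: int        # proxy for model complexity
    training_error: float  # empirical error on the training data


def possibility_distribution(candidates, alpha=0.1):
    """Map each candidate tree to a degree of possibility in [0, 1].

    Simpler and more accurate trees score higher; the scores are rescaled
    so that at least one candidate is fully possible (maximum equals 1),
    as required of a possibility distribution.
    """
    raw = [
        1.0 / (1.0 + c.training_error + alpha * c.num_leaves)
        for c in candidates
    ]
    top = max(raw)
    return {c.name: r / top for c, r in zip(candidates, raw)}


if __name__ == "__main__":
    trees = [
        CandidateTree("stump", num_leaves=2, training_error=0.20),
        CandidateTree("medium", num_leaves=8, training_error=0.10),
        CandidateTree("deep", num_leaves=40, training_error=0.05),
    ]
    for name, poss in possibility_distribution(trees).items():
        print(f"{name}: possibility {poss:.2f}")
```

In this toy scoring, no single tree is singled out as "the" model; each remains possible to some degree, which is the key difference from the classical, selection-only use of Ockham's razor described in the abstract.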

Keywords

Decision Tree · Leaf Node · Inductive Reasoning · Inductive Inference · Possibility Distribution


Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • Eyke Hüllermeier
    1. Department of Mathematics and Computer Science, University of Marburg, Germany
