Encyclopedia of Machine Learning and Data Mining

2017 Edition
| Editors: Claude Sammut, Geoffrey I. Webb

Decision Tree

  • Johannes Fürnkranz
Reference work entry
DOI: https://doi.org/10.1007/978-1-4899-7687-1_66


The induction of decision trees is one of the oldest and most popular techniques for learning discriminative models, and it was developed independently in the statistical (Breiman et al. 1984; Kass 1980) and machine learning (Hunt et al. 1966; Quinlan 1983, 1986) communities. A decision tree is a tree-structured classification model that is easy to understand, even by non-expert users, and can be induced efficiently from data. An extensive survey of decision-tree learning can be found in Murthy (1998).
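To make the idea concrete, the following is a minimal sketch of ID3-style top-down induction (in the spirit of Quinlan 1986): the tree is grown by recursive partitioning, at each node choosing the attribute whose split minimizes the weighted entropy of the class distribution (i.e., maximizes information gain). All function names and the toy weather data are illustrative, not taken from the entry; real systems such as C4.5 add pruning, continuous attributes, and missing-value handling.

```python
from collections import Counter
import math

def entropy(labels):
    # Shannon entropy of the class distribution at a node
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def best_split(rows, labels):
    # Choose the attribute whose partition has the lowest weighted
    # entropy (equivalently, the highest information gain).
    best_attr, best_score = None, float("inf")
    for a in range(len(rows[0])):
        score = 0.0
        for v in set(r[a] for r in rows):
            subset = [l for r, l in zip(rows, labels) if r[a] == v]
            score += len(subset) / len(labels) * entropy(subset)
        if score < best_score:
            best_attr, best_score = a, score
    return best_attr

def build_tree(rows, labels):
    # Recursive partitioning: a pure node becomes a leaf; if no
    # attribute can separate the examples, predict the majority class.
    if len(set(labels)) == 1:
        return labels[0]
    if len(set(map(tuple, rows))) == 1:
        return Counter(labels).most_common(1)[0][0]
    attr = best_split(rows, labels)
    children = {}
    for v in set(r[attr] for r in rows):
        idx = [i for i, r in enumerate(rows) if r[attr] == v]
        children[v] = build_tree([rows[i] for i in idx], [labels[i] for i in idx])
    return (attr, children)

def classify(tree, row):
    # Follow the branch matching the row's attribute value until a leaf.
    while isinstance(tree, tuple):
        attr, children = tree
        tree = children[row[attr]]
    return tree

# Toy weather data: (outlook, temperature) -> play?
rows = [["sunny", "hot"], ["sunny", "cool"], ["rain", "hot"], ["rain", "cool"]]
labels = ["no", "yes", "yes", "yes"]
tree = build_tree(rows, labels)
```

On this data the sketch first splits on outlook (rainy days are uniformly "yes") and then, within the sunny branch, on temperature, illustrating why the resulting model is easy to read off as a set of if-then tests.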


Recommended Reading

  1. Breiman L (2001) Random forests. Mach Learn 45(1):5–32
  2. Breiman L, Friedman JH, Olshen R, Stone C (1984) Classification and regression trees. Wadsworth & Brooks, Pacific Grove
  3. Buntine W, Niblett T (1992) A further comparison of splitting rules for decision-tree induction. Mach Learn 8:75–85
  4. Freund Y, Schapire RE (1996) Experiments with a new boosting algorithm. In: Saitta L (ed) Proceedings of the 13th international conference on machine learning, Bari. Morgan Kaufmann, pp 148–156
  5. Hunt EB, Marin J, Stone PJ (1966) Experiments in induction. Academic, New York
  6. Kass GV (1980) An exploratory technique for investigating large quantities of categorical data. Appl Stat 29:119–127
  7. Mingers J (1989a) An empirical comparison of selection measures for decision-tree induction. Mach Learn 3:319–342
  8. Mingers J (1989b) An empirical comparison of pruning methods for decision tree induction. Mach Learn 4:227–243
  9. Murthy SK (1998) Automatic construction of decision trees from data: a multi-disciplinary survey. Data Min Knowl Discov 2(4):345–389
  10. Quinlan JR (1983) Learning efficient classification procedures and their application to chess end games. In: Michalski RS, Carbonell JG, Mitchell TM (eds) Machine learning: an artificial intelligence approach. Tioga, Palo Alto, pp 463–482
  11. Quinlan JR (1986) Induction of decision trees. Mach Learn 1:81–106
  12. Quinlan JR (1993) C4.5: programs for machine learning. Morgan Kaufmann, San Mateo
  13. Quinlan JR (1996) Improved use of continuous attributes in C4.5. J Artif Intell Res 4:77–90

Copyright information

© Springer Science+Business Media New York 2017

Authors and Affiliations

  1. Knowledge Engineering Group, TU Darmstadt, Darmstadt, Germany
  2. Department of Information Technology, University of Leoben, Leoben, Austria