Unified View of Decision Tree Learning Machines for the Purpose of Meta-learning

  • Krzysztof Grąbczewski
Conference paper
Part of the Advances in Intelligent and Soft Computing book series (AINSC, volume 95)

Abstract

The experience gained from a thorough analysis of many decision tree (DT) induction algorithms has resulted in a unified model for DT construction and reliable testing. The model has been designed and implemented within Intemi, a versatile environment for data mining. Its modular architecture facilitates the construction of all the most popular algorithms by combining appropriate building blocks. Alternative components can be reliably compared by tests run in the same environment. This is the starting point for manifold research in the area of DTs, which will bring advanced meta-learning algorithms providing new knowledge about DT induction and optimal DT models for many kinds of data.
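The modular view described above, in which a DT learner is assembled from interchangeable building blocks such as the split-quality criterion, can be sketched as follows. This is an illustrative sketch only, not the Intemi API; all names here (`Criterion`, `build_tree`, `predict`, etc.) are hypothetical, and only univariate threshold splits with a depth limit are shown.

```python
# Hypothetical sketch of a DT learner built from exchangeable components.
# The split criterion is a plug-in; swapping it (Gini vs. entropy) changes
# the algorithm while the rest of the machinery stays untouched.
from collections import Counter
from math import log2
from typing import Callable, List, Optional, Sequence

Criterion = Callable[[List[int]], float]  # impurity measure: lower is purer

def gini(labels: List[int]) -> float:
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def entropy(labels: List[int]) -> float:
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

class Node:
    def __init__(self, feature: Optional[int] = None, threshold: float = 0.0,
                 left: "Optional[Node]" = None, right: "Optional[Node]" = None,
                 label: Optional[int] = None):
        self.feature, self.threshold = feature, threshold
        self.left, self.right, self.label = left, right, label

def build_tree(X: List[Sequence[float]], y: List[int],
               criterion: Criterion, max_depth: int = 5) -> Node:
    """Recursively grow a tree, choosing the split that minimizes the
    weighted impurity of the two child subsets."""
    if max_depth == 0 or len(set(y)) == 1:
        return Node(label=Counter(y).most_common(1)[0][0])
    best = None  # (weighted impurity, feature index, threshold)
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            left = [lab for row, lab in zip(X, y) if row[f] <= t]
            right = [lab for row, lab in zip(X, y) if row[f] > t]
            if not left or not right:
                continue
            score = (len(left) * criterion(left)
                     + len(right) * criterion(right)) / len(y)
            if best is None or score < best[0]:
                best = (score, f, t)
    if best is None:  # no valid split: fall back to a majority-class leaf
        return Node(label=Counter(y).most_common(1)[0][0])
    _, f, t = best
    lx, ly = zip(*[(r, lab) for r, lab in zip(X, y) if r[f] <= t])
    rx, ry = zip(*[(r, lab) for r, lab in zip(X, y) if r[f] > t])
    return Node(f, t,
                build_tree(list(lx), list(ly), criterion, max_depth - 1),
                build_tree(list(rx), list(ry), criterion, max_depth - 1))

def predict(node: Node, row: Sequence[float]) -> int:
    while node.label is None:
        node = node.left if row[node.feature] <= node.threshold else node.right
    return node.label

# Two learners differing only in one building block:
X = [(1.0,), (2.0,), (3.0,), (4.0,)]
y = [0, 0, 1, 1]
tree_gini = build_tree(X, y, gini)
tree_entropy = build_tree(X, y, entropy)
```

Because both trees share the same growing machinery, any difference in their behaviour is attributable to the exchanged component alone, which is exactly what makes reliable comparisons of alternative building blocks possible.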

Keywords

Decision trees, meta-learning, object-oriented design

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Krzysztof Grąbczewski
    Department of Informatics, Nicolaus Copernicus University, Toruń, Poland