Model Selection in Omnivariate Decision Trees

  • Olcay Taner Yıldız
  • Ethem Alpaydın
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3720)

Abstract

We propose an omnivariate decision tree architecture that contains univariate, multivariate linear, and multivariate nonlinear nodes, matching the complexity of each node to the complexity of the data reaching it. We compare different model selection techniques, namely AIC, BIC, and cross-validation (CV), for choosing among the three node types on standard datasets from the UCI repository, and find that such omnivariate trees, with a small percentage of multivariate nodes close to the root, generalize better than pure trees that use the same node type everywhere. CV produces simpler trees than AIC and BIC without sacrificing expected error; its only disadvantage is longer training time.
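For concreteness, the following is a minimal sketch (not the authors' implementation) of the selection step at a single decision node, assuming each fitted candidate model, whether a univariate, multivariate linear, or nonlinear split, exposes its maximized log-likelihood and its number of free parameters d. The criteria follow the standard definitions of Akaike [8] and Schwarz [9]; all names below are illustrative assumptions.

    import math

    def aic(loglik, d):
        # Akaike information criterion: -2 ln L + 2 d (Akaike, 1973)
        return -2.0 * loglik + 2.0 * d

    def bic(loglik, d, n):
        # Bayesian information criterion: -2 ln L + d ln n (Schwarz, 1978)
        return -2.0 * loglik + d * math.log(n)

    def select_node_model(candidates, n, criterion="aic"):
        # candidates: fitted (model, loglik, d) triples for the univariate,
        # linear, and nonlinear splits tried at this node; n is the number
        # of training instances reaching the node. Returns the candidate
        # with the lowest criterion score.
        if criterion == "aic":
            key = lambda c: aic(c[1], c[2])
        else:
            key = lambda c: bic(c[1], c[2], n)
        return min(candidates, key=key)

CV-based selection would instead rank the three candidates by their error on data held out from the node, avoiding the parametric penalty terms at the cost of extra training runs per node, which is why, as the abstract notes, CV takes longer to train.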

Keywords

Linear Discriminant Analysis, Decision Node, Univariate Node, Quadratic Node, Time Select

References

  1. Breslow, L.A., Aha, D.W.: Simplifying decision trees: A survey. Technical Report AIC-96-014, Navy Center for Applied Research in AI, Naval Research Laboratory, Washington DC, USA (1997)
  2. Murthy, S.K.: Automatic construction of decision trees from data: A multi-disciplinary survey. Data Mining and Knowledge Discovery 2, 345–389 (1998)
  3. Yıldız, O.T., Alpaydın, E.: Linear discriminant trees. International Journal of Pattern Recognition and Artificial Intelligence 19, 323–353 (2005)
  4. Lim, T.S., Loh, W.Y., Shih, Y.S.: A comparison of prediction accuracy, complexity, and training time of thirty-three old and new classification algorithms. Machine Learning 40, 203–228 (2000)
  5. Yıldız, O.T., Alpaydın, E.: Omnivariate decision trees. IEEE Transactions on Neural Networks 12, 1539–1546 (2001)
  6. Alpaydın, E.: Introduction to Machine Learning. The MIT Press, Cambridge (2004)
  7. Guo, H., Gelfand, S.B.: Classification trees with neural network feature extraction. IEEE Transactions on Neural Networks 3, 923–933 (1992)
  8. Akaike, H.: Information theory and an extension of the maximum likelihood principle. In: Second International Symposium on Information Theory, pp. 267–281 (1973)
  9. Schwarz, G.: Estimating the dimension of a model. Annals of Statistics 6, 461–464 (1978)
  10. Dietterich, T.G.: Approximate statistical tests for comparing supervised classification learning algorithms. Neural Computation 10, 1895–1923 (1998)
  11. Friedman, J.H.: A recursive partitioning decision rule for non-parametric classification. IEEE Transactions on Computers, 404–408 (1977)
  12. Gama, J.: Discriminant trees. In: 16th International Conference on Machine Learning, pp. 134–142. Morgan Kaufmann, San Francisco (1999)
  13. Gama, J.: Functional trees. Machine Learning 55, 219–250 (2004)
  14. Breiman, L., Friedman, J.H., Olshen, R.A., Stone, C.J.: Classification and Regression Trees. Wadsworth, Belmont (1984)
  15. Murthy, S.K., Kasif, S., Salzberg, S.: A system for induction of oblique decision trees. Journal of Artificial Intelligence Research 2, 1–32 (1994)
  16. Loh, W.Y., Vanichsetakul, N.: Tree-structured classification via generalized discriminant analysis. Journal of the American Statistical Association 83, 715–725 (1988)
  17. Loh, W.Y., Shih, Y.S.: Split selection methods for classification trees. Statistica Sinica 7, 815–840 (1997)
  18. Kim, H., Loh, W.Y.: Classification trees with unbiased multiway splits. Journal of the American Statistical Association, 589–604 (2001)
  19. Brodley, C.E., Utgoff, P.E.: Multivariate decision trees. Machine Learning 19, 45–77 (1995)
  20. Landwehr, N., Hall, M., Frank, E.: Logistic model trees. In: Lavrač, N., Gamberger, D., Todorovski, L., Blockeel, H. (eds.) ECML 2003. LNCS (LNAI), vol. 2837, pp. 241–252. Springer, Heidelberg (2003)
  21. Blake, C., Merz, C.: UCI repository of machine learning databases (2000)
  22. Hastie, T., Tibshirani, R., Friedman, J.: The Elements of Statistical Learning. Springer, New York (2001)
  23. Cohen, W.W.: Fast effective rule induction. In: The Twelfth International Conference on Machine Learning, pp. 115–123 (1995)

Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Olcay Taner Yıldız (1)
  • Ethem Alpaydın (1)

  1. Department of Computer Engineering, Boğaziçi University, Istanbul, Turkey
