Pruning multivariate decision trees by hyperplane merging

  • Miroslav Kubat
  • Doris Flotzinger
Part of the Lecture Notes in Computer Science book series (LNCS, volume 912)


Several techniques for the induction of multivariate decision trees have been published in the last few years. Internal nodes of such trees typically contain binary tests that ask on which side of a hyperplane an example lies. Most of these algorithms use cut-off pruning mechanisms similar to those of traditional decision trees. The large domain of substitutional pruning methods, in which a new decision test (derived from previous decision tests) replaces an entire subtree, remains nearly unexplored. This paper presents an approach to multivariate-tree pruning based on merging the decision hyperplanes, and demonstrates its performance on artificial and benchmark data.
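The hyperplane tests mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the abstract does not specify the merging procedure, so the merge rule shown here (averaging the unit-normalized weight vectors and thresholds of two tests) is purely an assumed stand-in, as are all function names.

```python
# Illustrative sketch only; the paper's exact merging procedure is not
# given in the abstract. The averaging rule below is an assumption.

def dot(w, x):
    """Inner product of two equal-length vectors."""
    return sum(wi * xi for wi, xi in zip(w, x))

def hyperplane_test(w, theta, x):
    """Binary test at an internal node: does example x lie on the
    positive side of the hyperplane w . x = theta?"""
    return dot(w, x) > theta

def merge_hyperplanes(w1, t1, w2, t2):
    """Hypothetical substitutional merge: replace two hyperplane tests
    with one by averaging their unit-normalized weights and thresholds."""
    n1 = dot(w1, w1) ** 0.5
    n2 = dot(w2, w2) ** 0.5
    w = [(a / n1 + b / n2) / 2.0 for a, b in zip(w1, w2)]
    theta = (t1 / n1 + t2 / n2) / 2.0
    return w, theta

# Merge the tests 2*x1 > 2 and x2 > 0.5 into a single replacement test.
w, theta = merge_hyperplanes([2.0, 0.0], 2.0, [0.0, 1.0], 0.5)
print(w, theta)                               # [0.5, 0.5] 0.75
print(hyperplane_test(w, theta, [3.0, 3.0]))  # True
```

A substitutional pruning step in this spirit would replace a subtree rooted at one such test with the single merged test, trading some fidelity for a smaller tree.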





Copyright information

© Springer-Verlag Berlin Heidelberg 1995

Authors and Affiliations

  • Miroslav Kubat (Institute for Systems Sciences, Johannes Kepler University, Linz, Austria)
  • Doris Flotzinger (Department of Medical Informatics, Institute of Biomedical Engineering, Graz University of Technology, Graz, Austria)
