
Pattern Analysis & Applications, Volume 5, Issue 2, pp 201–209

Combining Different Methods and Numbers of Weak Decision Trees

  • Patrice Latinne
  • Olivier Debeir
  • Christine Decaestecker

Abstract

Several ways of manipulating a training set have shown that combining weakened classifiers can improve prediction accuracy. In the present paper, we focus on learning set sampling (Breiman's Bagging) and random feature subset selection (Ho's Random Subspaces). We present a combination scheme labelled 'Bagfs', in which new learning sets are generated on the basis of both bootstrap replicates and random feature subspaces. The performances of the three methods (Bagging, Random Subspaces and Bagfs) are compared with the standard AdaBoost algorithm. All four methods are assessed by means of a decision-tree inducer (C4.5). In addition, we study whether the number of weak classifiers and the way in which they are created have a significant influence on the performance of their combination. To answer these two questions, we applied the McNemar test of significance and the Kappa degree-of-agreement measure. The results, obtained on 23 conventional databases, show that, on average, Bagfs exhibits the best agreement between prediction and supervision.
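
As an illustration of the scheme just described, the following is a minimal sketch of Bagfs: each base tree is trained on a bootstrap replicate of the learning set restricted to a random feature subspace, and the ensemble predicts by plurality vote. scikit-learn's DecisionTreeClassifier stands in for the C4.5 inducer used in the paper, and the names n_trees and subspace_frac are illustrative assumptions, not taken from the paper.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    class Bagfs:
        """Sketch of Bagfs: bagging combined with random feature subspaces."""

        def __init__(self, n_trees=50, subspace_frac=0.5, random_state=0):
            self.n_trees = n_trees                  # number of weak trees
            self.subspace_frac = subspace_frac      # fraction of features per tree
            self.rng = np.random.default_rng(random_state)
            self.models = []                        # (fitted tree, feature indices) pairs

        def fit(self, X, y):
            n, d = X.shape
            k = max(1, int(self.subspace_frac * d))
            for _ in range(self.n_trees):
                rows = self.rng.integers(0, n, size=n)            # bootstrap replicate
                cols = self.rng.choice(d, size=k, replace=False)  # random subspace
                tree = DecisionTreeClassifier().fit(X[rows][:, cols], y[rows])
                self.models.append((tree, cols))
            return self

        def predict(self, X):
            # Plurality vote; assumes integer-coded class labels (0..K-1).
            votes = np.stack([t.predict(X[:, c]) for t, c in self.models])
            return np.array([np.bincount(col.astype(int)).argmax()
                             for col in votes.T])

    # Usage: Bagfs(n_trees=25).fit(X_train, y_train).predict(X_test)

Note that, under this sketch, plain Bagging corresponds to subspace_frac = 1.0, and Random Subspaces alone corresponds to dropping the bootstrap resampling step.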

Key words: Bagging; Boosting; Decision trees; Ensemble learning; Random subspaces 
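
The two assessment tools named in the abstract can also be sketched from their standard textbook formulas (a hedged reconstruction, not the authors' code): McNemar's continuity-corrected test on the disagreement counts of two classifiers, and Cohen's Kappa as a chance-corrected degree of agreement between two label vectors (e.g. prediction versus supervision).

    import numpy as np
    from scipy.stats import chi2

    def mcnemar_p(y_true, pred_a, pred_b):
        """p-value of McNemar's test (continuity-corrected chi-square, df=1)."""
        a_ok, b_ok = pred_a == y_true, pred_b == y_true
        n01 = np.sum(a_ok & ~b_ok)   # A correct, B wrong
        n10 = np.sum(~a_ok & b_ok)   # A wrong, B correct
        if n01 + n10 == 0:
            return 1.0               # no disagreements: cannot reject equality
        stat = (abs(n01 - n10) - 1) ** 2 / (n01 + n10)
        return chi2.sf(stat, df=1)

    def kappa(labels_a, labels_b):
        """Cohen's Kappa between two label vectors."""
        labels = np.union1d(labels_a, labels_b)
        p_obs = np.mean(labels_a == labels_b)                     # observed agreement
        p_exp = sum(np.mean(labels_a == l) * np.mean(labels_b == l)
                    for l in labels)                              # chance agreement
        return (p_obs - p_exp) / (1 - p_exp)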


Copyright information

© Springer-Verlag London Limited 2002

Authors and Affiliations

  • Patrice Latinne (1)
  • Olivier Debeir (2)
  • Christine Decaestecker (3)

  1. IRIDIA (Artificial Intelligence Department), Université Libre de Bruxelles, Brussels, Belgium
  2. Information and Decision Systems, Université Libre de Bruxelles, Brussels, Belgium
  3. Laboratory of Histopathology, Université Libre de Bruxelles, Brussels, Belgium
