A Comparison of Random Forest with ECOC-Based Classifiers

  • R. S. Smith
  • M. Bober
  • T. Windeatt
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6713)


We compare experimentally the performance of three approaches to ensemble-based classification on general multi-class datasets. These are the methods of random forest, error-correcting output codes (ECOC) and ECOC enhanced by the use of bootstrapping and class-separability weighting (ECOC-BW). The experiments suggest that ECOC-BW yields better generalisation performance than either random forest or unmodified ECOC. A bias-variance analysis indicates that ECOC benefits from reduced bias when compared to random forest, and that ECOC-BW benefits additionally from reduced variance. One disadvantage of ECOC-based algorithms, however, when compared with random forest, is that they impose a greater computational demand, leading to longer training times.
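The kind of comparison described above can be sketched with off-the-shelf estimators. The snippet below is a minimal illustration, not the authors' implementation: it uses scikit-learn's stock `RandomForestClassifier` and `OutputCodeClassifier` (which draws a random code matrix) on a standard multi-class dataset, and does not include the paper's bootstrapping or class-separability weighting (ECOC-BW).

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.multiclass import OutputCodeClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

# Random forest: an ensemble of decorrelated decision trees combined by voting.
rf = RandomForestClassifier(n_estimators=100, random_state=0)

# ECOC: each base learner solves a binary dichotomy given by one column of the
# code matrix; a test point is assigned the class whose code word is nearest
# to the vector of base-learner outputs.
ecoc = OutputCodeClassifier(
    estimator=DecisionTreeClassifier(random_state=0),
    code_size=2.0,  # code length = 2 * number of classes
    random_state=0,
)

rf_acc = cross_val_score(rf, X, y, cv=5).mean()
ecoc_acc = cross_val_score(ecoc, X, y, cv=5).mean()
print(f"random forest accuracy: {rf_acc:.3f}")
print(f"ECOC accuracy:          {ecoc_acc:.3f}")
```

Cross-validated accuracy gives a rough generalisation estimate; the paper's fuller picture additionally decomposes the error into bias and variance terms.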


Keywords: Random Forest · Target Class · Output Code · Good Generalisation Performance · Random Forest Algorithm




Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • R. S. Smith (1)
  • M. Bober (2)
  • T. Windeatt (1)
  1. Centre for Vision, Speech and Signal Processing, University of Surrey, Surrey, UK
  2. Mitsubishi Electric R&D Centre Europe B.V., Surrey, UK
