Minimally-Sized Balanced Decomposition Schemes for Multi-class Classification

Chapter
Part of the Studies in Computational Intelligence book series (SCI, volume 373)

Abstract

Error-Correcting Output Coding (ECOC) is a well-known class of decomposition schemes for multi-class classification. It represents any multi-class classification problem as a set of binary classification problems. Due to code redundancy, ECOC schemes can significantly improve generalization performance on multi-class classification problems. However, they can face a computational-complexity problem when the number of classes is large.

In this paper we address the computational-complexity problem of the decomposition schemes. We study a particular class of minimally-sized ECOC decomposition schemes, namely the class of minimally-sized balanced decomposition schemes (MBDSs) [14]. We show that MBDSs do not face a computational-complexity problem for a large number of classes. However, we also show that MBDSs cannot correct the classification errors of the binary classifiers in MBDS ensembles. Therefore, we propose voting with MBDS ensembles (VMBDSs). We show that the generalization performance of the VMBDSs ensembles improves with the number of MBDS classifiers. However, this number can become large, and thus the VMBDSs ensembles can have a computational-complexity problem as well. Fortunately, our experiments show that VMBDSs are comparable with ECOC ensembles and can outperform one-against-all ensembles using only a small number of MBDS ensembles.
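The error-correcting behavior of ECOC that the abstract relies on can be illustrated with a small sketch. This is a hypothetical toy example, not the paper's MBDS or VMBDSs construction: each class is assigned a binary code word, one binary classifier predicts each bit, and a test point is assigned to the class whose code word is nearest in Hamming distance. With a minimum pairwise distance of 3 between code words, any single binary-classifier error is corrected.

```python
# Toy ECOC decoding sketch (illustrative; the code words below are invented,
# not taken from the paper). Four classes, six bits, min Hamming distance 3.
CODE_WORDS = {
    "A": (0, 0, 0, 0, 0, 0),
    "B": (0, 1, 1, 1, 1, 0),
    "C": (1, 0, 1, 1, 0, 1),
    "D": (1, 1, 0, 1, 1, 1),
}

def hamming(a, b):
    # Number of bit positions in which the two code words disagree.
    return sum(x != y for x, y in zip(a, b))

def decode(bits):
    # Assign the class whose code word is closest to the observed bit vector.
    return min(CODE_WORDS, key=lambda c: hamming(CODE_WORDS[c], bits))

# One binary classifier flips bit 3 of class B's code word (0,1,1,1,1,0);
# nearest-code-word decoding still recovers "B".
print(decode((0, 1, 1, 0, 1, 0)))  # -> B
```

A minimally-sized scheme, by contrast, uses only ⌈log2(k)⌉ bits for k classes, so its code words have pairwise distance as low as 1 and no errors can be corrected — which is exactly the limitation of MBDSs that motivates the voting scheme (VMBDSs) studied in the paper.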

Keywords

Support Vector Machine, Generalization Performance, Code Word, Decomposition Scheme, Decomposition Matrix


References

  1. Allwein, E., Schapire, R., Singer, Y.: Reducing multiclass to binary: A unifying approach for margin classifiers. J. Machine Learning Research 1, 113–141 (2002)
  2. Asuncion, A., Newman, D.J.: UCI machine learning repository, http://www.ics.uci.edu/~mlearn/MLRepository.html
  3. Cohen, W.: Fast effective rule induction. In: Prieditis, A., Russell, R. (eds.) Proc. the 12th Int. Conf. Machine Learning, Tahoe City, CA, pp. 115–123. Morgan Kaufmann, San Francisco (1995)
  4. Dietterich, T.G., Bakiri, G.: Solving multiclass learning problems via error-correcting output codes. J. Artif. Intell. Research 2, 263–286 (1995)
  5. Domingos, P.: A unified bias-variance decomposition for zero-one and squared loss. In: Kautz, H., Porter, B. (eds.) Proc. the 17th National Conf. Artif. Intell. and 12th Conf. Innovative Applications Artif. Intell., pp. 564–569. AAAI Press, Menlo Park (2000)
  6. Freund, Y., Schapire, R.: A decision-theoretic generalization of on-line learning and an application to boosting. J. Comp. Syst. Sci. 55, 119–139 (1997)
  7. Fürnkranz, J.: Round robin classification. J. Machine Learning Research 2, 721–747 (2002)
  8. Keerthi, S.S., Shevade, S.K., Bhattacharyya, C., Murthy, K.R.K.: Improvements to Platt’s SMO algorithm for SVM classifier design. Neural Comp. 13, 637–649 (2001)
  9. Kong, E.B., Dietterich, T.G.: Error-correcting output coding corrects bias and variance. In: Prieditis, A., Russell, S.J. (eds.) Proc. the 12th Int. Conf. Machine Learning, Tahoe City, CA, pp. 313–321. Morgan Kaufmann, San Francisco (1995)
  10. le Cessie, S., van Houwelingen, J.C.: Ridge estimators in logistic regression. Applied Statistics 41, 191–201 (1992)
  11. Libor, S.: Face recognition database (2011), http://cswww.essex.ac.uk/mv/allfaces/index.html
  12. Lissoni, F., Llerena, P., Sanditov, B.: Inventors’ small worlds: academic and CNRS researchers in networks of inventors in France. In: Proc. the DIME Final Conf., Maastricht, The Netherlands (2011)
  13. Lorena, A.C., De Carvalho, A.C.P.L.F., Gama, J.M.P.: A review on the combination of binary classifiers in multiclass problems. Artif. Intell. Rev. 30, 19–37 (2008)
  14. Mayoraz, E., Moreira, M.: On the decomposition of polychotomies into dichotomies. In: Fisher, D.H. (ed.) Proc. the 14th Int. Conf. Machine Learning, Nashville, TN, pp. 219–226. Morgan Kaufmann, San Francisco (1997)
  15. Nadeau, C., Bengio, Y.: Inference for the generalization error. Machine Learning 52, 239–281 (2001)
  16. Nilsson, N.: Learning machines: foundations of trainable pattern-classifying systems. McGraw-Hill, New York (1965)
  17. Peterson, W., Weldon, J.: Error-correcting codes. MIT Press, Cambridge (1972)
  18. Rifkin, R., Klautau, A.: In defense of one-vs-all classification. J. Machine Learning Research 5, 101–141 (2004)
  19. Weston, J., Watkins, C.: Support vector machines for multi-class pattern recognition. In: Verleysen, M. (ed.) Proc. the 7th European Symp. Artif. Neural Networks, Bruges, Belgium, pp. 219–224 (1999)
  20. Witten, I., Frank, E., Hall, M.: Data mining: Practical machine learning tools and techniques. Morgan Kaufmann, San Francisco (2011)

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  1. Department of Knowledge Engineering, Maastricht University, Maastricht, The Netherlands
  2. Faculty of Health, Medicine and Life Sciences, Maastricht University, Maastricht, The Netherlands
  3. Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, The Netherlands