A New Ensemble-Based Cascaded Framework for Multiclass Training with Simple Weak Learners

  • Teo Susnjak
  • Andre Barczak
  • Napoleon Reyes
  • Ken Hawick
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6854)


We present a novel approach to multiclass learning based on an ensemble-based cascaded learning framework. By implementing a multiclass cascaded classifier with AdaBoost, we show how detection runtimes are accelerated, since only a subset of the ensemble is executed per sample, making the classifiers suitable for computer vision applications. We also propose a new multiclass weak learner and demonstrate that, in conjunction with it, the framework achieves arbitrarily low training errors. We evaluated our algorithm against the AdaBoost.OC, ECC, and M2 multiclass learning methods on seven benchmark UCI datasets. In our experiments, our framework achieved higher accuracy on five of the seven datasets and faster runtimes in all cases.
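The runtime saving described above comes from the cascade's early-exit property: weak learners are grouped into stages, and evaluation stops as soon as the accumulated class scores are decisive, so most samples never reach the later stages. The sketch below illustrates this idea in minimal form; the stage structure, the margin-based exit rule, and all names and thresholds are illustrative assumptions, not the paper's actual algorithm.

```python
# Minimal sketch of early-exit evaluation in a multiclass cascade
# (illustrative only; not the paper's exact weak learner or exit rule).

def stump(feature_idx, threshold, cls):
    """Weak learner: votes for class `cls` when x[feature_idx] > threshold."""
    def h(x):
        return cls if x[feature_idx] > threshold else None
    return h

def cascade_predict(stages, x, margin=1.0, n_classes=3):
    """stages: list of stages, each a list of (alpha, weak_learner) pairs.

    Accumulates weighted votes stage by stage and exits early once the
    leading class beats the runner-up by at least `margin`, so later
    stages are skipped for easy samples.
    """
    scores = [0.0] * n_classes
    for stage in stages:
        for alpha, h in stage:
            c = h(x)
            if c is not None:
                scores[c] += alpha
        best = max(range(n_classes), key=scores.__getitem__)
        runner_up = max(s for i, s in enumerate(scores) if i != best)
        if scores[best] - runner_up >= margin:
            return best  # early exit: remaining stages never run
    return max(range(n_classes), key=scores.__getitem__)

# Toy two-stage cascade: stage 1 detects class 1, stage 2 detects class 2.
stages = [
    [(2.0, stump(0, 0.5, 1))],
    [(1.0, stump(1, 0.5, 2))],
]
```

An input such as `[0.9, 0.1]` is resolved after the first stage, while `[0.1, 0.9]` falls through to the second; in a deep cascade this asymmetry is what makes average-case evaluation much cheaper than running the full ensemble.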


Keywords: Class Label · Training Error · Weak Learner · Computer Vision Application · Multiclass Problem




References

  1. Lorena, A.C., Carvalho, A.C., Gama, J.M.: A review on the combination of binary classifiers in multiclass problems. Artif. Intell. Rev. 30, 19–37 (2008)
  2. Verschae, R., del Solar, J.R.: Coarse-to-fine multiclass nested cascades for object detection. In: International Conference on Pattern Recognition, pp. 344–347 (2010)
  3. Li, L.: Multiclass boosting with repartitioning. In: Proc. of the 23rd Int. Conf. on Machine Learning, ICML 2006, pp. 569–576. ACM, New York (2006)
  4. Allwein, E.L., Schapire, R.E., Singer, Y.: Reducing multiclass to binary: a unifying approach for margin classifiers. J. Mach. Learn. Res. 1, 113–141 (2001)
  5. Dietterich, T.G., Bakiri, G.: Solving multiclass learning problems via error-correcting output codes. CoRR cs.AI/9501101 (1995)
  6. Guruswami, V., Sahai, A.: Multiclass learning, boosting, and error-correcting codes. In: Proc. of the 12th Ann. Conf. on Computational Learning Theory, COLT 1999, pp. 145–155. ACM, New York (1999)
  7. Freund, Y., Schapire, R.: Experiments with a new boosting algorithm. In: Proceedings of the 13th International Conference on Machine Learning, pp. 148–156 (1996)
  8. Freund, Y., Schapire, R.E.: A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences 55(1), 119–139 (1997)
  9. Viola, P., Jones, M.: Robust real-time face detection. In: Proc. of the 8th Int. Conf. on Computer Vision (ICCV 2001), p. 747. IEEE Computer Society, Los Alamitos (2001)
  10. Schapire, R.E., Singer, Y.: Improved boosting algorithms using confidence-rated predictions. Machine Learning 37, 297–336 (1999)
  11. Eibl, G., Pfeiffer, K.P.: Multiclass boosting for weak classifiers. J. Mach. Learn. Res. 6, 189–210 (2005)

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Teo Susnjak (1)
  • Andre Barczak (1)
  • Napoleon Reyes (1)
  • Ken Hawick (1)

  1. Massey University, Albany, New Zealand
