Multi-class Boosting with Class Hierarchies

  • Goo Jun
  • Joydeep Ghosh
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5519)

Abstract

We propose AdaBoost.BHC, a novel multi-class boosting algorithm. AdaBoost.BHC solves a C-class problem using C − 1 binary classifiers arranged in a hierarchy that is learned over the classes based on their closeness to one another; AdaBoost is then applied to each binary problem. The proposed algorithm is empirically compared with other multi-class AdaBoost algorithms on a variety of datasets. The results show that AdaBoost.BHC is consistently among the top performers, making it a very reliable choice across datasets. In particular, it requires significantly less computation than AdaBoost.MH, while exhibiting comparable or better generalization power.
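The full procedure is given in the paper itself; as a rough illustration of the structure the abstract describes (C − 1 binary boosted classifiers arranged in a class hierarchy learned from inter-class closeness), the following is a minimal from-scratch sketch. Every design choice here is an illustrative assumption, not the authors' method: classes are split by seeding two groups with the farthest pair of class means, each internal node trains a small AdaBoost ensemble of decision stumps, and samples are routed down the tree at prediction time.

```python
import math
from collections import defaultdict

def class_means(X, y):
    # Per-class mean feature vectors.
    sums, counts = defaultdict(lambda: None), defaultdict(int)
    for xi, yi in zip(X, y):
        sums[yi] = list(xi) if sums[yi] is None else [a + b for a, b in zip(sums[yi], xi)]
        counts[yi] += 1
    return {c: [v / counts[c] for v in s] for c, s in sums.items()}

def dist(a, b):
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

def split_classes(classes, means):
    # Assumed closeness heuristic: seed the two groups with the farthest
    # pair of class means, then attach each remaining class to the nearer seed.
    if len(classes) == 2:
        return [classes[0]], [classes[1]]
    a, b = max(((p, q) for p in classes for q in classes if p != q),
               key=lambda pq: dist(means[pq[0]], means[pq[1]]))
    left, right = [a], [b]
    for c in classes:
        if c not in (a, b):
            (left if dist(means[c], means[a]) <= dist(means[c], means[b])
             else right).append(c)
    return left, right

def train_stump(X, y, w):
    # Exhaustive search for the decision stump minimizing weighted error; y in {-1, +1}.
    best = None
    for j in range(len(X[0])):
        for t in sorted({xi[j] for xi in X}):
            for s in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if (s if xi[j] > t else -s) != yi)
                if best is None or err < best[0]:
                    best = (err, j, t, s)
    return best[1:]

def stump_predict(stump, xi):
    j, t, s = stump
    return s if xi[j] > t else -s

def adaboost(X, y, rounds=10):
    # Standard discrete AdaBoost over decision stumps.
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        stump = train_stump(X, y, w)
        err = sum(wi for xi, yi, wi in zip(X, y, w)
                  if stump_predict(stump, xi) != yi)
        err = min(max(err, 1e-10), 1 - 1e-10)   # clamp to avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, stump))
        w = [wi * math.exp(-alpha * yi * stump_predict(stump, xi))
             for wi, xi, yi in zip(w, X, y)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def ensemble_predict(ensemble, xi):
    return 1 if sum(a * stump_predict(s, xi) for a, s in ensemble) > 0 else -1

def build_bhc(X, y, classes=None, rounds=10):
    # Recursively learn the binary hierarchy: C classes yield C - 1 internal
    # nodes, each holding one AdaBoost ensemble for its two-group split.
    classes = sorted(set(y)) if classes is None else classes
    if len(classes) == 1:
        return classes[0]                       # leaf: a single class label
    left, right = split_classes(classes, class_means(X, y))
    labels = [1 if yi in left else -1 for yi in y]
    node = adaboost(X, labels, rounds)
    Xl, yl = zip(*[(xi, yi) for xi, yi in zip(X, y) if yi in left])
    Xr, yr = zip(*[(xi, yi) for xi, yi in zip(X, y) if yi in right])
    return (node, build_bhc(list(Xl), list(yl), left, rounds),
                  build_bhc(list(Xr), list(yr), right, rounds))

def bhc_predict(tree, xi):
    # Route the sample down the hierarchy until a leaf (class label) is reached.
    while isinstance(tree, tuple):
        node, lsub, rsub = tree
        tree = lsub if ensemble_predict(node, xi) == 1 else rsub
    return tree

# Toy demo (my own data, not from the paper): three well-separated 2-D classes.
X = [[0.0, 0.0], [0.2, 0.1], [5.0, 5.0], [5.2, 4.9], [0.0, 5.0], [0.1, 5.2]]
y = ["a", "a", "b", "b", "c", "c"]
tree = build_bhc(X, y, rounds=5)
print([bhc_predict(tree, xi) for xi in X])      # → ['a', 'a', 'b', 'b', 'c', 'c']
```

The point of the hierarchical layout, as the abstract notes, is cost: each of the C − 1 binary problems sees only the samples of the classes under its node, so the total work is far less than AdaBoost.MH's, which reweights over all C label assignments at every round.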



Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Goo Jun (1)
  • Joydeep Ghosh (1)

  1. Department of Electrical and Computer Engineering, The University of Texas at Austin, Austin, USA
