
Hierarchical Ensemble Support Cluster Machine

  • Mingmin Chi
  • Youdong Miao
  • Youze Tang
  • Jón Atli Benediktsson
  • Xuanjing Huang
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5519)

Abstract

In real applications, a large-scale data set is usually available for classifier design. The recently proposed Support Cluster Machine (SCM) can deal with such a problem: the data representation is first changed with a mixture model so that the classifier works at the component level instead of on individual data points. However, it is difficult to decide the proper number of components for designing a successful SCM classifier. In this paper, a hierarchical ensemble SCM (HESCM) is proposed to address this problem. Initially, a hierarchical mixture modeling strategy is used to obtain mixture models at different levels, from a fine representation to a coarse one. Then, the mixture model at each level is used to train an SCM. Finally, the models learnt at all levels are integrated to obtain an ensemble result. Experiments carried out on two real large-scale data sets validate the effectiveness of the proposed approach, which increases classification accuracy and stability while significantly reducing the computational and memory costs of a supervised classifier compared to state-of-the-art classifiers.
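The three-stage pipeline described above (hierarchical mixture modeling, per-level training, ensemble integration) can be sketched in Python. Note that this is only an illustrative outline under stated assumptions: SCM has no off-the-shelf implementation, so an SVM trained on per-class Gaussian-mixture component means ("virtual samples") stands in for it, and the level resolutions and majority-vote combiner are assumed rather than taken from the paper.

```python
# Hedged sketch of the HESCM pipeline from the abstract.
# Assumptions: an SVM on GMM component means stands in for SCM;
# the levels [32, 16, 8] and majority voting are illustrative choices.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, n_features=10, random_state=0)

levels = [32, 16, 8]  # fine -> coarse mixture resolutions (assumed)
ensemble = []
for k in levels:
    comps, labels = [], []
    for c in np.unique(y):
        # Fit a per-class mixture model; each component mean becomes
        # one "virtual sample" at this level of the hierarchy.
        gmm = GaussianMixture(n_components=k, random_state=0).fit(X[y == c])
        comps.append(gmm.means_)
        labels.extend([c] * k)
    # Train one classifier per level on the component-level representation.
    clf = SVC(kernel="rbf").fit(np.vstack(comps), np.array(labels))
    ensemble.append(clf)

# Integrate the per-level models by majority vote across levels.
votes = np.stack([clf.predict(X) for clf in ensemble])
pred = (votes.mean(axis=0) > 0.5).astype(int)
print("ensemble accuracy:", (pred == y).mean())
```

Because each level's classifier sees only k virtual samples per class rather than the full data set, training cost shrinks with coarser levels, which is the source of the computational savings the abstract claims.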

Keywords

Support Vector Machine, Bottom Level, Adult Data, Proper Number, Virtual Sample



Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Mingmin Chi (1)
  • Youdong Miao (1)
  • Youze Tang (1)
  • Jón Atli Benediktsson (2)
  • Xuanjing Huang (1)

  1. School of Computer Science, Fudan University, Shanghai, China
  2. Faculty of Electrical and Computer Engineering, University of Iceland, Iceland
