Hierarchical Ensemble Support Cluster Machine
In real applications, a large-scale data set is usually available for classifier design. The recently proposed Support Cluster Machine (SCM) can deal with such a problem: the data representation is first transformed with a mixture model so that the classifier works on the component level instead of on individual data points. However, it is difficult to decide the proper number of components for designing a successful SCM classifier. In this paper, a hierarchical ensemble SCM (HESCM) is proposed to address this problem. Initially, a hierarchical mixture modeling strategy is used to obtain mixture models at different levels, from fine to coarse representations. Then, the mixture model at each level is used to train an SCM. Finally, the learnt models from all levels are integrated to obtain an ensemble result. Experiments carried out on two real large-scale data sets validate the effectiveness of the proposed approach: it increases classification accuracy and stability while significantly reducing the computational and space complexity of a supervised classifier compared to state-of-the-art classifiers.
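The three steps above can be sketched in code. The snippet below is a minimal illustration, not the paper's actual method: it uses plain k-means component means as a stand-in for the mixture models, and a nearest-component rule as a stand-in for the SCM trained at each level. Each "level" summarizes every class with a different number of components (fine to coarse), and the per-level predictions are combined by majority vote. All function names and the toy data are hypothetical.

```python
from collections import Counter

def dist2(a, b):
    # Squared Euclidean distance between two points.
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, iters=20):
    # Plain k-means; centroids seeded deterministically from the first k points.
    centroids = [list(p) for p in points[:k]]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda i: dist2(p, centroids[i]))
            clusters[j].append(p)
        for i, c in enumerate(clusters):
            if c:
                centroids[i] = [sum(coord) / len(c) for coord in zip(*c)]
    return centroids

def train_level(data_by_class, k):
    # One level: represent each class by (at most) k component means.
    # In HESCM this role is played by a mixture model plus an SCM.
    model = []
    for label, pts in data_by_class.items():
        for c in kmeans(pts, min(k, len(pts))):
            model.append((c, label))
    return model

def predict_level(model, x):
    # Classify by the label of the nearest component (SCM stand-in).
    return min(model, key=lambda comp: dist2(x, comp[0]))[1]

def predict_ensemble(levels, x):
    # Integrate the per-level predictions by majority vote.
    votes = Counter(predict_level(m, x) for m in levels)
    return votes.most_common(1)[0][0]

# Toy two-class data: class 0 near the origin, class 1 shifted away.
data = {
    0: [(0.0, 0.0), (0.2, 0.1), (0.1, 0.3), (0.3, 0.2)],
    1: [(2.0, 2.0), (2.2, 2.1), (1.9, 2.3), (2.1, 1.8)],
}
# Levels from fine (4 components per class) to coarse (1 per class).
levels = [train_level(data, k) for k in (4, 2, 1)]
print(predict_ensemble(levels, (0.15, 0.15)))  # -> 0
print(predict_ensemble(levels, (2.05, 2.05)))  # -> 1
```

The ensemble sidesteps choosing a single "proper" number of components: rather than committing to one granularity, every level contributes a vote, which is the intuition the abstract attributes to HESCM.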
Keywords: Support Vector Machine, Bottom Level, Adult Data, Proper Number, Virtual Sample