
Selective Ensemble Algorithms of Support Vector Machines Based on Constraint Projection

  • Lei Wang
  • Yong Yang
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5552)

Abstract

This paper proposes two novel ensemble algorithms for training support vector machines, based on a constraint-projection technique and a selective ensemble strategy. First, projection matrices are determined from randomly selected must-link and cannot-link constraint sets, and the original training samples are transformed through them into different representation spaces, in which a group of base classifiers is trained. Then, one of two selective ensemble techniques, genetic optimization or deviation-error minimization, is used to learn the best weighting vector for combining the base classifiers. Experiments on UCI datasets show that both proposed algorithms significantly improve the generalization performance of support vector machines and outperform classical ensemble algorithms such as Bagging, Boosting, feature Bagging, and LoBag.
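As a rough illustration of the pipeline the abstract describes, the sketch below projects training data through matrices built from randomly drawn pairwise constraints, trains one base classifier per projection, and then selects and weights the base classifiers by validation error. Everything here is an illustrative assumption rather than the paper's method: the projection construction, the base learner (a nearest-centroid stand-in instead of an SVM), and the selection rule (error thresholding instead of genetic optimization or deviation-error minimization) are all simplifications, and all function names are hypothetical.

```python
import numpy as np

def constraint_projection(X, y, n_dims, rng):
    """Build a d x n_dims orthonormal projection from cannot-link pairs.

    Assumption: difference vectors of randomly drawn cannot-link pairs
    (samples with different labels) serve as projection directions; the
    paper's actual constraint-projection construction may differ.
    """
    dirs = []
    while len(dirs) < n_dims:
        i, j = rng.choice(len(y), size=2, replace=False)
        if y[i] != y[j]:                       # cannot-link pair
            dirs.append(X[i] - X[j])
    Q, _ = np.linalg.qr(np.array(dirs).T)      # orthonormalise directions
    return Q                                   # shape (d, n_dims)

class NearestCentroid:
    """Tiny binary base learner used in place of an SVM for this sketch."""
    def fit(self, X, y):
        self.c0 = X[y == 0].mean(axis=0)
        self.c1 = X[y == 1].mean(axis=0)
        return self
    def predict(self, X):
        d0 = np.linalg.norm(X - self.c0, axis=1)
        d1 = np.linalg.norm(X - self.c1, axis=1)
        return (d1 < d0).astype(int)

def train_ensemble(X, y, n_base=10, n_dims=2, seed=0):
    """Train one base classifier per randomly constructed projection."""
    rng = np.random.default_rng(seed)
    models = []
    for _ in range(n_base):
        P = constraint_projection(X, y, n_dims, rng)
        models.append((P, NearestCentroid().fit(X @ P, y)))
    return models

def select_and_weight(models, X_val, y_val):
    """Keep base classifiers with below-average validation error, weighted
    equally -- a simple stand-in for the paper's selective ensemble step."""
    errs = np.array([np.mean(clf.predict(X_val @ P) != y_val)
                     for P, clf in models])
    keep = errs <= errs.mean()                 # at least the best survives
    return keep / keep.sum()

def ensemble_predict(models, weights, X):
    """Weighted majority vote over the selected base classifiers."""
    votes = np.array([clf.predict(X @ P) for P, clf in models])
    return (weights @ votes >= 0.5).astype(int)
```

On well-separated data, the selected weighted vote should match or exceed the typical base classifier, which is the qualitative behaviour the abstract claims for the full algorithms.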

Keywords

Support vector machines · Constraint projection · Selective ensemble

References

  1. Dietterich, T.G.: Machine Learning Research: Four Current Directions. AI Magazine 18, 97–136 (1997)
  2. Krogh, A., Vedelsby, J.: Neural Network Ensembles, Cross Validation, and Active Learning. In: Advances in Neural Information Processing Systems, pp. 231–238 (1995)
  3. Kuncheva, L.: Combining Pattern Classifiers: Methods and Algorithms. John Wiley and Sons, Chichester (2004)
  4. Vapnik, V.N.: The Nature of Statistical Learning Theory. Springer, New York (1995)
  5. Dong, Y.S., Han, K.S.: A Comparison of Several Ensemble Methods for Text Categorization. In: IEEE Int. Conf. on Services Computing, pp. 419–422. IEEE Press, Shanghai (2004)
  6. Tao, D.C., Tang, X.O.: Asymmetric Bagging and Random Subspace for Support Vector Machines-based Relevance Feedback in Image Retrieval. IEEE Trans. on Pattern Analysis and Machine Intelligence 28, 1088–1099 (2006)
  7. Valentini, G., Dietterich, T.: Bias-Variance Analysis of Support Vector Machines for the Development of SVM-based Ensemble Methods. Journal of Machine Learning Research, 725–775 (2004)
  8. Basu, S., Banerjee, A., Mooney, R.J.: Active Semi-supervision for Pairwise Constrained Clustering. In: Proc. of the SIAM Int. Conf. on Data Mining, Lake Buena Vista, Florida, USA, pp. 333–344 (2004)
  9. Zhou, Z.H., Wu, J., Tang, W.: Ensembling Neural Networks: Many Could Be Better than All. Artificial Intelligence 137, 239–263 (2002)
  10. Dietterich, T.: An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization. Machine Learning 40, 139–158 (2000)

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Lei Wang (1, 2)
  • Yong Yang (3)
  1. School of Economics Information Engineering, Southwest University of Finance and Economics, Chengdu, China
  2. Research Center of China Payment System, Southwest University of Finance and Economics, Chengdu, China
  3. Suminet Communication Technology (Shanghai) Co., Ltd, Shanghai, China
