Diversified SVM Ensembles for Large Data Sets

  • Ivor W. Tsang
  • Andras Kocsor
  • James T. Kwok
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4212)

Abstract

Recently, the core vector machine (CVM) has shown significant speedups on classification and regression problems with massive data sets, while remaining almost as accurate as other state-of-the-art SVM implementations. By incorporating orthogonality constraints to diversify the CVM ensembles, the resulting method also speeds up the maximum margin discriminant analysis (MMDA) algorithm. Extensive comparisons with the MMDA ensemble and with bagging on a number of large data sets show that the proposed diversified CVM ensemble improves classification performance and is faster than the original MMDA algorithm by more than an order of magnitude.
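The diversification idea described in the abstract can be illustrated with a small sketch. The code below is a minimal illustration and not the authors' CVM/MMDA implementation: it assumes scikit-learn's LinearSVC as a stand-in base learner and enforces diversity by deflating the data after each member is trained, so that every new weight vector is orthogonal to the previous ones, which is the role the orthogonality constraints play in MMDA-style ensembles.

```python
import numpy as np
from sklearn.svm import LinearSVC  # stand-in base learner, not the CVM

def orthogonal_svm_ensemble(X, y, n_members=5, C=1.0):
    """Train linear SVMs whose weight vectors are (approximately) mutually
    orthogonal, by deflating the data along each learned direction."""
    members, directions = [], []
    X_res = X.copy()
    for _ in range(n_members):
        clf = LinearSVC(C=C).fit(X_res, y)
        w = clf.coef_.ravel()
        norm = np.linalg.norm(w)
        if norm == 0:
            break  # no informative direction left in the deflated data
        w = w / norm
        members.append(clf)
        directions.append(w)
        # Remove the component along w so the next member must use
        # a direction orthogonal to all previous ones.
        X_res = X_res - np.outer(X_res @ w, w)
    return members, directions

def ensemble_predict(members, X):
    # Aggregate the signed margins of all members and take the sign.
    scores = np.sum([m.decision_function(X) for m in members], axis=0)
    return np.sign(scores)
```

Because each weight vector is fitted on data from which the previous directions have been projected out, it lies in their orthogonal complement, so applying the members directly to the raw inputs at prediction time is consistent with the deflation used during training.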

Keywords

Support Vector Machine, Ensemble Method, Ensemble Learning, Machine Learning Research, Orthogonality Constraint

References

  1. Tsang, I.W., Kwok, J.T., Cheung, P.M.: Core vector machines: Fast SVM training on very large data sets. Journal of Machine Learning Research 6, 363–392 (2005)
  2. Tsang, I.W., Kwok, J.T., Lai, K.T.: Core vector regression for very large regression problems. In: Proceedings of the Twenty-Second International Conference on Machine Learning, Bonn, Germany, pp. 913–920 (2005)
  3. Kim, H.C., Pang, S., Je, H.M., Kim, D., Bang, S.: Constructing support vector machine ensemble. Pattern Recognition 36, 2757–2767 (2003)
  4. Valentini, G., Dietterich, T.: Bias-variance analysis of support vector machines for the development of SVM-based ensemble methods. Journal of Machine Learning Research 5, 725–775 (2004)
  5. Kivinen, J., Warmuth, M.K.: Boosting as entropy projection. In: Proceedings of the Twelfth Annual Conference on Computational Learning Theory, Santa Cruz, California, United States, pp. 134–144 (1999)
  6. Kuncheva, L.I., Whitaker, C.J.: Measures of diversity in classifier ensembles and their relationship with the ensemble accuracy. Machine Learning 51, 181–207 (2003)
  7. Kocsor, A., Kovács, K., Szepesvári, C.: Margin maximizing discriminant analysis. In: Proceedings of the 15th European Conference on Machine Learning, Pisa, Italy, pp. 227–238 (2004)
  8. Mangasarian, O., Musicant, D.: Lagrangian support vector machines. Journal of Machine Learning Research 1, 161–177 (2001)
  9. Kienzle, W., Schölkopf, B.: Training support vector machines with multiple equality constraints. In: Proceedings of the European Conference on Machine Learning (2005)
  10. Ye, J., Li, T., Xiong, T., Janardan, R.: Using uncorrelated discriminant analysis for tissue classification with gene expression data. IEEE/ACM Transactions on Computational Biology and Bioinformatics 1, 181–190 (2004)

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Ivor W. Tsang (1)
  • Andras Kocsor (2)
  • James T. Kwok (1)
  1. Department of Computer Science and Engineering, Hong Kong University of Science and Technology, Clear Water Bay, Hong Kong
  2. Research Group on Artificial Intelligence, Hungarian Academy of Sciences and University of Szeged, Szeged, Hungary