
Probabilistic Model Combination for Support Vector Machine Using Positive-Definite Kernel-Based Regularization Path

  • Ning Zhao
  • Zhihui Zhao
  • Shizhong Liao
Conference paper
Part of the Advances in Intelligent and Soft Computing book series (AINSC, volume 122)

Abstract

Model combination is an important approach to improving the generalization performance of the support vector machine (SVM), but it usually suffers from low computational efficiency. In this paper, we propose a novel probabilistic model combination method for support vector machines on the regularization path (PMCRP). We first design an efficient regularization path algorithm, namely the regularization path of the support vector machine based on a positive-definite kernel (PDSVMP), which constructs the initial candidate model set. We then combine the initial models using Bayesian model averaging. Experimental results on benchmark datasets show that PMCRP has a significant advantage over cross-validation and Generalized Approximate Cross-Validation (GACV), while maintaining the high computational efficiency of model combination.
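The two-step scheme described in the abstract (train candidate SVMs along a grid of regularization values, then weight them in a Bayesian-model-averaging style) can be sketched as follows. This is a minimal illustrative sketch, not the authors' PDSVMP algorithm: it uses a plain linear SVM trained by subgradient descent on a toy dataset, a fixed grid of regularization values in place of the exact regularization path, and validation-set pseudo-likelihood weights in place of the paper's posterior. All function names and hyperparameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data with labels in {-1, +1},
# separable by the hyperplane x0 + x1 = 0.
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
X_tr, y_tr, X_va, y_va = X[:150], y[:150], X[150:], y[150:]

def train_linear_svm(X, y, lam, epochs=300, lr=0.1):
    """Subgradient descent on (1/n) * sum(hinge) + (lam/2) * ||w||^2."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        viol = y * (X @ w) < 1  # margin violators contribute to the subgradient
        grad = lam * w - (y[viol, None] * X[viol]).sum(axis=0) / n
        w -= lr * grad
    return w

# Candidate model set: one SVM per regularization value (a stand-in
# for the candidate set that PDSVMP would produce along the path).
lams = np.logspace(-3, 1, 9)
models = [train_linear_svm(X_tr, y_tr, lam) for lam in lams]

# Bayesian-model-averaging-style weights: exponentiated negative
# validation hinge loss as a pseudo-likelihood, normalized to sum to 1
# (an assumption, not the paper's exact posterior).
losses = np.array([np.maximum(0.0, 1.0 - y_va * (X_va @ w)).mean() for w in models])
scores = -len(y_va) * losses
weights = np.exp(scores - scores.max())
weights /= weights.sum()

# Combined prediction: sign of the weighted average decision value.
f_comb = sum(wt * (X_va @ w) for wt, w in zip(weights, models))
acc = float(np.mean(np.sign(f_comb) == y_va))
```

Weighting the decision values rather than the hard labels keeps the combined classifier's margin information, which is what lets well-regularized models on the path dominate the average.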

Keywords

Support Vector Machines, Model Combination, Regularization Path

References

  1. Adankon, M.M., Cheriet, M.: Optimizing resources in model selection for support vector machine. Pattern Recognition 40(3), 953–963 (2007)
  2. Chapelle, O., Vapnik, V., Bousquet, O., Mukherjee, S.: Choosing multiple parameters for support vector machines. Machine Learning 46(1), 131–159 (2002)
  3. Hastie, T., Rosset, S., Tibshirani, R., Zhu, J.: The entire regularization path for the support vector machine. Journal of Machine Learning Research 5, 1391–1415 (2004)
  4. Ong, C.J., Shao, S., Yang, J.: An improved algorithm for the solution of the regularization path of support vector machine. IEEE Transactions on Neural Networks 21(3), 451–462 (2010)
  5. Opper, M., Winther, O.: Gaussian process classification and SVM: mean field results and leave-one-out estimator. In: Smola, A.J., Bartlett, P., Schölkopf, B., Schuurmans, D. (eds.) Advances in Large Margin Classifiers, pp. 43–65. MIT Press, Cambridge (2000)
  6. Seeger, M.: Bayesian model selection for support vector machines, Gaussian processes and other kernel classifiers. In: Advances in Neural Information Processing Systems 12, pp. 603–609. MIT Press, Cambridge (2000)
  7. Sollich, P.: Bayesian methods for support vector machines: evidence and predictive class probabilities. Machine Learning 46(1-3), 21–52 (2002)
  8. Vapnik, V.N.: The Nature of Statistical Learning Theory. Springer, New York (2000)
  9. Wahba, G., Lin, Y., Zhang, H.: Generalized approximate cross validation for support vector machines, or, another way to look at margin-like quantities. Tech. rep., Department of Statistics, University of Wisconsin (1999)

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Ning Zhao (1)
  • Zhihui Zhao (1)
  • Shizhong Liao (1)
  1. School of Computer Science and Technology, Tianjin University, China
