Probabilistic Model Combination for Support Vector Machine Using Positive-Definite Kernel-Based Regularization Path
Model combination is an important approach to improving the generalization performance of the support vector machine (SVM), but it usually suffers from low computational efficiency. In this paper, we propose a novel probabilistic model combination method for the support vector machine on the regularization path (PMCRP). We first design an efficient regularization path algorithm, namely the regularization path of the support vector machine based on a positive-definite kernel (PDSVMP), which constructs the initial candidate model set. We then combine the initial models using Bayesian model averaging. Experimental results on benchmark datasets show that PMCRP offers a significant accuracy advantage over cross-validation and Generalized Approximate Cross-Validation (GACV), while maintaining the high computational efficiency of model combination.
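The two-stage idea described above (build a set of candidate SVMs along the regularization path, then combine them with Bayesian model averaging) can be sketched as follows. This is a minimal illustration, not the paper's PDSVMP algorithm: the exact regularization path is approximated here by a coarse grid of regularization values, the SVMs are trained by plain subgradient descent, and the averaging weights are derived from a logistic likelihood on held-out data; all function names and parameters are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_linear_svm(X, y, C, epochs=500, lr=0.01):
    """Train a linear SVM (L2 penalty + hinge loss, scaled by C) by subgradient descent."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        mask = margins < 1                          # margin-violating points
        grad_w = w - (C / n) * (X[mask].T @ y[mask])
        grad_b = -(C / n) * y[mask].sum()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy data: labels from a noisy linear rule
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0] + X[:, 1] + 0.3 * rng.normal(size=200))
X_tr, y_tr, X_val, y_val = X[:120], y[:120], X[120:], y[120:]

# Candidate models on a coarse regularization grid (stand-in for the exact path)
Cs = [0.1, 1.0, 10.0]
models = [train_linear_svm(X_tr, y_tr, C) for C in Cs]

# BMA-style weights: posterior weight proportional to exp(validation log-likelihood),
# with the likelihood approximated by a logistic model of the SVM decision values
log_liks = np.array([
    np.sum(-np.log1p(np.exp(-y_val * (X_val @ w + b)))) for w, b in models
])
weights = np.exp(log_liks - log_liks.max())
weights /= weights.sum()

def predict(X_new):
    """Combined prediction: weighted average of candidate decision functions."""
    F = np.stack([X_new @ w + b for w, b in models])  # (n_models, n_points)
    return np.sign(weights @ F)

acc = np.mean(predict(X_val) == y_val)
print(f"combined validation accuracy: {acc:.2f}")
```

In the paper's setting, the grid of `C` values is replaced by the full piecewise-linear regularization path produced by PDSVMP, so every candidate model along the path is available at roughly the cost of a single SVM fit.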
Keywords: Support Vector Machines · Model Combination · Regularization Path
- 5. Opper, M., Winther, O.: Gaussian process classification and SVM: mean field results and leave-one-out estimator. In: Smola, A.J., Bartlett, P., Schölkopf, B., Schuurmans, D. (eds.) Advances in Large Margin Classifiers, pp. 43–65. MIT Press, Cambridge (2000)
- 6. Seeger, M.: Bayesian model selection for support vector machines, Gaussian processes and other kernel classifiers. In: Advances in Neural Information Processing Systems 12, pp. 603–609. MIT Press, Cambridge (2000)
- 9. Wahba, G., Lin, Y., Zhang, H.: Generalized approximate cross validation for support vector machines, or, another way to look at margin-like quantities. Tech. rep., Department of Statistics, University of Wisconsin (1999)