Abstract
The success of the support vector machine depends on the tuning of several parameters that affect the generalization error. An effective approach is to estimate the generalization error and then search for the parameters that minimize this estimate. This requires estimators that are both accurate and computationally efficient. The leave-one-out (LOO) method is the extreme case of cross-validation: a single point is excluded from the training set, the classifier is trained on the remaining points, and it is then determined whether this new classifier correctly labels the excluded point. The process is repeated over the entire training set, and the LOO error is computed by averaging over these trials. The LOO error provides an almost unbiased estimate of the generalization error. However, one shortcoming of the LOO method is that it is highly time-consuming, so methods are sought to speed up the process. An effective approach is to approximate the LOO error by an upper bound that is a function of the parameters, and then search for the parameters that minimize this upper bound. This approach has been developed successfully for both the support vector classification machine and the support vector regression machine. In this chapter we introduce further LOO bounds for several support vector machine algorithms.
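To make the procedure concrete, the following is a minimal sketch of brute-force LOO error estimation and parameter selection for an SVM. It uses scikit-learn and a synthetic dataset purely for illustration (both are assumptions, not part of this chapter); the chapter's contribution is precisely to replace the expensive `loo_error` computation below with a cheap analytic upper bound.

```python
# A minimal sketch of leave-one-out (LOO) error estimation for an SVM.
# Assumes scikit-learn and synthetic data; the chapter derives analytic
# LOO upper bounds rather than computing LOO by brute force as done here.
from sklearn.datasets import make_classification
from sklearn.model_selection import LeaveOneOut
from sklearn.svm import SVC

X, y = make_classification(n_samples=60, n_features=5, random_state=0)

def loo_error(C, gamma):
    """Brute-force LOO error: train on all points but one, test on the
    held-out point, and average the misclassifications."""
    errors = 0
    for train_idx, test_idx in LeaveOneOut().split(X):
        clf = SVC(C=C, gamma=gamma).fit(X[train_idx], y[train_idx])
        errors += clf.predict(X[test_idx])[0] != y[test_idx][0]
    return errors / len(X)

# Model selection: pick the parameters minimizing the LOO estimate.
# Each candidate costs n retrainings; an LOO upper bound that is a
# function of (C, gamma) would stand in for loo_error here.
best = min((loo_error(C, g), C, g)
           for C in (0.1, 1.0, 10.0)
           for g in (0.01, 0.1, 1.0))
print("LOO error %.3f at C=%s, gamma=%s" % best)
```

Note that the grid search retrains the classifier n times per parameter candidate, which is what makes a closed-form upper bound attractive for model selection.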