
Part of the book series: Advanced Information and Knowledge Processing (AI&KP)


Abstract

The success of the support vector machine depends on the tuning of several parameters that affect the generalization error. An effective approach is to estimate the generalization error and then search for the parameters that minimize this estimate. This requires estimators that are both effective and computationally efficient. The leave-one-out (LOO) method is the extreme case of cross-validation: a single point is excluded from the training set, the classifier is trained on the remaining points, and it is then determined whether the new classifier correctly labels the excluded point. The process is repeated over the entire training set, and the LOO error is the average error over these trials. The LOO error provides an almost unbiased estimate of the generalization error. However, the LOO method is highly time-consuming, so methods are sought to speed it up. An effective approach is to bound the LOO error from above by a function of the parameters and then search for the parameters that minimize this upper bound. This approach has been developed successfully for both the support vector classification machine and the support vector regression machine. In this chapter we introduce further LOO bounds for several support vector machine algorithms.
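To make the procedure concrete, here is a minimal sketch of both ideas in the abstract: the exact LOO estimate, which retrains the classifier n times and is therefore expensive, and a cheap upper bound, here the classical support-vector-count bound (the LOO error is at most the fraction of support vectors), which needs only a single training run. scikit-learn's SVC, the synthetic dataset, and the parameter grid are illustrative assumptions, not the chapter's setup or the specific bounds it derives.

```python
# A minimal sketch of the ideas in the abstract, not the chapter's own
# algorithms: compute the exact LOO error of a kernel SVM, compare it with
# the classical support-vector-count upper bound, and pick the parameters
# minimizing the LOO estimate. The dataset and grid are illustrative.
from sklearn.datasets import make_classification
from sklearn.model_selection import LeaveOneOut
from sklearn.svm import SVC

X, y = make_classification(n_samples=60, n_features=5, random_state=0)
n = len(X)

def loo_error(C, gamma):
    # Exact LOO estimate: retrain n times, each time holding out one point.
    errors = 0
    for train_idx, test_idx in LeaveOneOut().split(X):
        clf = SVC(C=C, gamma=gamma).fit(X[train_idx], y[train_idx])
        errors += int(clf.predict(X[test_idx])[0] != y[test_idx][0])
    return errors / n

def sv_fraction_bound(C, gamma):
    # Support-vector-count bound: the LOO error is at most the fraction of
    # support vectors, and it needs only a single training run.
    clf = SVC(C=C, gamma=gamma).fit(X, y)
    return clf.n_support_.sum() / n

# Parameter search: choose (C, gamma) that minimizes the LOO estimate.
grid = [(C, g) for C in (0.1, 1.0, 10.0) for g in (0.01, 0.1, 1.0)]
best = min(grid, key=lambda p: loo_error(*p))
print("best (C, gamma):", best)
print("LOO error:", loo_error(*best), "SV bound:", sv_fraction_bound(*best))
```

Even on this toy problem, the exact LOO search costs 9 × 60 training runs against nine runs for the bound, which is precisely why upper bounds that can be minimized directly over the parameters are attractive.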



Author information

Correspondence to Yong Shi.


Copyright information

© 2011 Springer-Verlag London Limited

About this chapter

Cite this chapter

Shi, Y., Tian, Y., Kou, G., Peng, Y., Li, J. (2011). LOO Bounds for Support Vector Machines. In: Optimization Based Data Mining: Theory and Applications. Advanced Information and Knowledge Processing. Springer, London. https://doi.org/10.1007/978-0-85729-504-0_2


  • DOI: https://doi.org/10.1007/978-0-85729-504-0_2

  • Publisher Name: Springer, London

  • Print ISBN: 978-0-85729-503-3

  • Online ISBN: 978-0-85729-504-0

  • eBook Packages: Computer Science (R0)
