Neural Computing and Applications, Volume 18, Issue 7, pp 731–748

Leave-one-out bounds for support vector ordinal regression machine

Original Article

Abstract

The success of the support vector machine (SVM) depends on the choice of its parameters. The leave-one-out (LOO) method provides a quantitative criterion for selecting them, but it is highly time consuming, since the machine must be retrained once for every training example. An effective remedy is to approximate the LOO error by an upper bound. This paper is concerned with the support vector ordinal regression machine (SVORM). Two bounds on the LOO error of SVORM are presented: the first is based on the geometrical concept of the span, and the second on the concept of a support vector. Preliminary numerical experiments confirm the validity of the bounds.
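To make the cost concrete, the exact LOO error can be sketched as follows. This is an illustrative example, not code from the paper: a toy 1-nearest-neighbour rule stands in for SVORM, and the point is only that evaluating the LOO error requires one retraining per example, which is what motivates replacing it with a cheap upper bound.

```python
# Illustrative sketch (not from the paper): the exact leave-one-out error is
# computed by retraining once per held-out example. For n examples this means
# n trainings, which is why an upper bound on the LOO error is attractive.
# The 1-nearest-neighbour "trainer" below is a hypothetical stand-in for SVORM.

def nearest_neighbour_predict(train, x):
    """Predict the (ordinal) label of x as that of its nearest training point."""
    _, label = min(train, key=lambda pair: abs(pair[0] - x))
    return label

def leave_one_out_error(data):
    """Fraction of points misclassified when each is held out in turn."""
    errors = 0
    for i, (x, y) in enumerate(data):
        held_in = data[:i] + data[i + 1:]  # "retrain" without point i
        if nearest_neighbour_predict(held_in, x) != y:
            errors += 1
    return errors / len(data)

# Toy ordinal data: rank 1 clustered near 0, rank 2 near 10, one point in between.
data = [(0.0, 1), (1.0, 1), (9.0, 2), (10.0, 2), (4.0, 2)]
print(leave_one_out_error(data))  # one of five points is misclassified: 0.2
```

In practice one would run this loop for every candidate parameter setting, so the n-fold retraining cost multiplies with the size of the parameter grid; the bounds in this paper are designed to avoid exactly that cost.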

Keywords

Support vector ordinal regression machine · Leave-one-out · Leave-one-out bound

Acknowledgments

We would like to thank the anonymous reviewers for their concrete and helpful comments, which have greatly improved this paper.

Copyright information

© Springer-Verlag London Limited 2008

Authors and Affiliations

  1. College of Mathematics and Systems Science, Xinjiang University, Urumqi, People’s Republic of China
  2. Academy of Mathematics and Systems Science, CAS, Beijing, People’s Republic of China
  3. Research Center on Fictitious Economy and Data Science, CAS, Beijing, People’s Republic of China
  4. College of Science, China Agricultural University, Beijing, People’s Republic of China