
Regularization Paths for ν-SVM and ν-SVR

  • Gaëlle Loosli
  • Gilles Gasso
  • Stéphane Canu
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4493)

Abstract

This paper presents the full regularization paths of ν-SVM and ν-SVR, along with a leave-one-out inspired stopping criterion and an efficient implementation. In the ν-SVR method, two parameters are provided by the user: the regularization parameter C and ν, which controls the width of the ε-tube. In the classical ν-SVM method, the parameter ν is a lower bound on the fraction of support vectors in the solution. Building on the previous works of [1,2], extensions of the regularization paths for SVM and SVR are proposed, which make it possible to compute the solution path automatically by varying ν or the regularization parameter.
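The path-following algorithms in this line of work (e.g., [1]) exploit the fact that the solution changes in a simple, trackable way as the hyperparameter varies. As a rough point of comparison only, the effect described above can be observed by brute-force refitting with scikit-learn's NuSVR; this is a minimal sketch, not the authors' path algorithm, and the dataset and grid of ν values are invented for illustration. Increasing ν shrinks the ε-tube and recruits a larger fraction of support vectors, consistent with ν being a lower bound on that fraction.

# Minimal sketch (not the paper's algorithm): sweep nu at fixed C
# with scikit-learn's NuSVR and watch the fraction of support vectors.
import numpy as np
from sklearn.svm import NuSVR

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))          # toy 1-D inputs
y = np.sinc(X).ravel() + 0.1 * rng.standard_normal(200)  # noisy sinc target

for nu in np.linspace(0.05, 0.95, 10):
    model = NuSVR(nu=nu, C=1.0, kernel="rbf").fit(X, y)
    frac_sv = len(model.support_) / len(X)          # nu lower-bounds this fraction
    print(f"nu = {nu:.2f}  fraction of support vectors = {frac_sv:.2f}")

The paper's contribution is to trace this dependence exactly and efficiently rather than by refitting at each grid point.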

Keywords

Support Vector Machine, Support Vector, Regularization Parameter, Support Vector Regression, Generalization Error


References

  1. Hastie, T., Rosset, S., Tibshirani, R., Zhu, J.: The entire regularization path for the support vector machine. Journal of Machine Learning Research 5, 1391–1415 (2004)
  2. Gunter, L., Zhu, J.: Computing the solution path for the regularized support vector regression. In: NIPS (2005)
  3. Chen, P.H., Lin, C.J., Schölkopf, B.: A tutorial on ν-support vector machines. Applied Stochastic Models in Business and Industry 21, 111–136 (2005)
  4. Schölkopf, B., Smola, A.: Learning with Kernels. MIT Press, Cambridge (2001)
  5. Argyriou, A., Hauser, R., Micchelli, C.A., Pontil, M.: A DC-programming algorithm for kernel selection. In: ICML (2006)
  6. Micchelli, C.A., Pontil, M.: Learning the kernel function via regularization. Journal of Machine Learning Research 6, 1099–1125 (2005)
  7. Bach, F., Heckerman, D., Horvitz, E.: On the path to an ideal ROC curve: Considering cost asymmetry in learning classifiers. In: Cowell, R.G., Ghahramani, Z. (eds.) AISTATS, Society for Artificial Intelligence and Statistics, pp. 9–16 (2005)
  8. Wahba, G.: Support Vector Machines, Reproducing Kernel Hilbert Spaces and the randomized GACV. In: Schölkopf, B., Burges, C., Smola, A. (eds.) Advances in Kernel Methods: Support Vector Learning, pp. 69–88. MIT Press, Cambridge (1999)
  9. Vishwanathan, S.V.N., Smola, A.J., Murty, M.N.: SimpleSVM. In: Proceedings of the Twentieth International Conference on Machine Learning (2003)
  10. Wang, G., Yeung, D.Y., Lochovsky, F.: Two-dimensional solution path for support vector regression. In: Proc. of the 23rd International Conference on Machine Learning, ICML (2006)
  11. Lee, J.H., Lin, C.J.: Automatic model selection for support vector machines. Technical report, Dept. of Computer Science and Information Engineering, National Taiwan University (2000)
  12. Lee, M., Keerthi, S., Ong, C.J., DeCoste, D.: An efficient method for computing leave-one-out error in support vector machines with Gaussian kernels. IEEE Transactions on Neural Networks 15, 750–757 (2004)

Copyright information

© Springer Berlin Heidelberg 2007

Authors and Affiliations

  • Gaëlle Loosli (1)
  • Gilles Gasso (1)
  • Stéphane Canu (1)

  1. LITIS, EA 4051, Rouen, France
