
A Simple Maximum Gain Algorithm for Support Vector Regression

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 5517)

Abstract

Shevade et al.'s Modification 2 is one of the most widely used algorithms for building Support Vector Regression (SVR) models. It selects as its size-2 working set the index pair giving the maximum KKT violation and combines this with the updating heuristics of Smola and Schölkopf, enforcing at each training iteration the condition \(\alpha_i \alpha^*_i = 0\). In this work we present an alternative, much simpler procedure that selects the updating indices as those giving the maximum gain in the SVR dual function. While we do not try to enforce the \(\alpha_i \alpha^*_i = 0\) condition, we show that it holds at every iteration provided it holds for the starting multipliers. We show numerically that the proposed procedure requires essentially the same number of iterations as Modification 2, and thus has the same time performance, while being much simpler to code.
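
The selection rule is easy to state concretely. Below is a minimal NumPy sketch of the idea as we read it from the abstract: maximum-gain pair updates on the \(\varepsilon\)-SVR dual. It is our own illustrative reconstruction, not the authors' implementation; the stacked \((\alpha, \alpha^*)\) parametrization, the exhaustive pair search, and every name in it (svr_max_gain, rbf_kernel, the parameters C, eps, tol) are assumptions made for the example.

    import numpy as np

    def rbf_kernel(X, gamma=1.0):
        # Dense RBF Gram matrix; adequate for a small demonstration.
        sq = np.sum(X ** 2, axis=1)
        return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))

    def svr_max_gain(K, y, C=1.0, eps=0.1, tol=1e-6, max_iter=2000):
        # Sketch of maximum-gain pair updates on the eps-SVR dual.
        # Multipliers are stacked as a = (alpha, alpha*) with signs
        # s = (+1,...,+1,-1,...,-1); in minimization form the dual is
        #   f(a) = 1/2 (s*a)^T Kb (s*a) + sum_k (eps - s_k y_{k mod n}) a_k
        # s.t. sum_k s_k a_k = 0, 0 <= a_k <= C, Kb[k,l] = K[k mod n, l mod n].
        # The pair step a_u += s_u*t, a_v -= s_v*t keeps the equality
        # constraint for every t, so only the box bounds clip t.
        y = np.asarray(y, dtype=float)
        n = len(y)
        s = np.concatenate([np.ones(n), -np.ones(n)])
        idx = np.concatenate([np.arange(n), np.arange(n)])
        Kb = K[np.ix_(idx, idx)]              # kernel over stacked indices
        diag = np.diag(Kb)
        a = np.zeros(2 * n)                   # feasible start, alpha_i * alpha*_i = 0
        G = eps - s * y[idx]                  # gradient of f at a = 0

        for _ in range(max_iter):
            best_gain, best = 0.0, None
            for u in range(2 * n):
                d = s[u] * G[u] - s * G       # directional derivative, all v at once
                eta = np.maximum(diag[u] + diag - 2.0 * Kb[u], 1e-12)
                t = -d / eta                  # unconstrained Newton step
                # Clip t so that both a_u and a_v stay inside [0, C].
                lo_u, hi_u = (-a[u], C - a[u]) if s[u] > 0 else (a[u] - C, a[u])
                lo_v = np.where(s > 0, a - C, -a)
                hi_v = np.where(s > 0, a, C - a)
                t = np.clip(t, np.maximum(lo_u, lo_v), np.minimum(hi_u, hi_v))
                gain = -(d * t + 0.5 * eta * t * t)   # dual decrease of the step
                gain[u] = -np.inf             # exclude the trivial pair (u, u)
                v = int(np.argmax(gain))
                if gain[v] > best_gain:
                    best_gain, best = gain[v], (u, v, t[v])
            if best is None or best_gain <= tol:
                break                         # no pair yields a useful gain
            u, v, t = best
            a[u] += s[u] * t
            a[v] -= s[v] * t
            G += s * t * (Kb[u] - Kb[v])      # rank-two gradient update
        return a[:n] - a[n:]                  # beta_i = alpha_i - alpha*_i

    if __name__ == "__main__":
        # Toy usage: fit a noisy sine curve (bias term omitted for brevity).
        rng = np.random.default_rng(0)
        X = rng.uniform(-3.0, 3.0, size=(50, 1))
        y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
        K = rbf_kernel(X, gamma=0.5)
        beta = svr_max_gain(K, y, C=10.0, eps=0.1)
        print("support vectors:", int(np.sum(np.abs(beta) > 1e-8)))

In this sketch a pair coupling \(\alpha_i\) with \(\alpha^*_i\) always has zero clipped gain from a feasible start, which matches the abstract's observation that \(\alpha_i \alpha^*_i = 0\) is preserved without being enforced. The exhaustive O(n^2) pair scan is kept only for clarity; a practical solver would restrict the candidate pairs, for instance choosing the first index by maximal KKT violation as in second-order SMO [6].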


References

  1. Smola, A.J., Schölkopf, B.: A tutorial on support vector regression. Statistics and Computing 14, 199–222 (2004)

  2. Vapnik, V.: The Nature of Statistical Learning Theory. Springer, New York (1995)

  3. Hsu, C.-W., Chang, C.-C., Lin, C.-J.: A practical guide to support vector classification, www.csie.ntu.edu.tw/~cjlin/libsvmtools

  4. Shevade, S.K., Keerthi, S.S., Bhattacharyya, C., Murthy, K.R.K.: Improvements to the SMO algorithm for SVM regression. IEEE Transactions on Neural Networks 11, 1188–1193 (2000)

  5. Chang, C.-C., Lin, C.-J.: LIBSVM regression dataset repository, http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/regression.html

  6. Glasmachers, T., Igel, C.: Second order SMO improves SVM online and active learning. Neural Computation 20(2), 374–382 (2008)


Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Barbero, Á., Dorronsoro, J.R. (2009). A Simple Maximum Gain Algorithm for Support Vector Regression. In: Cabestany, J., Sandoval, F., Prieto, A., Corchado, J.M. (eds) Bio-Inspired Systems: Computational and Ambient Intelligence. IWANN 2009. Lecture Notes in Computer Science, vol 5517. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-02478-8_10

  • DOI: https://doi.org/10.1007/978-3-642-02478-8_10

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-02477-1

  • Online ISBN: 978-3-642-02478-8

  • eBook Packages: Computer Science, Computer Science (R0)
