Abstract
Shevade et al.'s Modification 2 is one of the most widely used algorithms for building Support Vector Regression (SVR) models. It selects as its size-2 working set the index pair giving the maximum KKT violation and combines it with the updating heuristics of Smola and Schölkopf, enforcing at each training iteration the condition \(\alpha_i \alpha^*_i = 0\). In this work we present an alternative, much simpler procedure that selects the updating indices as those giving a maximum gain in the SVR dual function. Although we make no effort to enforce the \(\alpha_i \alpha^*_i = 0\) condition, we show that it holds at every iteration provided it holds at the starting multipliers. We demonstrate numerically that the proposed procedure requires essentially the same number of iterations as Modification 2, and thus has the same time performance, while being much simpler to code.
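The full paper is behind a paywall, but the idea of maximum-gain working-set selection can be sketched under simplifying assumptions: write \(\beta_i = \alpha_i - \alpha^*_i\), drop the bias term (so there is no equality constraint on the multipliers), and at each iteration pick the single coordinate whose optimal clipped step gives the largest increase in the quadratic dual. The helper names below (`svr_dual_gain`, `max_gain_step`) are illustrative, not from the paper, and this is a one-coordinate simplification of the two-index scheme the abstract describes.

```python
def svr_dual_gain(K, grad, beta, C, i):
    """Gain from optimally updating beta[i] alone in the quadratic dual.

    The unconstrained Newton step is grad[i] / K[i][i]; it is clipped so
    beta[i] stays in the box [-C, C], and the gain of the clipped step
    delta is grad[i]*delta - 0.5*K[i][i]*delta**2.
    """
    delta = grad[i] / K[i][i]
    delta = max(-C - beta[i], min(C - beta[i], delta))  # box clipping
    return grad[i] * delta - 0.5 * K[i][i] * delta ** 2, delta


def max_gain_step(K, y, beta, C):
    """Pick the index with maximum dual gain and apply its update.

    Maximizes W(beta) = -(1/2) beta^T K beta + y^T beta over one
    coordinate per call; returns the chosen index and its gain.
    """
    n = len(beta)
    # Gradient of the dual objective with respect to beta.
    grad = [y[i] - sum(K[i][j] * beta[j] for j in range(n))
            for i in range(n)]
    best_i, best_gain, best_delta = -1, 0.0, 0.0
    for i in range(n):
        gain, delta = svr_dual_gain(K, grad, beta, C, i)
        if gain > best_gain:
            best_i, best_gain, best_delta = i, gain, delta
    if best_i >= 0:
        beta[best_i] += best_delta
    return best_i, best_gain
```

With `K` the identity, `y = [1.0, 2.0]`, and `beta = [0.0, 0.0]`, the candidate gains are 0.5 and 2.0, so the step updates index 1 and sets `beta[1] = 2.0`. In the actual paper the selection is over index pairs and respects the full SVR dual (including the \(\varepsilon\)-insensitive term); this sketch only conveys the "select by maximum gain" principle.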
References
Smola, A.J., Schölkopf, B.: A tutorial on support vector regression. Statistics and Computing 14, 199–222 (2004)
Vapnik, V.: The Nature of Statistical Learning Theory. Springer, New York (1995)
Hsu, C.-W., Chang, C.-C., Lin, C.-J.: A practical guide to support vector classification, www.csie.ntu.edu.tw/~cjlin/libsvmtools
Shevade, S.K., Keerthi, S.S., Bhattacharyya, C., Murthy, K.R.K.: Improvements to the SMO algorithm for SVM regression. IEEE Transactions on Neural Networks 11, 1188–1193 (2000)
Chang, C.-C., Lin, C.-J.: LIBSVM regression dataset repository, http://www.csie.ntu.edu.tw/cjlin/libsvmtools/datasets/regression.html
Glasmachers, T., Igel, C.: Second order SMO improves SVM online and active learning. Neural Computation 20(2), 374–382 (2008)
© 2009 Springer-Verlag Berlin Heidelberg
Barbero, Á., Dorronsoro, J.R. (2009). A Simple Maximum Gain Algorithm for Support Vector Regression. In: Cabestany, J., Sandoval, F., Prieto, A., Corchado, J.M. (eds) Bio-Inspired Systems: Computational and Ambient Intelligence. IWANN 2009. Lecture Notes in Computer Science, vol 5517. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-02478-8_10
Print ISBN: 978-3-642-02477-1
Online ISBN: 978-3-642-02478-8