A Simple Maximum Gain Algorithm for Support Vector Regression
Shevade et al.'s Modification 2 is one of the most widely used algorithms for building Support Vector Regression (SVR) models. It selects as a size-2 working set the index pair giving the maximum KKT violation and combines this with the updating heuristics of Smola and Schölkopf, enforcing the condition \(\alpha_i \alpha_i^* = 0\) at each training iteration. In this work we present an alternative, much simpler procedure that selects the updating indices as those giving the maximum gain in the SVR dual function. Although we do not explicitly enforce the condition \(\alpha_i \alpha_i^* = 0\), we show that it holds at every iteration provided it holds for the starting multipliers. We demonstrate numerically that the proposed procedure requires essentially the same number of iterations as Modification 2, and thus has the same time performance, while being much simpler to code.
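The maximum-gain idea can be illustrated with a toy sketch. The snippet below is not the paper's implementation: it assumes \(\varepsilon = 0\) (so the dual objective is a smooth quadratic) and works with the combined multipliers \(\beta_i = \alpha_i - \alpha_i^*\), which satisfy \(-C \le \beta_i \le C\) and \(\sum_i \beta_i = 0\). The function names (`max_gain_step`, `solve`) are ours, chosen for illustration only.

```python
import numpy as np

def max_gain_step(K, g, beta, C):
    """Scan all index pairs (i, j) and return the one whose box-clipped
    update beta_i += t, beta_j -= t yields the largest increase of the
    dual  y^T beta - 0.5 * beta^T K beta  (illustrative toy, eps = 0).
    g must be the current dual gradient, g = y - K @ beta."""
    n = len(beta)
    best = (0.0, None, None, 0.0)  # (gain, i, j, step t)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            eta = K[i, i] + K[j, j] - 2.0 * K[i, j]  # curvature of the pair
            if eta <= 1e-12:
                continue
            t = (g[i] - g[j]) / eta  # unconstrained optimal step
            # clip t so both beta_i + t and beta_j - t stay in [-C, C]
            t = min(t, C - beta[i], beta[j] + C)
            t = max(t, -C - beta[i], beta[j] - C)
            gain = t * (g[i] - g[j]) - 0.5 * eta * t * t
            if gain > best[0]:
                best = (gain, i, j, t)
    return best

def solve(K, y, C, iters=200, tol=1e-8):
    """Repeat maximum-gain pair updates until no pair improves the dual."""
    n = len(y)
    beta = np.zeros(n)
    g = y.copy()  # gradient at beta = 0
    for _ in range(iters):
        gain, i, j, t = max_gain_step(K, g, beta, C)
        if i is None or gain < tol:
            break
        beta[i] += t
        beta[j] -= t
        g -= t * (K[:, i] - K[:, j])  # incremental gradient update
    return beta
```

Note that the \(\beta\) representation makes the \(\alpha_i \alpha_i^* = 0\) condition automatic: recovering \(\alpha_i = \max(\beta_i, 0)\) and \(\alpha_i^* = \max(-\beta_i, 0)\) ensures at most one of the two is nonzero.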