Abstract
The decomposition method is currently one of the major approaches to solving the convex quadratic optimization problems associated with Support Vector Machines (SVM optimization). A key issue in this approach is the policy for working set selection. We would like to find policies that achieve, as far as possible, three goals simultaneously: "(fast) convergence to an optimal solution", "efficient procedures for working set selection", and "a high degree of generality" (covering typical variants of SVM optimization as special cases). In this paper, we study a general policy for working set selection that was proposed quite recently and is known to lead to convergence for any convex quadratic optimization problem. Here, we investigate its computational complexity when it is applied to SVM optimization. We show that it poses an NP-complete working set selection problem, but that a slight variation of it (sharing the convergence properties of the original policy) can be solved in polynomial time. We show furthermore that so-called "rate certifying pairs" (introduced by Hush and Scovel) can be found in linear time, which leads to a quite efficient decomposition method with a polynomial convergence rate for SVM optimization.
This work was supported in part by the IST Programme of the European Community, under the PASCAL Network of Excellence, IST-2002-506778. This publication only reflects the authors’ views. This work was furthermore supported by the Deutsche Forschungsgemeinschaft Grant SI 498/7-1.
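The linear-time pair selection mentioned in the abstract can be illustrated by the closely related "maximal violating pair" heuristic (in the style of Keerthi et al.), which also finds a working pair in a single linear scan over the dual variables. The sketch below is an assumption-laden illustration, not the paper's exact rate-certifying construction: it assumes the standard SVM dual with box constraints 0 ≤ α ≤ C and equality constraint yᵀα = 0.

```python
import numpy as np

def select_working_pair(alpha, grad, y, C, tol=1e-12):
    """Pick a maximal KKT-violating pair (i, j) in one linear scan.

    Illustrative sketch only (maximal-violating-pair heuristic, not the
    paper's rate-certifying construction). Inputs: current dual vector
    `alpha`, gradient `grad` of the dual objective at `alpha`, labels
    `y` in {-1, +1}, and box bound `C`.
    """
    F = -y * grad
    # Index sets from the KKT conditions of the dual problem:
    in_up = ((alpha < C - tol) & (y > 0)) | ((alpha > tol) & (y < 0))
    in_low = ((alpha < C - tol) & (y < 0)) | ((alpha > tol) & (y > 0))
    # Mask out indices not eligible for the respective set.
    F_up = np.where(in_up, F, -np.inf)
    F_low = np.where(in_low, F, np.inf)
    i = int(np.argmax(F_up))
    j = int(np.argmin(F_low))
    gap = F_up[i] - F_low[j]  # gap > 0 means the KKT conditions are violated
    return i, j, gap
```

For example, at the starting point α = 0 with C = 1 and objective gradient −1 in every coordinate, the scan returns the first eligible positive and negative examples as the working pair.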
References
Boser, B.E., Guyon, I.M., Vapnik, V.N.: A training algorithm for optimal margin classifiers. In: Proceedings of the 5th Annual ACM Workshop on Computational Learning Theory, pp. 144–152. ACM Press, New York (1992)
Chang, C.-C., Hsu, C.-W., Lin, C.-J.: The analysis of decomposition methods for support vector machines. IEEE Transactions on Neural Networks 11(4), 1003–1008 (2000)
Hsu, C.-W., Lin, C.-J.: A simple decomposition method for support vector machines. Machine Learning 46(1-3), 291–314 (2002)
Hush, D., Scovel, C.: Polynomial-time decomposition algorithms for support vector machines. Machine Learning 51, 51–71 (2003)
Ibarra, O.H., Kim, C.E.: Fast approximation algorithms for the knapsack and sum of subset problems. Journal of the Association for Computing Machinery 22(4), 463–468 (1975)
Joachims, T.: Making large scale SVM learning practical. In: Schölkopf, B., Burges, C.J.C., Smola, A.J. (eds.) Advances in Kernel Methods—Support Vector Learning, pp. 169–184. MIT Press, Cambridge (1998)
Karp, R.M.: Reducibility among combinatorial problems. In: Miller, R.E., Thatcher, J.W. (eds.) Complexity of Computer Computations, pp. 85–103. Plenum Press, New York (1972)
Keerthi, S.S., Gilbert, E.G.: Convergence of a generalized SMO algorithm for SVM classifier design. Machine Learning 46, 351–360 (2002)
Keerthi, S.S., Shevade, S., Bhattacharyya, C., Murthy, K.: Improvements to SMO algorithm for SVM regression. IEEE Transactions on Neural Networks 11(5), 1188–1193 (2000)
Keerthi, S.S., Shevade, S., Bhattacharyya, C., Murthy, K.: Improvements to Platt’s SMO algorithm for SVM classifier design. Neural Computation 13, 637–649 (2001)
Laskov, P.: An improved decomposition algorithm for regression support vector machines. Machine Learning 46, 315–350 (2002)
Liao, S.-P., Lin, H.-T., Lin, C.-J.: A note on the decomposition methods for support vector regression. Neural Computation 14, 1267–1281 (2002)
Lin, C.-J.: On the convergence of the decomposition method for support vector machines. IEEE Transactions on Neural Networks 12, 1288–1298 (2001)
Lin, C.-J.: Asymptotic convergence of an SMO algorithm without any assumptions. IEEE Transactions on Neural Networks 13, 248–250 (2002)
Lin, C.-J.: A formal analysis of stopping criteria of decomposition methods for support vector machines. IEEE Transactions on Neural Networks 13, 1045–1052 (2002)
List, N., Simon, H.U.: A general convergence theorem for the decomposition method. In: Proceedings of the 17th Annual Conference on Computational Learning Theory, pp. 363–377. Springer, Heidelberg (2004)
Mangasarian, O.L., Musicant, D.R.: Successive overrelaxation for support vector machines. IEEE Transactions on Neural Networks 10, 1032–1037 (1999)
Mangasarian, O.L., Musicant, D.R.: Active support vector machine classification. Advances in Neural Information Processing Systems 12, 577–583 (2000)
Mangasarian, O.L., Musicant, D.R.: Lagrangian support vector machines. Journal of Machine Learning Research 1, 161–177 (2001)
Osuna, E.E., Freund, R., Girosi, F.: Training support vector machines: an application to face detection. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, pp. 130–136 (1997)
Platt, J.C.: Fast training of support vector machines using sequential minimal optimization. In: Schölkopf, B., Burges, C.J.C., Smola, A.J. (eds.) Advances in Kernel Methods—Support Vector Learning, pp. 185–208. MIT Press, Cambridge (1998)
Saunders, C., Stitson, M.O., Weston, J., Bottou, L., Schölkopf, B., Smola, A.J.: Support vector machine reference manual. Technical Report CSD-TR-98-03, Royal Holloway, University of London, Egham, UK (1998)
Vapnik, V.: Statistical Learning Theory. Series on Adaptive and Learning Systems for Signal Processing, Communications, and Control. John Wiley & Sons, Chichester (1998)
Copyright information
© 2004 Springer-Verlag Berlin Heidelberg
Cite this paper
Simon, H.U. (2004). On the Complexity of Working Set Selection. In: Ben-David, S., Case, J., Maruoka, A. (eds.) Algorithmic Learning Theory. ALT 2004. Lecture Notes in Computer Science, vol. 3244. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-30215-5_25
Print ISBN: 978-3-540-23356-5
Online ISBN: 978-3-540-30215-5