
Training Support Vector Machines via SMO-Type Decomposition Methods

  • Conference paper
Algorithmic Learning Theory (ALT 2005)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 3734)


Abstract

This article gives a comprehensive study of SMO-type (Sequential Minimal Optimization) decomposition methods for training support vector machines. We propose a general and flexible rule for selecting the two-element working set. The main theoretical results are 1) a simple asymptotic convergence proof, 2) a useful explanation of the shrinking and caching techniques, and 3) the linear convergence of the method. The analysis applies to any SMO-type implementation whose working-set selection falls within the proposed framework.
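
To make the abstract concrete, the sketch below illustrates the general flavour of an SMO-type decomposition method on the standard SVM dual, using the classical maximal-violating-pair selection rule as one instance of the kind of two-element working-set selection the paper generalizes. It is not the paper's framework or code: the function name smo_train, the NumPy implementation, the precomputed kernel matrix K, and the numerical tolerances are our illustrative assumptions.

```python
# A minimal SMO-type decomposition sketch for the SVM dual
#     min_alpha  0.5 * alpha^T Q alpha - e^T alpha
#     s.t.       y^T alpha = 0,  0 <= alpha_t <= C,
# where Q[t, s] = y_t * y_s * K(x_t, x_s).
#
# Working-set selection below is the classical "maximal violating pair"
# rule, used here only as one concrete instance of two-element selection;
# it is NOT the paper's general framework.
import numpy as np


def smo_train(K, y, C=1.0, eps=1e-3, max_iter=10000):
    """K: precomputed kernel matrix (n x n); y: labels in {-1, +1}."""
    n = len(y)
    Q = (y[:, None] * y[None, :]) * K         # Q_ts = y_t y_s K(x_t, x_s)
    alpha = np.zeros(n)
    G = -np.ones(n)                            # dual gradient Q alpha - e at alpha = 0

    for _ in range(max_iter):
        # I_up / I_low: indices whose y_t * alpha_t can still increase / decrease.
        up = ((alpha < C) & (y == 1)) | ((alpha > 0) & (y == -1))
        low = ((alpha < C) & (y == -1)) | ((alpha > 0) & (y == 1))

        yG = -y * G                            # violation scores
        i = np.argmax(np.where(up, yG, -np.inf))
        j = np.argmin(np.where(low, yG, np.inf))
        if yG[i] - yG[j] < eps:                # KKT conditions satisfied up to eps
            break

        # Solve the two-variable subproblem on (alpha_i, alpha_j) analytically.
        s = y[i] * y[j]
        eta = K[i, i] + K[j, j] - 2.0 * K[i, j]
        eta = max(eta, 1e-12)                  # numerical guard for non-PD kernels
        old_ai, old_aj = alpha[i], alpha[j]

        # Feasible interval for alpha_j along the line y_i d_i + y_j d_j = 0.
        if s < 0:
            L, H = max(0.0, old_aj - old_ai), min(C, C + old_aj - old_ai)
        else:
            L, H = max(0.0, old_ai + old_aj - C), min(C, old_ai + old_aj)

        # Unconstrained Newton step on alpha_j, clipped to [L, H].
        alpha[j] = np.clip(old_aj + y[j] * (y[i] * G[i] - y[j] * G[j]) / eta, L, H)
        alpha[i] = old_ai + s * (old_aj - alpha[j])

        # Maintain the gradient with two kernel columns only.
        G += Q[:, i] * (alpha[i] - old_ai) + Q[:, j] * (alpha[j] - old_aj)

    # Bias from any free (0 < alpha < C) support vector.
    free = (alpha > 1e-8) & (alpha < C - 1e-8)
    b = np.mean(y[free] - (K[free] * (alpha * y)).sum(axis=1)) if free.any() else 0.0
    return alpha, b
```

Because each iteration changes only two dual variables, the gradient can be maintained from two kernel columns; this is the structural property behind the shrinking and caching techniques the abstract mentions.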






Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Chen, P.-H., Fan, R.-E., Lin, C.-J. (2005). Training Support Vector Machines via SMO-Type Decomposition Methods. In: Jain, S., Simon, H.U., Tomita, E. (eds.) Algorithmic Learning Theory. ALT 2005. Lecture Notes in Computer Science (LNAI), vol. 3734. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11564089_6


  • DOI: https://doi.org/10.1007/11564089_6

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-29242-5

  • Online ISBN: 978-3-540-31696-1

  • eBook Packages: Computer Science (R0)
