Part of the book series: Industrial and Applied Mathematics ((INAMA))


Abstract

Training a support vector machine reduces to solving a convex quadratic programming problem. Because this optimization problem is convex, it has a unique solution characterized by the Karush–Kuhn–Tucker conditions, and it can therefore be solved very efficiently. This chapter discusses the formulation of the optimization problems that arise in the various forms of support vector machine algorithms.
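As a minimal sketch of the quadratic program the abstract refers to (not code from the chapter itself), the soft-margin SVM dual — maximize Σαᵢ − ½ αᵀQα subject to Σαᵢyᵢ = 0 and 0 ≤ αᵢ ≤ C, with Qᵢⱼ = yᵢyⱼ⟨xᵢ, xⱼ⟩ — can be handed to a general-purpose constrained solver. The toy dataset, the penalty C, and the support-vector tolerance below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Toy linearly separable dataset (hypothetical, for illustration only).
X = np.array([[2.0, 2.0], [3.0, 3.0], [2.5, 3.5],
              [-2.0, -2.0], [-3.0, -3.0], [-2.5, -3.5]])
y = np.array([1.0, 1.0, 1.0, -1.0, -1.0, -1.0])
n = len(y)
C = 10.0  # soft-margin penalty (assumed value)

# Label-weighted Gram matrix of the linear kernel: Q_ij = y_i y_j <x_i, x_j>.
Yx = y[:, None] * X
Q = Yx @ Yx.T

# Dual objective, negated so that SciPy's minimizer maximizes the SVM dual:
# minimize (1/2) a^T Q a - sum(a).
def objective(a):
    return 0.5 * a @ Q @ a - a.sum()

def grad(a):
    return Q @ a - np.ones(n)

constraints = {"type": "eq", "fun": lambda a: a @ y, "jac": lambda a: y}
bounds = [(0.0, C)] * n

res = minimize(objective, np.zeros(n), jac=grad, method="SLSQP",
               bounds=bounds, constraints=constraints)
alpha = res.x

# Recover the primal solution from the KKT conditions:
# w = sum_i alpha_i y_i x_i; b from any margin support vector (0 < alpha_i < C).
w = (alpha * y) @ X
sv = (alpha > 1e-4) & (alpha < C - 1e-4)
b = np.mean(y[sv] - X[sv] @ w)

pred = np.sign(X @ w + b)
```

Because the dual is a convex QP, any KKT point found by the solver is the global optimum; dedicated SVM solvers (e.g. SMO) exploit the same structure far more efficiently than a general nonlinear-programming routine.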



Author information


Correspondence to Kourosh Parand.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this chapter

Cite this chapter

Parand, K., Baharifard, F., Aghaei, A.A., Jani, M. (2023). Basics of SVM Method and Least Squares SVM. In: Rad, J.A., Parand, K., Chakraverty, S. (eds) Learning with Fractional Orthogonal Kernel Classifiers in Support Vector Machines. Industrial and Applied Mathematics. Springer, Singapore. https://doi.org/10.1007/978-981-19-6553-1_2
