Abstract
Training a support vector machine reduces to solving a convex quadratic programming problem. Because the problem is convex, the Karush–Kuhn–Tucker conditions are both necessary and sufficient for optimality, and the problem admits a unique solution that can be computed very efficiently. In this chapter, the formulation of the optimization problems that arise in the various forms of support vector machine algorithms is discussed.
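For the least-squares SVM variant covered in this chapter, the training problem collapses from a quadratic program to a single linear system of KKT equations (Suykens and Vandewalle 1999). The sketch below illustrates this, using only the standard library; the toy data, the linear kernel, and the regularization value gamma are illustrative assumptions, not values taken from the chapter.

```python
# Minimal least-squares SVM (LS-SVM) classifier: training amounts to
# solving the KKT linear system
#     [ 0        y^T       ] [b]       [0]
#     [ y   Omega + I/gamma] [alpha] = [1]
# with Omega_ij = y_i * y_j * K(x_i, x_j).  Data and gamma are
# hypothetical, chosen only to demonstrate the mechanics.

def linear_kernel(u, v):
    """Linear kernel K(u, v) = u . v."""
    return sum(a * b for a, b in zip(u, v))

def solve(A, rhs):
    """Solve A x = rhs by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [r] for row, r in zip(A, rhs)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def lssvm_train(X, y, gamma, kernel=linear_kernel):
    """Assemble and solve the LS-SVM KKT system; returns (b, alpha)."""
    n = len(X)
    A = [[0.0] * (n + 1) for _ in range(n + 1)]
    rhs = [0.0] + [1.0] * n
    for i in range(n):
        A[0][i + 1] = y[i]          # first row:    y^T alpha = 0
        A[i + 1][0] = y[i]          # first column: bias terms
        for j in range(n):
            A[i + 1][j + 1] = y[i] * y[j] * kernel(X[i], X[j])
        A[i + 1][i + 1] += 1.0 / gamma   # ridge term from the error slacks
    sol = solve(A, rhs)
    return sol[0], sol[1:]

def lssvm_predict(x, X, y, b, alpha, kernel=linear_kernel):
    """Decision function sign(sum_i alpha_i y_i K(x, x_i) + b)."""
    s = sum(a * yi * kernel(x, xi) for a, yi, xi in zip(alpha, y, X)) + b
    return 1 if s >= 0 else -1

# Linearly separable toy problem (hypothetical data).
X = [(2.0, 2.0), (2.0, 3.0), (0.0, 0.0), (-1.0, 0.0)]
y = [1, 1, -1, -1]
b, alpha = lssvm_train(X, y, gamma=100.0)
preds = [lssvm_predict(x, X, y, b, alpha) for x in X]
```

Unlike the standard SVM dual, every training point here receives a nonzero multiplier alpha_i, since the inequality constraints are replaced by equalities; sparsity is traded for the simplicity of a direct linear solve.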
References
Cheng, K., Lu, Z., Wei, Y., Shi, Y., Zhou, Y.: Mixed kernel function support vector regression for global sensitivity analysis. Mech. Syst. Signal Process. 96, 201–214 (2017)
Cortes, C., Vapnik, V.: Support-vector networks. Mach. Learn. 20, 273–297 (1995)
Drucker, H., Burges, C.J.C., Kaufman, L., Smola, A.J., Vapnik, V.: Support vector regression machines. Adv. Neural Inf. Process. Syst. 9, 155–161 (1997)
Frank, M., Wolfe, P.: An algorithm for quadratic programming. Nav. Res. Logist. Q. 3, 95–110 (1956)
Genton, M.G.: Classes of kernels for machine learning: a statistics perspective. J. Mach. Learn. Res. 2, 299–312 (2001)
Karush, W.: Minima of functions of several variables with inequalities as side constraints. M.Sc. Dissertation. Department of Mathematics, University of Chicago (1939)
Kuhn, H.W., Tucker, A.W.: Nonlinear programming. In: Proceedings of the Second Berkeley Symposium on Mathematical Statistics and Probability. University of California Press, Berkeley (1951)
Mercer, J.: Functions of positive and negative type, and their connection with the theory of integral equations. Philos. Trans. R. Soc. Lond. Ser. A 209, 415–446 (1909)
Murty, K.G., Yu, F.T.: Linear Complementarity, Linear and Nonlinear Programming. Heldermann, Berlin (1988)
Shawe-Taylor, J., Cristianini, N.: Kernel Methods for Pattern Analysis. Cambridge University Press, Cambridge (2004)
Suykens, J.A.K., Vandewalle, J.: Least squares support vector machine classifiers. Neural Process. Lett. 9, 293–300 (1999)
Suykens, J.A.K., Gestel, T.V., Brabanter, J.D., Moor, B.D., Vandewalle, J.: Least Squares Support Vector Machines. World Scientific, Singapore (2002)
Vapnik, V., Chervonenkis, A.: A note on one class of perceptrons. Autom. Remote Control 44, 103–109 (1964)
Vapnik, V.: The Nature of Statistical Learning Theory. Springer, Berlin (2000)
Zanaty, E.A., Afifi, A.: Support vector machines (SVMs) with universal kernels. Appl. Artif. Intell. 25, 575–589 (2011)
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this chapter
Cite this chapter
Parand, K., Baharifard, F., Aghaei, A.A., Jani, M. (2023). Basics of SVM Method and Least Squares SVM. In: Rad, J.A., Parand, K., Chakraverty, S. (eds) Learning with Fractional Orthogonal Kernel Classifiers in Support Vector Machines. Industrial and Applied Mathematics. Springer, Singapore. https://doi.org/10.1007/978-981-19-6553-1_2
Print ISBN: 978-981-19-6552-4
Online ISBN: 978-981-19-6553-1