Abstract
In this paper, we propose two new smooth support vector machines for \(\varepsilon \)-insensitive regression. Based on these two formulations, we construct two systems of smooth equations from two novel families of smoothing functions, from which we seek the solution to \(\varepsilon \)-support vector regression (\(\varepsilon \)-SVR). More specifically, using the proposed smoothing functions, we employ the smoothing Newton method to solve the systems of smooth equations. The algorithm is shown to be globally and quadratically convergent without any additional conditions. Numerical comparisons among different values of the smoothing parameter are also reported.
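To make the idea concrete, the sketch below shows one classical smoothing of this kind: the neural-network smoothing of the plus function \(\max(x,0)\) from the SSVM literature, applied to build a smooth surrogate of the \(\varepsilon \)-insensitive loss. This particular smoothing family and the function names are illustrative assumptions, not the two new families proposed in the paper.

```python
import numpy as np

def eps_insensitive(r, eps):
    """Nonsmooth epsilon-insensitive loss: max(|r| - eps, 0)."""
    return np.maximum(np.abs(r) - eps, 0.0)

def smooth_plus(x, alpha):
    """Neural-network smoothing of max(x, 0):
    x + (1/alpha) * log(1 + exp(-alpha * x)).
    logaddexp(0, -alpha*x) computes log(1 + exp(-alpha*x)) stably."""
    return x + np.logaddexp(0.0, -alpha * x) / alpha

def smooth_eps_loss(r, eps, alpha):
    """Smooth surrogate of the eps-insensitive loss.

    Since at most one of (r - eps) and (-r - eps) is positive,
    summing the two smoothed plus terms approximates max(|r| - eps, 0)
    while remaining differentiable in r; the approximation tightens
    as the smoothing parameter alpha grows.
    """
    return smooth_plus(r - eps, alpha) + smooth_plus(-r - eps, alpha)
```

Because the surrogate is smooth, the optimality conditions of the regularized regression problem become a system of smooth equations, which is what makes a (smoothing) Newton iteration applicable.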
Additional information
J.-S. Chen's work is supported by the Ministry of Science and Technology, Taiwan.
Cite this article
Gu, W., Chen, WP., Ko, CH. et al. Two smooth support vector machines for \(\varepsilon \)-insensitive regression. Comput Optim Appl 70, 171–199 (2018). https://doi.org/10.1007/s10589-017-9975-9