Two smooth support vector machines for \(\varepsilon \)-insensitive regression

Computational Optimization and Applications

Abstract

In this paper, we propose two new smooth support vector machines for \(\varepsilon \)-insensitive regression. For these two smooth support vector machines, we construct two systems of smooth equations based on two novel families of smoothing functions, from which we obtain the solution to \(\varepsilon \)-support vector regression (\(\varepsilon \)-SVR). More specifically, using the proposed smoothing functions, we solve the systems of smooth equations by a smoothing Newton method. The algorithm is shown to be globally and quadratically convergent without any additional conditions. Numerical comparisons among different parameter values are also reported.
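
To make the construction concrete, the sketch below is a minimal illustration under stated assumptions, not the method of the paper: it replaces the plus function \(\max(x,0)\) in a squared \(\varepsilon \)-insensitive loss with a softplus-type smoothing and fits a linear \(\varepsilon \)-SVR by plain gradient descent on the resulting unconstrained smooth objective, whereas the paper constructs two particular families of smoothing functions and solves the associated systems of smooth equations by a smoothing Newton method. The names and parameters below (`smooth_plus`, `mu`, `C`, `eps`, the step size) are illustrative choices, not quantities taken from the paper.

```python
import numpy as np


def smooth_plus(x, mu):
    """Softplus-type smoothing of the plus function max(x, 0).

    p(x, mu) = x + mu * log(1 + exp(-x / mu)) tends to max(x, 0) as mu -> 0.
    """
    return x + mu * np.logaddexp(0.0, -x / mu)


def sigmoid(z):
    # Numerically stable logistic function; it equals the derivative of
    # smooth_plus(x, mu) with respect to x, evaluated at z = x / mu.
    return 0.5 * (1.0 + np.tanh(0.5 * z))


def fit_linear_eps_svr(X, y, C=1.0, eps=0.1, mu=1e-2, lr=1e-3, iters=5000):
    """Fit a linear eps-SVR by gradient descent on a smoothed objective.

    Objective: 0.5 * ||w||^2 + C * sum_i [ p(r_i - eps)^2 + p(-r_i - eps)^2 ],
    with residuals r_i = y_i - w.x_i - b, i.e. a smooth surrogate of the
    squared eps-insensitive loss.
    """
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(iters):
        r = y - X @ w - b
        # Derivative of the smoothed per-sample loss with respect to r.
        dloss_dr = (2.0 * smooth_plus(r - eps, mu) * sigmoid((r - eps) / mu)
                    - 2.0 * smooth_plus(-r - eps, mu) * sigmoid((-r - eps) / mu))
        grad_w = w - C * (X.T @ dloss_dr)   # chain rule: dr_i/dw = -x_i
        grad_b = -C * np.sum(dloss_dr)      # chain rule: dr_i/db = -1
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=(200, 3))
    y = X @ np.array([1.5, -2.0, 0.5]) + 0.3 + 0.05 * rng.standard_normal(200)
    w, b = fit_linear_eps_svr(X, y)
    print("estimated w:", np.round(w, 3), "estimated b:", round(b, 3))
```

In the paper's algorithm the smoothing parameter is treated as a variable and driven to zero within a Newton-type iteration, which is what yields global and quadratic convergence; in the sketch above `mu` is simply held fixed.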

Author information

Corresponding author

Correspondence to Jein-Shan Chen.

Additional information

J.-S. Chen's work is supported by the Ministry of Science and Technology, Taiwan.

About this article

Cite this article

Gu, W., Chen, WP., Ko, CH. et al. Two smooth support vector machines for \(\varepsilon \)-insensitive regression. Comput Optim Appl 70, 171–199 (2018). https://doi.org/10.1007/s10589-017-9975-9

