
SVM Regression Parameters Optimization Using Parallel Global Search Algorithm

  • Konstantin Barkalov
  • Alexey Polovinkin
  • Iosif Meyerov
  • Sergey Sidorov
  • Nikolai Zolotykh
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7979)

Abstract

The problem of selecting optimal parameters for regression construction with the Support Vector Machine method is stated. The cross-validation error function is taken as the selection criterion. The resulting bound-constrained nonlinear optimization problem is solved with the parallel global search algorithm of R. Strongin, extended by a number of modifications. The efficiency of the proposed approach is demonstrated on model problems, and the feasibility of running the algorithm on large-scale cluster systems is evaluated. A linear speed-up of the combined parallel global search algorithm is demonstrated.
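
The criterion described above is the cross-validation error of the SVR model viewed as a function of its hyperparameters over a box of admissible values. Below is a minimal sketch (not the authors' implementation) of that objective in Python with scikit-learn; the log-scale parameterization, the bounds, and the crude random search that stands in for Strongin's parallel global search algorithm are purely illustrative.

    import numpy as np
    from sklearn.svm import SVR
    from sklearn.model_selection import cross_val_score

    def cv_error(log_params, X, y, k=5):
        # k-fold cross-validation MSE of an RBF SVR: the bound-constrained
        # objective to be minimized over (C, epsilon, gamma).
        C, eps, gamma = np.exp(log_params)  # search in log scale
        model = SVR(kernel="rbf", C=C, epsilon=eps, gamma=gamma)
        scores = cross_val_score(model, X, y, cv=k,
                                 scoring="neg_mean_squared_error")
        return -scores.mean()

    # Toy data; in the paper this role is played by the model problems.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 2))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

    # Illustrative box constraints (assumed, not taken from the paper).
    bounds = np.array([[-5.0, 10.0],   # log C
                       [-8.0,  0.0],   # log epsilon
                       [-10.0, 3.0]])  # log gamma

    # Any global optimizer over the box can stand in here; the paper uses
    # Strongin's parallel global search algorithm instead of random search.
    candidates = rng.uniform(bounds[:, 0], bounds[:, 1], size=(50, 3))
    best = min(candidates, key=lambda p: cv_error(p, X, y))
    print("best (C, epsilon, gamma):", np.exp(best))

Because each evaluation of cv_error requires training k SVR models, the objective is expensive and multi-extremal, which is what motivates a parallel global search method rather than local or exhaustive search.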

Keywords

Machine learning, global optimization, support vector machine, global search algorithm


References

  1. Hastie, T., Tibshirani, R., Friedman, J.: The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer, Heidelberg (2009)
  2. Ito, K., Nakano, R.: Optimizing Support Vector Regression Hyperparameters Based on Cross-Validation. In: Proceedings of the International Joint Conference on Neural Networks, vol. 3, pp. 2077–2083 (2003)
  3. Ren, Y., Bai, G.: Determination of Optimal SVM Parameters by Using GA/PSO. Journal of Computers 5(8), 1160–1168 (2010)
  4. Strongin, R.G.: Algorithms for multi-extremal mathematical programming problems employing the set of joint space-filling curves. Journal of Global Optimization 2, 357–378 (1992)
  5. Strongin, R.G., Sergeyev, Y.D.: Global Optimization with Non-Convex Constraints. Sequential and Parallel Algorithms. Kluwer Academic Publishers, Dordrecht (2000)
  6. Gergel, V.P., Strongin, R.G.: Parallel computing for globally optimal decision making on cluster systems. Future Generation Computer Systems 21(5), 673–678 (2000)
  7. Barkalov, K., Ryabov, V., Sidorov, S.: Parallel Scalable Algorithms with Mixed Local-Global Strategy for Global Optimization Problems. In: Hsu, C.-H., Malyshkin, V. (eds.) MTPP 2010. LNCS, vol. 6083, pp. 232–240. Springer, Heidelberg (2010)
  8. Jin, R., Chen, W., Simpson, T.W.: Comparative Studies of Meta-modeling Techniques under Multiple Modeling Criteria. Structural and Multidisciplinary Optimization 23(1), 1–13 (2001)
  9. Wang, Y., Liu, Y., Ye, N., Yao, G.: The Parameters Selection for SVM Based on Improved Chaos Optimization Algorithm. In: Zhang, J. (ed.) ICAIC 2011, Part V. CCIS, vol. 228, pp. 376–383. Springer, Heidelberg (2011)
  10. Zhang, H., He, Y.: Comparative study of chaotic neural networks with different models of chaotic noise. In: Wang, L., Chen, K., Ong, Y.S. (eds.) ICNC 2005. LNCS, vol. 3610, pp. 273–282. Springer, Heidelberg (2005)
  11. Fröhlich, H., Zell, A.: Efficient parameter selection for support vector machines in classification and regression via model-based global optimization. In: Proceedings of the IEEE International Joint Conference on Neural Networks (IJCNN 2005), vol. 3, pp. 1431–1436 (2005)
  12. Jones, D., Schonlau, M., Welch, W.: Efficient global optimization of expensive black-box functions. Journal of Global Optimization 13, 455–492 (1998)
  13. Momma, M., Bennett, K.P.: A pattern search method for model selection of support vector regression. In: Proceedings of the SIAM Conference on Data Mining, pp. 261–274. SIAM, Philadelphia (2002)
  14. Vapnik, V.: The Nature of Statistical Learning Theory. Springer, New York (1996)
  15. Kearns, M.: A bound on the error of cross validation using the approximation and estimation rates, with consequences for the training-test split. In: Advances in Neural Information Processing Systems, vol. 8, pp. 183–189. MIT Press (1996)
  16. Ng, A.Y.: Preventing overfitting of cross-validation data. In: Proceedings of the 14th International Conference on Machine Learning, pp. 245–253. Morgan Kaufmann (1997)
  17. Grishagin, V.A., Sergeyev, Y.D., Strongin, R.G.: Parallel characteristical global optimization algorithms. Journal of Global Optimization 10(2), 185–206 (1997)
  18. Sidorov, S.V.: Two-level parallel index algorithm for global optimization. Vestnik of Lobachevsky State University of Nizhni Novgorod 5(2), 208–213 (2012) (in Russian)

Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Konstantin Barkalov (1)
  • Alexey Polovinkin (1)
  • Iosif Meyerov (1)
  • Sergey Sidorov (1)
  • Nikolai Zolotykh (1)
  1. N.I. Lobachevsky State University of Nizhni Novgorod, Nizhni Novgorod, Russia