A Geometric Viewpoint of the Selection of the Regularization Parameter in Some Support Vector Machines

  • Nandyala Hemachandra
  • Puja Sahu
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9468)


The regularization parameter of support vector machines is intended to improve their generalization performance. Since the feasible region of a binary-class support vector machine with a finite-dimensional feature space is a polytope, we note that classifiers at vertices of this unbounded polytope correspond to certain ranges of the regularization parameter. This reduces the search for a suitable regularization parameter to a search over the finite set of vertices of this polytope. We propose an algorithm that identifies the neighbouring vertices of a given vertex and thereby identifies the classifiers corresponding to the set of vertices of this polytope. A classifier can then be chosen from among them based on a suitable test error criterion. We illustrate our results with an example demonstrating that this path can be complicated: a portion of the path is sandwiched between two finite intervals of the path, each generated by a separate set of vertices and edges.
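The geometric observation above (as the regularization parameter varies, the optimal classifier moves through a finite set of polytope vertices, each optimal over an interval of the parameter) can be sketched with a toy parametric linear program. The polytope, the two objective directions standing in for loss and regularizer, and the parameter values below are illustrative assumptions, not the paper's actual SVM formulation:

```python
# Toy sketch: over a polytope {x : A x <= b}, minimize loss'x + lam * reg'x.
# The optimal vertex is piecewise constant in lam, so ranges of the
# regularization parameter map to vertices. All data here are made up.
import itertools
import numpy as np

# A small bounded polytope in R^2 (illustrative constraints).
A = np.array([[-1.0, 0.0],   # x1 >= 0
              [0.0, -1.0],   # x2 >= 0
              [1.0, 1.0],    # x1 + x2 <= 4
              [1.0, 0.0]])   # x1 <= 3
b = np.array([0.0, 0.0, 4.0, 3.0])

def vertices(A, b, tol=1e-9):
    """Brute-force vertex enumeration: intersect each pair of constraint
    hyperplanes and keep the intersection points that are feasible."""
    V = []
    for i, j in itertools.combinations(range(len(b)), 2):
        M = A[[i, j]]
        if abs(np.linalg.det(M)) < tol:
            continue  # parallel constraints: no unique intersection point
        x = np.linalg.solve(M, b[[i, j]])
        if np.all(A @ x <= b + tol):
            V.append(x)
    return np.array(V)

V = vertices(A, b)  # the finite set of candidate "classifiers"

# Two linear objective directions standing in for training loss and
# regularization (illustrative, not an SVM objective).
loss = np.array([-2.0, -1.0])
reg = np.array([1.0, 1.0])

# As lam (playing the role of 1/C) grows, the argmin vertex changes only
# at finitely many breakpoints; between breakpoints it stays fixed.
for lam in [0.5, 1.5, 3.0]:
    scores = V @ (loss + lam * reg)
    print(lam, V[np.argmin(scores)])
```

Brute-force enumeration over constraint pairs is exponential in general; the paper's point is precisely that one can instead walk from a vertex to its neighbours along edges, visiting only the vertices that actually appear on the path.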


Support vector machines · Regularization path · Polytopes · Neighbouring vertices · Prediction error · Parameter tuning · Linear programming



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Indian Institute of Technology Bombay, Mumbai, India
