Multi-objective Parameters Selection for SVM Classification Using NSGA-II

  • Li Xu
  • Chunping Li
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4065)


Selecting proper parameters is an important issue in extending the classification ability of the Support Vector Machine (SVM) and in making SVMs practically useful. The Genetic Algorithm (GA) has been widely applied to parameter selection for SVM classification because of its ability to quickly discover good solutions to complex search and optimization problems. However, traditional GA approaches in this field rely on a single generalization error bound as the fitness function. Since several generalization error bounds have been developed, picking a single criterion as the fitness function is both difficult and insufficient. Motivated by multi-objective optimization, this paper introduces an efficient parameter selection method for SVM classification based on the multi-objective evolutionary algorithm NSGA-II, together with an adaptive mutation rate for NSGA-II. Experimental results show that our method outperforms single-objective approaches, especially in the case of tiny training sets with large testing sets.
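The core mechanism the abstract describes can be sketched in miniature: treat each candidate parameter pair (C, gamma) as an individual, evaluate several error bounds as separate objectives, and keep the nondominated (Pareto) set rather than a single best point. The sketch below is a minimal, self-contained illustration; `bound_a` and `bound_b` are hypothetical stand-ins for two generalization error bounds (in the paper, each objective would come from evaluating a bound on a trained SVM), and `adaptive_mutation_rate` shows one simple annealing scheme, not the paper's exact adaptation rule.

```python
import random

# Hypothetical stand-ins for two generalization error bounds evaluated
# at a parameter point (C, gamma). In the actual method, each would be
# computed from an SVM trained with those parameters.
def bound_a(c, gamma):
    return (c - 1.0) ** 2 + gamma ** 2

def bound_b(c, gamma):
    return c ** 2 + (gamma - 1.0) ** 2

def dominates(f, g):
    """f dominates g if f is no worse in every objective and strictly
    better in at least one (minimization)."""
    return all(x <= y for x, y in zip(f, g)) and any(x < y for x, y in zip(f, g))

def nondominated_front(points):
    """Return the first (Pareto) front of a list of objective vectors,
    as used in NSGA-II's nondominated sorting step."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

def adaptive_mutation_rate(gen, max_gen, hi=0.2, lo=0.01):
    """One simple adaptive scheme: linearly anneal the mutation rate
    from hi to lo over the run (illustrative only)."""
    return hi - (hi - lo) * gen / max_gen

# Random initial population of (C, gamma) candidates.
random.seed(0)
pop = [(random.uniform(0, 2), random.uniform(0, 2)) for _ in range(50)]
objs = [(bound_a(c, g), bound_b(c, g)) for c, g in pop]
front = nondominated_front(objs)
print(len(front), "candidates on the first Pareto front")
```

A full NSGA-II adds further fronts, crowding-distance ranking, and elitist selection on top of this sorting step; the point here is only that multiple bounds are kept as separate objectives instead of being collapsed into one fitness value.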


Keywords: Support Vector Machine, Support Vector, Generalization Error, Support Vector Machine Classification, Nondominated Sorting



References

  1. Chapelle, O., Vapnik, V., Bousquet, O., Mukherjee, S.: Choosing multiple parameters for support vector machines. Machine Learning 46(1), 131–159 (2002)
  2. Chapelle, O., Vapnik, V.: Model Selection for Support Vector Machines. In: Advances in Neural Information Processing Systems, vol. 12. MIT Press, Cambridge (2000)
  3. Vapnik, V.: The Nature of Statistical Learning Theory. Springer, Berlin (1995)
  4. Vapnik, V.: Statistical Learning Theory. John Wiley & Sons, New York (1998)
  5. Vapnik, V., Chapelle, O.: Bounds on Error Expectation for Support Vector Machines. Neural Computation 12(9), 2013–2036 (2000)
  6. Lee, J.H., Lin, C.J.: Automatic Model Selection for Support Vector Machines (2000)
  7. Gunn, S.R.: Support Vector Machines for Classification and Regression. Technical Report, Image Speech and Intelligent Systems Research Group, University of Southampton (1997)
  8. Deb, K., Pratap, A., Agarwal, S., Meyarivan, T.: A Fast and Elitist Multiobjective Genetic Algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation 6, 182–197 (2002)
  9. Deb, K.: Multiobjective Optimization Using Evolutionary Algorithms. Wiley, Chichester (2001)
  10. Zhao, X.-M., Huang, D.-S., Cheung, Y.-M., Wang, H.-Q., Huang, X.: A Novel Hybrid GA/SVM System for Protein Sequences Classification. In: Yang, Z.R., Yin, H., Everson, R.M. (eds.) IDEAL 2004. LNCS, vol. 3177, pp. 11–16. Springer, Heidelberg (2004)
  11. Zheng, C.H., Jiao, L.C.: Automatic parameters selection for SVM based on GA. In: Intelligent Control and Automation, vol. 2, pp. 1869–1872. Springer, Heidelberg (2004)
  12. Joachims, T.: Estimating the generalization performance of an SVM efficiently. In: Proceedings of the International Conference on Machine Learning. Morgan Kaufmann, San Francisco
  13. Joachims, T.: Making Large-Scale SVM Learning Practical. In: Advances in Kernel Methods – Support Vector Learning, ch. 11. MIT Press, Cambridge (1999)
  14. Li, H., Wang, S.-Y., Qi, F.-H.: SVM Model Selection with the VC Bound. In: Zhang, J., He, J.-H., Fu, Y. (eds.) CIS 2004. LNCS, vol. 3314, pp. 1067–1071. Springer, Heidelberg (2004)
  15. Ohn, S.-Y., Nguyen, H.-N., Kim, D.S., Park, J.-S.: Determining optimal decision model for support vector machine by genetic algorithm. In: Zhang, J., He, J.-H., Fu, Y. (eds.) CIS 2004. LNCS, vol. 3314, pp. 895–902. Springer, Heidelberg (2004)
  16. Khoa, D.T.: Elitist Non-Dominated Sorting GA-II (NSGA-II) as a Parameterless Multi-Objective Genetic Algorithm. In: SoutheastCon Proceedings, pp. 359–367. IEEE, Los Alamitos (2005)
  17. Luntz, A., Brailovsky, V.: On estimation of characters obtained in statistical procedure of recognition. Technicheskaya Kibernetica 3 (1969) (in Russian)
  18. Jaakkola, T.S., Haussler, D.: Probabilistic kernel regression models. In: Proceedings of the 1999 Conference on AI and Statistics (1999)
  19. Opper, M., Winther, O.: Gaussian processes and SVM: mean field and leave-one-out. In: Advances in Large Margin Classifiers. MIT Press, Cambridge (2000)
  20. De Jong, K.A.: An Analysis of the Behavior of a Class of Genetic Adaptive Systems. University of Michigan, Ann Arbor (1975)
  21. Coello Coello, C.A.: A short tutorial on evolutionary multiobjective optimization. In: Zitzler, E., Deb, K., Thiele, L., Coello Coello, C.A., Corne, D.W. (eds.) EMO 2001. LNCS, vol. 1993, p. 21. Springer, Heidelberg (2001)

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Li Xu¹
  • Chunping Li¹

  1. School of Software, Tsinghua University, China
