Multi-objective Parameters Selection for SVM Classification Using NSGA-II
Selecting proper parameters is an important issue in extending the classification ability of the Support Vector Machine (SVM), and is what makes SVM practically useful. The Genetic Algorithm (GA) has been widely applied to the problem of parameter selection for SVM classification due to its ability to quickly discover good solutions for complex search and optimization problems. However, traditional GA approaches in this field rely on a single generalization error bound as the fitness function for selecting parameters. Since several generalization error bounds have been developed, picking and using a single criterion as the fitness function seems intractable and insufficient. Motivated by multi-objective optimization, this paper introduces an efficient method for parameter selection for SVM classification based on the multi-objective evolutionary algorithm NSGA-II. We also introduce an adaptive mutation rate for NSGA-II. Experimental results show that our method outperforms single-objective approaches, especially in the case of tiny training sets with large testing sets.
Keywords: Support Vector Machine · Support Vector · Generalization Error · Support Vector Machine Classification · Nondominated Sorting
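The core of the NSGA-II approach summarized above is ranking candidate parameter settings by Pareto dominance over several error criteria instead of a single fitness value. As a minimal illustration, the sketch below applies NSGA-II-style fast non-dominated sorting to hypothetical (C, gamma) SVM candidates scored by two toy error estimates; the objective values are illustrative placeholders, not the generalization error bounds used in the paper, and a full NSGA-II would add crowding-distance selection, crossover, and the adaptive mutation rate the paper proposes.

```python
# Sketch: NSGA-II-style non-dominated sorting over SVM parameter
# candidates. Objective values here are hypothetical, not real bounds.

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(objectives):
    """Return fronts as lists of indices; front 0 is the Pareto front."""
    n = len(objectives)
    dominated_by = [[] for _ in range(n)]   # solutions each i dominates
    domination_count = [0] * n              # how many solutions dominate i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if dominates(objectives[i], objectives[j]):
                dominated_by[i].append(j)
            elif dominates(objectives[j], objectives[i]):
                domination_count[i] += 1
        if domination_count[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in dominated_by[i]:
                domination_count[j] -= 1
                if domination_count[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
        k += 1
    return fronts[:-1]

# Candidate (C, gamma) pairs, each scored by two toy error estimates.
params = [(1.0, 0.1), (10.0, 0.01), (100.0, 0.001), (0.5, 0.2)]
errors = [(0.20, 0.30), (0.15, 0.35), (0.10, 0.40), (0.22, 0.32)]
fronts = non_dominated_sort(errors)
print(fronts[0])  # → [0, 1, 2]
```

The fourth candidate is dominated by the first (worse on both objectives), so it falls into the second front; the other three trade off the two error estimates and together form the Pareto front from which a final parameter setting would be chosen.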