Data Mining, pp. 277–297

Evolutionary Optimization of Least-Squares Support Vector Machines

  • Arjan Gijsberts
  • Giorgio Metta
  • Léon Rothkrantz
Part of the Annals of Information Systems book series (AOIS, volume 8)


The performance of kernel machines depends to a large extent on the chosen kernel function and its hyperparameters. Selecting these is traditionally done using intuition or a costly “trial-and-error” approach, which typically prevents these methods from being used to their fullest extent. Therefore, two automated approaches are presented for selecting a suitable kernel function and optimal hyperparameters for the least-squares support vector machine. The first approach uses evolution strategies, genetic algorithms, and genetic algorithms with floating-point representation to find optimal hyperparameters in a timely manner. On benchmark data sets, the standard genetic algorithm outperforms the two other evolutionary algorithms and is shown to be more efficient than grid search. The second approach aims to improve the generalization capacity of the machine by evolving combined kernel functions using genetic programming. Empirical studies show that this model indeed increases the generalization performance of the machine, although this improvement comes at a high computational cost. This suggests that the approach may be justified primarily in applications where prediction errors can have severe consequences, such as in medical settings.
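To make the first approach concrete, the following is a minimal illustrative sketch (not the authors' implementation) of evolutionary hyperparameter selection for a least-squares SVM: a simple (1+4)-style evolution strategy searches the log-space of the regularization parameter gamma and the RBF kernel width sigma, scoring each candidate by validation error on a toy regression task. All names, the toy data set, and the fixed mutation step size are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(X1, X2, sigma):
    # Gaussian RBF kernel matrix between two point sets
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma, sigma):
    # LS-SVM regression reduces to one linear system:
    # [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    n = len(X)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                      # bias b, dual weights alpha

def lssvm_predict(Xtr, b, alpha, sigma, Xte):
    return rbf_kernel(Xte, Xtr, sigma) @ alpha + b

def val_mse(log_params, Xtr, ytr, Xva, yva):
    # Fitness: validation MSE of an LS-SVM trained with these hyperparameters
    gamma, sigma = np.exp(log_params)
    b, alpha = lssvm_fit(Xtr, ytr, gamma, sigma)
    return float(np.mean((lssvm_predict(Xtr, b, alpha, sigma, Xva) - yva) ** 2))

# Toy data: noisy sinc function, split into training and validation parts
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(60)
Xtr, ytr, Xva, yva = X[:40], y[:40], X[40:], y[40:]

# (1+4) evolution strategy over (log gamma, log sigma)
parent = np.zeros(2)                            # start at gamma = sigma = 1
best = val_mse(parent, Xtr, ytr, Xva, yva)
for _ in range(30):
    children = parent + 0.5 * rng.standard_normal((4, 2))
    for c in children:
        f = val_mse(c, Xtr, ytr, Xva, yva)
        if f < best:                            # plus-selection: keep the best ever
            best, parent = f, c

print(f"best validation MSE: {best:.4f}")
```

Unlike grid search, which evaluates a fixed lattice of hyperparameter values, the mutation-and-selection loop above concentrates trials around promising regions, which is the efficiency argument the chapter makes for the evolutionary approach.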


Keywords: Genetic Algorithm, Support Vector Machine, Genetic Programming, Support Vector Regression, Grid Search





Acknowledgments

This study has been funded in part by EU projects RobotCub (IST-004370) and CONTACT (NEST-5010). The authors gratefully acknowledge Francesco Orabona for his constructive comments and Francesco Nori and Lorenzo Natale for supplying the Reaching data sets.



Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  1. Italian Institute of Technology, Genoa, Italy
  2. Delft University of Technology, Delft, The Netherlands
  3. University of Genoa, Viale F. Causa 13, Genoa, Italy
  4. Netherlands Defence Academy, Den Helder, The Netherlands
