Analysis of Algebraic Expressions Derived from Genetic Multivariate Polynomials and Support Vector Machines: A Case Study

  • Ángel Kuri-Morales
  • Iván Mejía-Guevara
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4225)


We discuss how explicit algebraic expressions modeling a complex phenomenon from an adequate set of data can be derived by applying Genetic Multivariate Polynomials (GMPs), on the one hand, and Support Vector Machines (SVMs), on the other. GMPs yield a polynomial expression in a natural way, whereas SVMs rely on a polynomial kernel to derive a similar one. For any particular problem, a GMP expression requires an evolutionarily determined sample of monomials, while the SVM approach implicitly involves a large number of them. We report experiments comparing the modeling characteristics and accuracy obtained from both methods.
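As a minimal sketch of the SVM side of this comparison (not the authors' setup: the library, toy data, and parameters below are illustrative assumptions), the Python fragment that follows fits a support vector regressor with a polynomial kernel. Because the kernel (gamma * x·z + coef0)^degree expands into every monomial of total degree at most degree, all such monomials enter the SVM model implicitly, whereas a GMP retains only an evolutionarily selected subset of them.

```python
# Minimal sketch (illustrative only): an SVR with a polynomial kernel
# implicitly spans *all* monomials up to the kernel degree, in contrast
# with a GMP, which keeps only an evolutionarily selected subset.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))          # toy 2-variable sample
y = 1.0 + 2.0 * X[:, 0] - 3.0 * X[:, 0] * X[:, 1]  # target: a sparse polynomial

# Polynomial kernel K(x, z) = (gamma * x.z + coef0)^degree; with degree=3
# in 2 variables, the implicit feature space holds all 10 monomials of
# total degree <= 3, even though the target above uses only 3 of them.
model = SVR(kernel="poly", degree=3, gamma=1.0, coef0=1.0, C=10.0, epsilon=0.01)
model.fit(X, y)

print("support vectors used:", model.support_vectors_.shape[0])
print("sample prediction:", model.predict(X[:1]))
```

With n input variables and kernel degree d, the implicit feature space contains C(n+d, d) monomials, so the count grows combinatorially with dimension and degree; this disparity between the implicit SVM expansion and the sparse GMP expression is what the experiments examine.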


Keywords: Genetic Algorithm, Support Vector Machine, Support Vector Regression, Algebraic Expression, Polynomial Kernel





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Ángel Kuri-Morales (1)
  • Iván Mejía-Guevara (2)
  1. Departamento de Computación, Instituto Tecnológico Autónomo de México, México D.F.
  2. Posgrado en Ciencia e Ingeniería de la Computación, Universidad Nacional Autónoma de México, IIMAS, México D.F.
