Evolutionary Intelligence, Volume 5, Issue 3, pp 171–187

Kernel representations for evolving continuous functions

  • Tobias Glasmachers
  • Jan Koutník
  • Jürgen Schmidhuber
Special Issue

DOI: 10.1007/s12065-012-0070-y

Cite this article as:
Glasmachers, T., Koutník, J. & Schmidhuber, J. Evol. Intel. (2012) 5: 171. doi:10.1007/s12065-012-0070-y


To parameterize continuous functions for evolutionary learning, we use kernel expansions in nested sequences of function spaces of growing complexity. This approach is particularly powerful when dealing with non-convex constraints and discontinuous objective functions. Kernel methods offer a number of properties beneficial for parameterizing continuous functions, such as smoothness and locality, which make them attractive as a basis for mutation operators. Beyond such practical considerations, kernel methods make heavy use of inner products in function space and offer a well-established regularization framework. We show how evolutionary computation can profit from these properties. Searching function spaces of iteratively increasing complexity allows the solution to evolve from a simple first guess into a complex, highly refined function. At transition points, where the evolution strategy is confronted with the next level of functional complexity, the kernel framework can be used to project the search distribution into the extended search space. The feasibility of the method is demonstrated on challenging trajectory-planning problems in which redundant robots have to avoid obstacles.
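The kernel expansion, coefficient mutation, and transition to a refined basis described in the abstract can be sketched as follows. This is a minimal illustration assuming a Gaussian kernel on [0, 1] and a least-squares projection onto the refined basis; all function names are hypothetical, and the projection here is a simplified stand-in for the inner-product-based projection used in the paper, not the authors' implementation:

```python
import numpy as np

def gaussian_kernel(t, c, sigma=0.2):
    # Gaussian kernel: gives the expansion smoothness and locality.
    return np.exp(-((t - c) ** 2) / (2.0 * sigma ** 2))

def evaluate(weights, centers, t):
    # Kernel expansion f(t) = sum_i w_i * k(t, c_i).
    return sum(w * gaussian_kernel(t, c) for w, c in zip(weights, centers))

def mutate(weights, step=0.05, rng=None):
    # Evolution-strategy-style Gaussian mutation of the coefficients.
    rng = np.random.default_rng() if rng is None else rng
    return weights + rng.normal(0.0, step, size=len(weights))

def refine(centers):
    # Next complexity level: insert a kernel center between each adjacent pair.
    mids = (centers[:-1] + centers[1:]) / 2.0
    return np.sort(np.concatenate([centers, mids]))

def project(weights, centers, new_centers, grid=None):
    # Approximate the old function in the refined basis via least squares,
    # so the search continues from (nearly) the same function after the
    # transition to the larger search space.
    if grid is None:
        grid = np.linspace(0.0, 1.0, 200)
    target = np.array([evaluate(weights, centers, t) for t in grid])
    K = np.array([[gaussian_kernel(t, c) for c in new_centers] for t in grid])
    new_weights, *_ = np.linalg.lstsq(K, target, rcond=None)
    return new_weights

# Start simple: a 4-kernel expansion, then refine to 7 kernels.
centers = np.linspace(0.0, 1.0, 4)
weights = np.array([0.5, -0.2, 0.3, 0.1])
finer = refine(centers)
new_weights = project(weights, centers, finer)
```

Because the refined center set contains the old centers, the projected expansion reproduces the old function essentially exactly, so evolution can resume in the larger space without losing progress.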


Keywords: Trajectory planning · Collision avoidance · Robotics · Kernels · Nested function spaces · Evolution strategy

Copyright information

© Springer-Verlag 2012

Authors and Affiliations

  • Tobias Glasmachers (1)
  • Jan Koutník (2)
  • Jürgen Schmidhuber (2)

  1. Institute for Neural Computation, Ruhr-University Bochum, Universitätsstr. 150, Bochum, Germany
  2. IDSIA, University of Lugano and SUPSI, Manno-Lugano, Switzerland
