Local Meta-models for Optimization Using Evolution Strategies

  • Stefan Kern
  • Nikolaus Hansen
  • Petros Koumoutsakos
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4193)


We employ local meta-models to enhance the efficiency of evolution strategies in the optimization of computationally expensive problems. The method combines second-order local regression meta-models with the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). Experiments on benchmark problems demonstrate that the proposed meta-models can reliably account for the ranking of the offspring population, resulting in significant computational savings. The results show that the use of local meta-models significantly increases the efficiency of already competitive evolution strategies.
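The core ingredient described in the abstract is a locally weighted quadratic regression model, fitted around each query point from an archive of already-evaluated solutions and used to predict (and hence rank) offspring cheaply. The sketch below illustrates this idea under assumptions of ours: the neighbor count `k`, the biquadratic weighting kernel, and all function names are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def local_quadratic_model(x_query, X, y, k=None):
    """Predict fitness at x_query with a locally weighted quadratic model.

    X: (n, d) array of previously evaluated points; y: (n,) true fitness values.
    Illustrative sketch only; kernel and neighborhood size are assumptions.
    """
    n, d = X.shape
    if k is None:
        # use roughly twice as many neighbors as free model parameters
        k = min(n, 2 * (d * (d + 3) // 2 + 1))
    # select the k nearest archive points around the query
    dist = np.linalg.norm(X - x_query, axis=1)
    idx = np.argsort(dist)[:k]
    Xn, yn, dn = X[idx], y[idx], dist[idx]
    # biquadratic kernel weights; bandwidth = distance to the farthest neighbor
    h = dn[-1] + 1e-12
    w = (1.0 - (dn / h) ** 2) ** 2
    # full quadratic feature map centered at the query: 1, z_i, z_i * z_j
    def features(x):
        z = x - x_query
        quad = np.outer(z, z)[np.triu_indices(d)]
        return np.concatenate(([1.0], z, quad))
    Phi = np.array([features(x) for x in Xn])
    sw = np.sqrt(w)
    # weighted least-squares fit of the quadratic coefficients
    beta, *_ = np.linalg.lstsq(sw[:, None] * Phi, sw * yn, rcond=None)
    return beta[0]  # model value at the query point (z = 0)

# ranking offspring by model prediction instead of true evaluation
rng = np.random.default_rng(0)
f = lambda x: float(np.sum(x ** 2))          # stand-in expensive objective
X = rng.normal(size=(60, 2))
y = np.array([f(x) for x in X])              # archive of true evaluations
offspring = rng.normal(size=(8, 2))
preds = np.array([local_quadratic_model(x, X, y) for x in offspring])
order = np.argsort(preds)                    # predicted ranking, no new f-calls
```

On a purely quadratic objective like the sphere above, the local model recovers the function exactly in the neighborhood, so the predicted ranking matches the true one; in a meta-model-assisted ES this ranking decides which offspring (if any) receive true evaluations.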


Keywords: Local Model, Evaluation Fraction, Meta Model, Multimodal Function, Bandwidth Parameter





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Stefan Kern¹
  • Nikolaus Hansen¹
  • Petros Koumoutsakos¹

  1. Computational Science and Engineering Laboratory, Institute of Computational Science, ETH Zurich, Switzerland
