Evolution Strategies

  • Thomas Bäck
  • Christophe Foussette
  • Peter Krause
Chapter
Part of the Natural Computing Series book series (NCS)

Abstract

Before the particular algorithms are presented in Sect. 2.2, Sect. 2.1 introduces the general foundations of evolution strategies. To begin, Sect. 2.1.1 defines the notion of an optimization task as it is used throughout this book. Following [58], Sect. 2.1.2 then discusses the evolution strategy metaheuristic as a special case of evolutionary algorithms.
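The abstract describes evolution strategies only at the level of a section overview. As an illustrative sketch that is not taken from the chapter itself, the following minimal (1+1)-ES with Rechenberg's 1/5th success rule minimizes the sphere function; the function names, the adaptation interval, and the step-size factors are illustrative choices, not the book's notation:

```python
import random

def sphere(x):
    """Sphere objective: f(x) = sum of squares; global minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def one_plus_one_es(f, x0, sigma=1.0, iterations=2000, seed=0):
    """Minimal (1+1)-ES with Rechenberg's 1/5th success rule for step-size control."""
    rng = random.Random(seed)
    parent, f_parent = list(x0), f(x0)
    successes = 0
    for t in range(1, iterations + 1):
        # Mutation: add isotropic, normally distributed noise to the parent.
        child = [xi + sigma * rng.gauss(0.0, 1.0) for xi in parent]
        f_child = f(child)
        # Plus-selection: the child replaces the parent only if it is no worse.
        if f_child <= f_parent:
            parent, f_parent = child, f_child
            successes += 1
        # 1/5th success rule: every 10 iterations, enlarge sigma if more than
        # 1/5 of the mutations succeeded, otherwise shrink it.
        if t % 10 == 0:
            sigma *= 1.22 if successes / 10 > 0.2 else 0.82
            successes = 0
    return parent, f_parent

best, f_best = one_plus_one_es(sphere, [5.0] * 10)
```

Covariance matrix adaptation, the subject of several of the references below, replaces the isotropic mutation in this sketch with samples from a full multivariate normal distribution whose covariance matrix is learned during the search.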

Keywords

Covariance Matrix, Evolution Strategy, Strategy Parameter, Multivariate Normal Distribution, Cholesky Factor

Bibliography

  1. S. Amari, Natural gradient works efficiently in learning. Neural Comput. 10(2), 251–276 (1998)
  2. D.V. Arnold, N. Hansen, Active covariance matrix adaptation for the (1+1)-CMA-ES, in Proceedings of the 12th Annual Conference on Genetic and Evolutionary Computation (GECCO’10), Portland, ed. by M. Pelikan, J. Branke (ACM, New York, 2010), pp. 385–392
  3. D.V. Arnold, R. Salomon, Evolutionary gradient search revisited. IEEE Trans. Evol. Comput. 11(4), 480–495 (2007)
  4. A. Auger, N. Hansen, Performance evaluation of an advanced local search evolutionary algorithm, in Proceedings of the IEEE Congress on Evolutionary Computation (CEC’05), Edinburgh, vol. 2, ed. by B. McKay et al. (IEEE, Piscataway, 2005), pp. 1777–1784
  5. A. Auger, N. Hansen, A restart CMA evolution strategy with increasing population size, in Proceedings of the IEEE Congress on Evolutionary Computation (CEC’05), Edinburgh, vol. 2, ed. by B. McKay et al. (IEEE, Piscataway, 2005), pp. 1769–1776
  6. A. Auger, M. Schoenauer, N. Vanhaecke, LS-CMA-ES: a second-order algorithm for covariance matrix adaptation, in Proceedings of the 8th International Conference on Parallel Problem Solving from Nature (PPSN VIII), Birmingham, ed. by X. Yao et al. Volume 3242 of Lecture Notes in Computer Science (Springer, Berlin, 2004), pp. 182–191
  7. A. Auger, D. Brockhoff, N. Hansen, Mirrored sampling in evolution strategies with weighted recombination, in Proceedings of the 13th Annual Genetic and Evolutionary Computation Conference (GECCO’11), Dublin, ed. by N. Krasnogor, P.L. Lanzi (ACM, New York, 2011), pp. 861–868
  8. T. Bäck, Evolutionary Algorithms in Theory and Practice (Oxford University Press, New York, 1996)
  11. T. Bartz-Beielstein, C. Lasarczyk, M. Preuss, Sequential parameter optimization, in Proceedings of the IEEE Congress on Evolutionary Computation (CEC’05), Edinburgh, ed. by B. McKay et al. (IEEE, Piscataway, 2005), pp. 773–780
  12. N. Beume, B. Naujoks, M. Emmerich, SMS-EMOA: multiobjective selection based on dominated hypervolume. Eur. J. Oper. Res. 181, 1653–1669 (2007)
  13. H.-G. Beyer, B. Sendhoff, Covariance matrix adaptation revisited – the CMSA evolution strategy, in Proceedings of the 10th International Conference on Parallel Problem Solving from Nature (PPSN X), Dortmund, ed. by G. Rudolph et al. Volume 5199 of Lecture Notes in Computer Science (Springer, Berlin, 2008), pp. 123–132
  15. Z. Bouzarkouna, A. Auger, D.-Y. Ding, Local-meta-model CMA-ES for partially separable functions, in Proceedings of the 13th Annual Genetic and Evolutionary Computation Conference (GECCO’11), Dublin, ed. by N. Krasnogor et al. (ACM, New York, 2011), pp. 869–876
  16. D. Brockhoff, A. Auger, N. Hansen, D.V. Arnold, T. Hohm, Mirrored sampling and sequential selection for evolution strategies, in Proceedings of the 11th International Conference on Parallel Problem Solving from Nature (PPSN XI), Kraków, ed. by R. Schaefer et al. Volume 6238 of Lecture Notes in Computer Science (Springer, Berlin, 2010), pp. 11–21
  17. I.N. Bronstein, K.A. Semendjajew, G. Musiol, H. Muehlig, Taschenbuch der Mathematik, 7th edn. (Harri Deutsch, Frankfurt am Main, 2008)
  18. C.A. Coello Coello, Constraint-handling techniques used with evolutionary algorithms, in Proceedings of the 13th Annual Genetic and Evolutionary Computation Conference (GECCO’11), Companion Material, Dublin, ed. by N. Krasnogor et al. (ACM, New York, 2011), pp. 1137–1160
  20. K. Deb, Multiobjective Optimization Using Evolutionary Algorithms. Wiley-Interscience Series in Systems and Optimization (Wiley, Chichester, 2001)
  21. K. Deb, A. Pratap, S. Agarwal, T. Meyarivan, A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans. Evol. Comput. 6(2), 182–197 (2002)
  25. A.E. Eiben, J.E. Smith, Introduction to Evolutionary Computing. Natural Computing Series (Springer, Berlin, 2003)
  26. T. Glasmachers, T. Schaul, Y. Sun, D. Wierstra, J. Schmidhuber, Exponential natural evolution strategies, in Proceedings of the 12th Annual Conference on Genetic and Evolutionary Computation (GECCO’10), Portland, ed. by M. Pelikan, J. Branke (ACM, New York, 2010)
  27. D.E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning (Addison-Wesley, Boston, 1989)
  28. W.H. Greene, Econometric Analysis, 4th edn. (Prentice Hall, Upper Saddle River, 1997)
  29. N. Hansen, The CMA evolution strategy: a tutorial. Continuously updated technical report, available via http://www.lri.fr/~hansen/cmatutorial.pdf. Accessed 12 Mar 2011
  30. N. Hansen, S. Kern, Evaluating the CMA evolution strategy on multimodal test functions, in Proceedings of the 8th International Conference on Parallel Problem Solving from Nature (PPSN VIII), Birmingham, ed. by X. Yao et al. Volume 3242 of Lecture Notes in Computer Science (Springer, Berlin, 2004), pp. 282–291
  31. N. Hansen, A. Ostermeier, Adapting arbitrary normal mutation distributions in evolution strategies: the covariance matrix adaptation, in Proceedings of the 1996 IEEE International Conference on Evolutionary Computation (ICEC’96), Nagoya, ed. by Y. Davidor et al. (IEEE, Piscataway, 1996), pp. 312–317
  32. N. Hansen, A. Ostermeier, Completely derandomized self-adaptation in evolution strategies. Evol. Comput. 9(2), 159–195 (2001)
  33. N. Hansen, A. Ostermeier, A. Gawelczyk, On the adaptation of arbitrary normal mutation distributions in evolution strategies: the generating set adaptation, in Proceedings of the 6th International Conference on Genetic Algorithms (ICGA 6), Pittsburgh, ed. by L.J. Eshelman (Morgan Kaufmann, San Francisco, 1995), pp. 57–64
  35. N. Hansen, A. Auger, R. Ros, S. Finck, P. Posik, Comparing results of 31 algorithms from the black-box optimization benchmarking BBOB-2009, in Proceedings of the 12th Annual Conference on Genetic and Evolutionary Computation (GECCO’10), Companion Material, Portland, ed. by M. Pelikan, J. Branke (ACM, New York, 2010), pp. 1689–1696
  38. C. Igel, T. Suttorp, N. Hansen, A computational efficient covariance matrix update and a (1+1)-CMA for evolution strategies, in Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation (GECCO’06), Seattle, ed. by M. Keijzer et al. (ACM, New York, 2006), pp. 453–460
  39. G.A. Jastrebski, Improving evolution strategies through active covariance matrix adaptation. Master’s thesis, Faculty of Computer Science, Dalhousie University, 2005
  40. G.A. Jastrebski, D.V. Arnold, Improving evolution strategies through active covariance matrix adaptation, in Proceedings of the 2006 IEEE Congress on Evolutionary Computation (CEC’06), Vancouver, ed. by G.G. Yen et al. (IEEE, Piscataway, 2006), pp. 2814–2821
  42. O. Kramer, A review of constraint-handling techniques for evolution strategies. Appl. Comput. Intell. Soft Comput. 2010, 1–11 (2010)
  43. R. Li, Mixed-integer evolution strategies for parameter optimization and their applications to medical image analysis. PhD thesis, Leiden Institute of Advanced Computer Science (LIACS), Faculty of Science, Leiden University, 2009
  44. D.G. Luenberger, Y. Ye, Linear and Nonlinear Programming, 2nd edn. (Springer, Berlin, 2003)
  46. S.D. Müller, N. Hansen, P. Koumoutsakos, Increasing the serial and the parallel performance of the CMA-evolution strategy with large populations, in Proceedings of the 7th International Conference on Parallel Problem Solving from Nature (PPSN VII), Granada, ed. by J.J. Merelo et al. Volume 2439 of Lecture Notes in Computer Science (Springer, Berlin, 2002), pp. 422–431
  47. A. Ostermeier, A. Gawelczyk, N. Hansen, A derandomized approach to self-adaptation of evolution strategies. Evol. Comput. 2(4), 369–380 (1994)
  48. A. Ostermeier, A. Gawelczyk, N. Hansen, Step-size adaptation based on non-local use of selection information, in Proceedings of the 3rd International Conference on Parallel Problem Solving from Nature (PPSN III), Jerusalem, ed. by Y. Davidor et al. Volume 866 of Lecture Notes in Computer Science (Springer, Berlin, 1994), pp. 189–198
  52. I. Rechenberg, Evolutionsstrategie: Optimierung Technischer Systeme nach Prinzipien der biologischen Evolution (Frommann-Holzboog, Stuttgart, 1973)
  53. I. Rechenberg, Evolutionsstrategie’94 (Frommann-Holzboog, Stuttgart, 1994)
  54. R. Ros, N. Hansen, A simple modification in CMA-ES achieving linear time and space complexity, in Proceedings of the 10th International Conference on Parallel Problem Solving from Nature (PPSN X), Dortmund, ed. by G. Rudolph et al. Volume 5199 of Lecture Notes in Computer Science (Springer, Berlin, 2008), pp. 296–305
  55. G. Rudolph, On correlated mutations in evolution strategies, in Proceedings of the 2nd International Conference on Parallel Problem Solving from Nature (PPSN II), Brussels, ed. by R. Männer, B. Manderick (Elsevier, Amsterdam, 1992), pp. 105–114
  56. G. Rudolph, An evolutionary algorithm for integer programming, in Proceedings of the 3rd International Conference on Parallel Problem Solving from Nature (PPSN III), Jerusalem, ed. by Y. Davidor et al. Volume 866 of Lecture Notes in Computer Science (Springer, Berlin, 1994), pp. 63–66
  57. G. Rudolph, Convergence Properties of Evolutionary Algorithms (Kovač, Hamburg, 1997)
  58. G. Rudolph, Evolutionary strategies, in Handbook of Natural Computing, ed. by G. Rozenberg, T. Bäck, J.N. Kok (Springer, Berlin, 2012)
  59. H.-P. Schwefel, Kybernetische Evolution als Strategie der experimentellen Forschung in der Strömungstechnik. Diplomarbeit, Technische Universität Berlin, Hermann Föttinger–Institut für Strömungstechnik, 1964
  61. H.-P. Schwefel, Numerische Optimierung von Computer-Modellen Mittels der Evolutionsstrategie (Birkhäuser, Basel, 1977)
  62. H.-P. Schwefel, Numerical Optimization of Computer Models (Wiley, Chichester, 1981)
  63. O.M. Shir, Niching in Derandomized Evolution Strategies and its Applications in Quantum Control. PhD thesis, University of Leiden, The Netherlands, 2008
  64. A. Stuart, K. Ord, S. Arnold, Kendall’s Advanced Theory of Statistics, Classical Inference and the Linear Model. Volume 2 of Kendall’s Library of Statistics (Wiley, Chichester, 2009)
  66. Y. Sun, D. Wierstra, T. Schaul, J. Schmidhuber, Efficient natural evolution strategies, in Proceedings of the 11th Annual Conference on Genetic and Evolutionary Computation (GECCO’09), Shanghai, ed. by F. Rothlauf et al. (ACM, New York, 2009), pp. 539–546
  67. Y. Sun, D. Wierstra, T. Schaul, J. Schmidhuber, Stochastic search using the natural gradient, in Proceedings of the 26th Annual International Conference on Machine Learning (ICML’09), Montreal, ed. by A. Pohoreckyj Danyluk et al. (ACM, New York, 2009), pp. 1161–1168
  68. B. Tang, Orthogonal array-based latin hypercubes. J. Am. Stat. Assoc. 88(424), 1392–1397 (1993)
  70. S. Wessing, M. Preuss, G. Rudolph, When parameter tuning actually is parameter control, in Proceedings of the 13th Annual Conference on Genetic and Evolutionary Computation (GECCO’11), Dublin, ed. by N. Krasnogor et al. (ACM, New York, 2011), pp. 821–828
  71. D. Wierstra, T. Schaul, J. Peters, J. Schmidhuber, Natural evolution strategies, in Proceedings of the IEEE Congress on Evolutionary Computation (CEC’08), Hong Kong (IEEE, Piscataway, 2008), pp. 3381–3387

Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Thomas Bäck (1)
  • Christophe Foussette (2)
  • Peter Krause (2)
  1. Leiden University, Leiden, The Netherlands
  2. divis intelligent solutions GmbH, Dortmund, Germany