Evolution Strategies

  • Chapter in: Contemporary Evolution Strategies

Abstract

Before the particular algorithms are presented in Sect. 2.2, the more general foundations of evolution strategies are introduced in Sect. 2.1. To start with, Sect. 2.1.1 gives the definition of an optimization task as used throughout this book. Following [58], Sect. 2.1.2 discusses evolution strategy metaheuristics as a special case of evolutionary algorithms.

Notes

  1.

    This statement, however, is not meant to support the myth mentioned explicitly by Rudolph [58]: “Since early theoretical publications mainly analyzed simple ES without recombination, somehow the myth arose that ES put more emphasis on mutation than on recombination: This is a fatal misconception! Recombination has been an important ingredient of ES from the early beginning and this is still valid today.”

  2.

    See Sect. 12.2.1 in [17] for the definition of a distance measure.

  3.

    In the case of the (1+1)-ES the strategy parameters may be assigned to the algorithm itself instead of the individual, because only one set of strategy parameters is needed. This also holds for any strategy parameters which are not needed on the individual level (for example the covariance matrix of the CMA-ES).
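The point about the (1+1)-ES can be made concrete with a minimal sketch (an illustration only, not the book's exact formulation; the function name and parameter defaults are invented): the single step size sigma lives in the algorithm, not in any individual.

```python
import random

def one_plus_one_es(f, x, sigma=1.0, generations=200, seed=0):
    """Minimal (1+1)-ES sketch: one parent, one offspring per generation,
    and one global strategy parameter (the step size sigma) held by the
    algorithm itself rather than by the individual."""
    rng = random.Random(seed)
    fx = f(x)
    for _ in range(generations):
        # Mutate every component with an isotropic normal perturbation.
        y = [xi + sigma * rng.gauss(0.0, 1.0) for xi in x]
        fy = f(y)
        if fy <= fx:  # plus-selection: keep the better of parent and offspring
            x, fx = y, fy
    return x, fx

# Usage: minimize a simple sphere function in three dimensions.
best, value = one_plus_one_es(lambda v: sum(t * t for t in v), [5.0, -3.0, 2.0])
```

Because of the elitist plus-selection, the best objective value can never get worse over the generations.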

  4.

    Algorithm 3 in [58].

  5.

    The normal distribution achieves maximum entropy among all distributions on the real domain with given mean and variance. (See [64] for more details.)

  6.

    A symmetric matrix \(\mathbf{A} \in {\mathbb{R}}^{n\times n}\) is positive definite iff \({\mathbf{x}}^{T}\mathbf{A}\mathbf{x} > 0\) for all \(\mathbf{x} \in {\mathbb{R}}^{n}\setminus \{\mathbf{0}\}\) [17].
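In practice the definition above is rarely checked by testing all vectors x; a standard numerical equivalent is to attempt a Cholesky factorization, which exists exactly for symmetric positive definite matrices. A small sketch (assuming NumPy is available; the matrices are made-up examples):

```python
import numpy as np

def is_positive_definite(a):
    """A symmetric matrix is positive definite iff its Cholesky
    factorization exists (equivalently, all eigenvalues are > 0)."""
    try:
        np.linalg.cholesky(a)
        return True
    except np.linalg.LinAlgError:
        return False

cov = np.array([[2.0, 0.5], [0.5, 1.0]])  # valid covariance matrix
bad = np.array([[1.0, 2.0], [2.0, 1.0]])  # indefinite: eigenvalues 3 and -1
```

This is relevant for ES because a mutation covariance matrix must stay positive definite to define a valid multivariate normal distribution.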

  7.

    For an orthogonal matrix A, \(\mathbf{A}{\mathbf{A}}^{T} ={ \mathbf{A}}^{T}\mathbf{A} = \mathbf{I}\) holds.
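The orthogonality property can be verified directly on a rotation matrix, the standard example of an orthogonal matrix (a sketch assuming NumPy; the angle is arbitrary):

```python
import numpy as np

# A 2-D rotation matrix is orthogonal: its transpose is its inverse,
# so both A A^T and A^T A equal the identity.
theta = 0.3
a = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
```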

  8.

    See Sect. 6.2.2.3 in [17].

  9.

    The rectangular corridor model according to [8]: \(f_{1}(\mathbf{x}) = c_{0} + c_{1} \cdot x_{1}\) if the constraints \(g_{j}(\mathbf{x}): x_{j} \leq b\) with \(b \in {\mathbb{R}}^{+}\) for \(j \in \{2,\ldots,n\}\) are fulfilled, and \(f_{1}(\mathbf{x}) = \infty\) otherwise.

  10.

    The sphere model according to [8]: \(f_{2}(\mathbf{x}) = c_{0} + c_{1} \cdot \sum _{i=1}^{n}{(x_{i} - x_{i}^{{\ast}})}^{2}\).

  11.

    The exact values are 0.184 and 0.2025 for the corridor and sphere models, respectively [8].
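These success probabilities motivate Rechenberg's 1/5 success rule for step size control. A hedged sketch on the sphere model (a common textbook variant, not necessarily the book's exact formulation; function name, adaptation constant c, and adaptation interval are invented):

```python
import random

def one_fifth_rule_es(f, x, sigma=1.0, generations=500, c=0.85, seed=1):
    """(1+1)-ES with 1/5 success rule: increase sigma when more than
    1/5 of recent mutations were successful, decrease it otherwise."""
    rng = random.Random(seed)
    fx, successes = f(x), 0
    for g in range(1, generations + 1):
        y = [xi + sigma * rng.gauss(0.0, 1.0) for xi in x]
        fy = f(y)
        if fy <= fx:  # success: the offspring is at least as good
            x, fx, successes = y, fy, successes + 1
        if g % 10 == 0:  # adapt the step size every 10 generations
            sigma *= (1 / c) if successes / 10 > 0.2 else c
            successes = 0
    return fx

# Sphere model f_2 with optimum at the origin (c_0 = 0, c_1 = 1 assumed).
final = one_fifth_rule_es(lambda v: sum(t * t for t in v), [10.0] * 5)
```

The rule keeps the empirical success rate near 1/5, between the optimal values quoted above for the corridor and sphere models.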

  12.

    MSC is an abbreviation of mutative self-adaptation of covariances.

  13.

    In the original publication it is called (1,λ)-ES with derandomized mutative step size.

  14.

    This way, adapting the step size by a factor ξ requires at least 1∕β > 1 generations.
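This bound follows because the per-generation change of the step size is damped by the exponent β < 1: a single generation can change σ by at most a factor ξ^β, so an overall factor ξ needs at least 1/β generations. A small numeric check (the values of ξ and β are made up for illustration):

```python
import math

xi, beta = 2.0, 0.2          # hypothetical target factor and damping exponent
per_generation = xi ** beta  # largest multiplicative change in one generation

# Generations needed to accumulate the overall factor xi: log(xi) / log(xi^beta),
# which simplifies to exactly 1/beta regardless of xi.
generations_needed = math.log(xi) / math.log(per_generation)
```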

  15.

    In the original paper, the algorithm is called (1,λ)-ES with derandomized mutative step size control using accumulated information.

  16.

    The column vectors of the matrix B form a so-called generating set, which motivates the terminology generating set adaptation.

  17.

    According to [32], the suggestion to use weighted recombination within the CMA-ES is due to Ingo Rechenberg, based on personal communication in 1998.

  18.

    See [17]: \(\Gamma (n) =\int _{ 0}^{\infty }{x}^{n-1}\exp (-x)\,\mbox{ d}x\).
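For positive integers the integral reduces to \(\Gamma (n) = (n-1)!\); Python's standard library exposes the function directly:

```python
import math

# math.gamma implements the Gamma function; for positive integers n,
# Gamma(n) = (n-1)!.
values = [math.gamma(n) for n in (1, 2, 3, 4, 5)]
```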

  19.

    With the additional condition for A to consist of at least \(m = {n}^{2}\) tuples.

  20.

    Compare Sect. 19.2.1.2 in [17].

  21.

    The term active is motivated by the fact that specifically the bad offspring individuals play an active role, although they would normally not be taken into account after selection has been applied.

  22.

    This is explicitly avoided due to the occurrence of numerical instabilities for certain objective functions; see [40].

  23.

    Population sizes μ and λ are not counted.

  24.

    Instead of the term symmetrical, this is called mirrored in the context of this strategy.
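The mirrored idea can be sketched as follows (an illustration only; the function name and parameters are invented): each sampled mutation vector z is used twice, once added to and once subtracted from the parent.

```python
import random

def mirrored_offspring(mean, sigma, pairs, seed=3):
    """Mirrored sampling sketch: each normal mutation vector z yields
    two offspring, mean + sigma*z and its mirror image mean - sigma*z."""
    rng = random.Random(seed)
    offspring = []
    for _ in range(pairs):
        z = [rng.gauss(0.0, 1.0) for _ in mean]
        offspring.append([m + sigma * zi for m, zi in zip(mean, z)])
        offspring.append([m - sigma * zi for m, zi in zip(mean, z)])
    return offspring

pop = mirrored_offspring([0.0, 0.0], 1.0, pairs=3)
```

Each mirrored pair is symmetric about the parent, which reduces the variance of the selection step compared to fully independent sampling.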

  25.

    In [26] the eNES are called exact natural evolution strategies.

  26.

    The aforementioned techniques, self-adaptation (see Sect. 2.2.1.2) and cumulative step size adaptation (see Sect. 2.2.2.1), are suitable.

  27.

    See [70] for literature references on these topics as well as the Kriging modeling method.

  28.

    For \(\lambda _{\mathit{def}}\) the standard setting of a \((\mu _{W},\lambda )\)-CMA-ES with \(\lambda _{\mathit{def}} = 4 + \lfloor 3\log n\rfloor \) is used.
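The default offspring number can be computed directly (here log denotes the natural logarithm, as is standard for this CMA-ES setting):

```python
import math

def default_lambda(n):
    """Standard CMA-ES default offspring population size:
    lambda_def = 4 + floor(3 * ln(n)) for search space dimension n."""
    return 4 + math.floor(3 * math.log(n))

# Default population sizes for a few dimensions.
sizes = {n: default_lambda(n) for n in (2, 10, 100)}
```

The slow logarithmic growth keeps the population small even in high dimensions.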

  29.

    In principle, any modeling technique can be used to establish the relationship between the exogenous parameters and the performance measure.

  30.

    For example the NP-hard Traveling Salesman Problem.

Bibliography

  1. S. Amari, Natural gradient works efficiently in learning. Neural Comput. 10(2), 251–276 (1998)

  2. D.V. Arnold, N. Hansen, Active covariance matrix adaptation for the (1+1)-CMA-ES, in Proceedings of the 12th Annual Conference on Genetic and Evolutionary Computation (GECCO’10), Portland, ed. by M. Pelikan, J. Branke (ACM, New York, 2010), pp. 385–392

  3. D.V. Arnold, R. Salomon, Evolutionary gradient search revisited. IEEE Trans. Evol. Comput. 11(4), 480–495 (2007)

  4. A. Auger, N. Hansen, Performance evaluation of an advanced local search evolutionary algorithm, in Proceedings of the IEEE Congress on Evolutionary Computation (CEC’05), Edinburgh, vol. 2, ed. by B. McKay et al. (IEEE, Piscataway, 2005), pp. 1777–1784

  5. A. Auger, N. Hansen, A restart CMA evolution strategy with increasing population size, in Proceedings of the IEEE Congress on Evolutionary Computation (CEC’05), Edinburgh, vol. 2, ed. by B. McKay et al. (IEEE, Piscataway, 2005), pp. 1769–1776

  6. A. Auger, M. Schoenauer, N. Vanhaecke, LS-CMA-ES: a second-order algorithm for covariance matrix adaptation, in Proceedings of the 8th International Conference on Parallel Problem Solving from Nature (PPSN VIII), Birmingham, ed. by X. Yao et al. Volume 3242 of Lecture Notes in Computer Science (Springer, Berlin, 2004), pp. 182–191

  7. A. Auger, D. Brockhoff, N. Hansen, Mirrored sampling in evolution strategies with weighted recombination, in Proceedings of the 13th Annual Genetic and Evolutionary Computation Conference (GECCO’11), Dublin, ed. by N. Krasnogor, P.L. Lanzi (ACM, New York, 2011), pp. 861–868

  8. T. Bäck, Evolutionary Algorithms in Theory and Practice (Oxford University Press, New York, 1996)

  9. T. Bartz-Beielstein, C. Lasarczyk, M. Preuss, Sequential parameter optimization, in Proceedings of the IEEE Congress on Evolutionary Computation (CEC’05), Edinburgh, ed. by B. McKay et al. (IEEE, Piscataway, 2005), pp. 773–780

  10. N. Beume, B. Naujoks, M. Emmerich, SMS-EMOA: multiobjective selection based on dominated hypervolume. Eur. J. Oper. Res. 181, 1653–1669 (2007)

  11. H.-G. Beyer, B. Sendhoff, Covariance matrix adaptation revisited – the CMSA evolution strategy, in Proceedings of the 10th International Conference on Parallel Problem Solving from Nature (PPSN X), Dortmund, ed. by G. Rudolph et al. Volume 5199 in Lecture Notes in Computer Science (Springer, Berlin, 2008), pp. 123–132

  12. Z. Bouzarkouna, A. Auger, D.-Y. Ding, Local-meta-model CMA-ES for partially separable functions, in Proceedings of the 13th Annual Genetic and Evolutionary Computation Conference (GECCO’11), Dublin, ed. by N. Krasnogor et al. (ACM, New York, 2011), pp. 869–876

  13. D. Brockhoff, A. Auger, N. Hansen, D.V. Arnold, T. Hohm, Mirrored sampling and sequential selection for evolution strategies, in Proceedings of the 11th International Conference on Parallel Problem Solving from Nature (PPSN XI), Kraków, ed. by R. Schaefer et al. Volume 6238 in Lecture Notes in Computer Science. (Springer, Berlin, 2010), pp. 11–21

  14. I.N. Bronstein, K.A. Semendjajew, G. Musiol, H. Muehlig, Taschenbuch der Mathematik, 7th edn. (Harri Deutsch, Frankfurt am Main, 2008)

  15. C.A. Coello Coello, Constraint-handling techniques used with evolutionary algorithms, in Proceedings of the 13th Annual Genetic and Evolutionary Computation Conference (GECCO’11), Companion Material, Dublin, ed. by N. Krasnogor et al. (ACM, New York, 2011), pp. 1137–1160

  16. K. Deb, Multiobjective Optimization Using Evolutionary Algorithms. Wiley-Interscience Series in Systems and Optimization (Wiley, Chichester, 2001)

  17. K. Deb, A. Pratap, S. Agarwal, T. Meyarivan, A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans. Evol. Comput. 6(2), 182–197 (2002)

  18. A.E. Eiben, J.E. Smith, Introduction to Evolutionary Computing. Natural Computing Series (Springer, Berlin, 2003)

  19. T. Glasmachers, T. Schaul, Y. Sun, D. Wierstra, J. Schmidhuber, Exponential natural evolution strategies, in Proceedings of the 12th Annual Conference on Genetic and Evolutionary Computation (GECCO’10), Portland, ed. by M. Pelikan, J. Branke (ACM, New York, 2010)

  20. D.E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning (Addison-Wesley, Boston, 1989)

  21. W.H. Greene, Econometric Analysis, 4th edn. (Prentice Hall, Upper Saddle River, 1997)

  22. N. Hansen, The CMA evolution strategy: a tutorial. Continuously updated technical report, available via http://www.lri.fr/~hansen/cmatutorial.pdf. Accessed 12 Mar 2011

  23. N. Hansen, S. Kern, Evaluating the CMA evolution strategy on multimodal test functions, in Proceedings of the 9th International Conference on Parallel Problem Solving from Nature (PPSN VIII), Birmingham. Volume 3242 of Lecture Notes in Computer Science, ed. by X. Yao et al. (Springer, 2004), pp. 282–291

  24. N. Hansen, A. Ostermeier, Adapting arbitrary normal mutation distributions in evolution strategies: the covariance matrix adaptation, in Proceedings of the 1996 IEEE International Conference on Evolutionary Computation (ICEC’96), Nagoya, ed. by Y. Davidor et al. (IEEE, Piscataway, 1996), pp. 312–317

  25. N. Hansen, A. Ostermeier, Completely derandomized self-adaptation in evolution strategies. Evol. Comput. 9(2), 159–195 (2001)

  26. N. Hansen, A. Ostermeier, A. Gawelczyk, On the adaptation of arbitrary normal mutation distributions in evolution strategies: the generating set adaptation, in Proceedings of the 6th International Conference on Genetic Algorithms (ICGA 6), Pittsburgh, ed. by L.J. Eshelman (Morgan Kaufmann, San Francisco, 1995), pp. 57–64

  27. N. Hansen, A. Auger, R. Ros, S. Finck, P. Posik, Comparing results of 31 algorithms from the black-box optimization benchmarking BBOB-2009, in Proceedings of the 12th International Conference on Genetic and Evolutionary Computation Conference (GECCO’10), Companion Material, Portland, ed. by M. Pelikan, J. Branke (ACM, New York, 2010), pp. 1689–1696

  28. C. Igel, T. Suttorp, N. Hansen, A computational efficient covariance matrix update and a (1+1)-CMA for evolution strategies, in Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation (GECCO’06), Seattle, ed. by M. Keijzer et al. (ACM, New York, 2006), pp. 453–460

  29. G.A. Jastrebski, Improving evolution strategies through active covariance matrix adaptation. Master’s thesis, Faculty of Computer Science, Dalhousie University, 2005

  30. G.A. Jastrebski, D.V. Arnold, Improving evolution strategies through active covariance matrix adaptation, in Proceedings of the 2006 IEEE Congress on Evolutionary Computation (CEC’06), Vancouver, BC, Canada, ed. by G.G. Yen et al. (IEEE, Piscataway, 2006), pp. 2814–2821

  31. O. Kramer, A review of constraint-handling techniques for evolution strategies. Appl Comput. Int. Soft Comput. 2010, 1–11 (2010)

  32. R. Li, Mixed-integer evolution strategies for parameter optimization and their applications to medical image analysis. PhD thesis, Leiden Institute of Advanced Computer Science (LIACS), Faculty of Science, Leiden University, 2009

  33. D.G. Luenberger, Y. Ye, Linear and Nonlinear Programming, 2nd edn. (Springer, Berlin, 2003)

  34. S.D. Müller, N. Hansen, P. Koumoutsakos, Increasing the serial and the parallel performance of the CMA-evolution strategy with large populations, in Proceedings of the 7th International Conference on Parallel Problem Solving from Nature (PPSN VII), Granada, ed. by J.J. Merelo et al. Volume 2439 of Lecture Notes in Computer Science (Springer, Berlin, 2002), pp. 422–431

  35. A. Ostermeier, A. Gawelczyk, N. Hansen, A derandomized approach to self adaptation of evolution strategies. Evol. Comput. 2(4), 369–380 (1994)

  36. A. Ostermeier, A. Gawelczyk, N. Hansen, Step-size adaptation based on non-local use of selection information, in Proceedings of the 3rd International Conference on Parallel Problem Solving from Nature (PPSN III), Jerusalem, ed. by Y. Davidor et al. Volume 866 of Lecture Notes in Computer Science (Springer, Berlin, 1994), pp. 189–198

  37. I. Rechenberg, Evolutionsstrategie: Optimierung Technischer Systeme nach Prinzipien der biologischen Evolution (Frommann-Holzboog, Stuttgart, 1973)

  38. I. Rechenberg, Evolutionsstrategie’94 (Frommann-Holzboog, Stuttgart, 1994)

  39. R. Ros, N. Hansen, A simple modification in CMA-ES achieving linear time and space complexity, in Proceedings of the 10th International Conference on Parallel Problem Solving from Nature (PPSN X), Dortmund, ed. by G. Rudolph et al. Volume 5199 of Lecture Notes in Computer Science (Springer, Berlin, 2008), pp. 296–305

  40. G. Rudolph, On correlated mutations in evolution strategies, in Proceedings of the 2nd International Conference on Parallel Problem Solving from Nature (PPSN II), Brussels, ed. by R. Männer, B. Manderick (Elsevier, Amsterdam, 1992), pp. 105–114

  41. G. Rudolph, An evolutionary algorithm for integer programming, in Proceedings of the 3rd Conference on Parallel Problem Solving from Nature (PPSN III), Jerusalem, ed. by Y. Davidor et al. Volume 866 of Lecture Notes in Computer Science (Springer, Berlin, 1994), pp. 63–66

  42. G. Rudolph, Convergence Properties of Evolutionary Algorithms (Kovač, Hamburg, 1997)

  43. G. Rudolph, Evolutionary strategies, in Handbook of Natural Computing, ed. by G. Rozenberg, T. Bäck, J.N. Kok (Springer, Berlin, 2012)

  44. H.-P. Schwefel, Kybernetische Evolution als Strategie der experimentellen Forschung in der Strömungstechnik. Diplomarbeit, Technische Universität Berlin, Hermann Föttinger–Institut für Strömungstechnik, 1964

  45. H.-P. Schwefel, Numerische Optimierung von Computer-Modellen Mittels der Evolutionsstrategie (Birkhäuser, Basel, 1977)

  46. H.-P. Schwefel, Numerical Optimization of Computer Models (Wiley, Chichester, 1981)

  47. O.M. Shir, Niching in Derandomized Evolution Strategies and its Applications in Quantum Control. PhD thesis, University of Leiden, The Netherlands, 2008

  48. A. Stuart, K. Ord, S. Arnold, Kendall’s Advanced Theory of Statistics, Classical Inference and the Linear Model. Volume 2 in Kendall’s Library of Statistics (Wiley, Chichester, 2009)

  49. Y. Sun, D. Wierstra, T. Schaul, J. Schmidhuber, Efficient natural evolution strategies, in Proceedings of the 11th Annual Conference on Genetic and Evolutionary Computation (GECCO’09), Shanghai, ed. by F. Rothlauf et al. (ACM, New York, 2009), pp. 539–546

  50. Y. Sun, D. Wierstra, T. Schaul, J. Schmidhuber, Stochastic search using the natural gradient, in Proceedings of the 26th Annual International Conference on Machine Learning, ICML’09, Montreal, ed. by A. Pohoreckyj Danyluk et al. (ACM, New York, 2009), pp. 1161–1168

  51. B. Tang, Orthogonal array-based latin hypercubes. J. Am. Stat. Assoc. 88(424), 1392–1397 (1993)

  52. S. Wessing, M. Preuss, G. Rudolph, When parameter tuning actually is parameter control, in Proceedings of the 13th Annual Conference on Genetic and Evolutionary Computation (GECCO’11), Dublin, ed. by N. Krasnogor et al. (ACM, New York, 2011), pp. 821–828

  53. D. Wierstra, T. Schaul, J. Peters, J. Schmidhuber, Natural evolution strategies, in Proceedings of the IEEE Congress on Evolutionary Computation (CEC’08), Hong Kong (IEEE, Piscataway, 2008), pp. 3381–3387

Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Bäck, T., Foussette, C., Krause, P. (2013). Evolution Strategies. In: Contemporary Evolution Strategies. Natural Computing Series. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-40137-4_2

  • DOI: https://doi.org/10.1007/978-3-642-40137-4_2

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-40136-7

  • Online ISBN: 978-3-642-40137-4
