How to Assess Step-Size Adaptation Mechanisms in Randomised Search

  • Nikolaus Hansen
  • Asma Atamna
  • Anne Auger
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8672)


Step-size adaptation is a crucial feature for the performance of randomised search algorithms such as evolution strategies. Depending on the situation, the adaptation must sustain a large diversity or allow fast convergence to the desired optimum. Assessing step-size adaptation mechanisms is therefore non-trivial, and it is often done in overly restricted scenarios, possibly only on the sphere function. This paper introduces a (minimal) methodology, combined with a practical procedure, for a more thorough assessment of the overall population diversity of a randomised search algorithm in different scenarios. We illustrate the methodology on evolution strategies with σ-self-adaptation, cumulative step-size adaptation and two-point adaptation. For the latter, we introduce a variant that abstains from additional samples by constructing two particular individuals within the given population to decide on the step-size change. We find that results on the sphere function alone can be rather misleading for assessing mechanisms that control overall population diversity. The most striking flaws we observe with self-adaptation: on the linear function, the step-size increments are rather small, and on a moderately conditioned ellipsoid function, the adapted step-size is 20 times smaller than optimal.
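As an illustration of one of the mechanisms compared above, the following is a minimal sketch of cumulative step-size adaptation (CSA) in a (μ/μ_w, λ)-ES with isotropic sampling, run here on the sphere function. The parameter settings follow commonly used CSA defaults and are assumptions for illustration, not the authors' exact implementation or experimental setup.

```python
import numpy as np

def sphere(x):
    """Sphere test function f(x) = sum(x_i^2)."""
    return float(np.sum(x ** 2))

def csa_es(f, x0, sigma0, iterations=300, seed=1):
    """Minimal (mu/mu_w, lambda)-ES with cumulative step-size
    adaptation (CSA): isotropic sampling, no covariance adaptation.
    Constants are common default choices, not the paper's setup."""
    rng = np.random.default_rng(seed)
    n = len(x0)
    lam = 4 + int(3 * np.log(n))          # offspring population size
    mu = lam // 2                         # number of selected parents
    w = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    w /= w.sum()                          # positive recombination weights
    mu_eff = 1.0 / np.sum(w ** 2)         # variance effective selection mass
    c_sigma = (mu_eff + 2) / (n + mu_eff + 5)   # cumulation rate
    d_sigma = 1 + c_sigma                 # damping (simplified form)
    chi_n = np.sqrt(n) * (1 - 1 / (4 * n) + 1 / (21 * n ** 2))  # E||N(0,I)||

    x, sigma = np.array(x0, float), float(sigma0)
    p_sigma = np.zeros(n)                 # cumulation (evolution) path
    for _ in range(iterations):
        z = rng.standard_normal((lam, n))             # isotropic mutations
        fit = np.array([f(x + sigma * zi) for zi in z])
        order = np.argsort(fit)                       # minimisation
        z_w = w @ z[order[:mu]]                       # weighted mean step
        x = x + sigma * z_w
        # accumulate the selected step; compare the path length with
        # its expected length under random (neutral) selection
        p_sigma = ((1 - c_sigma) * p_sigma
                   + np.sqrt(c_sigma * (2 - c_sigma) * mu_eff) * z_w)
        sigma *= np.exp(c_sigma / d_sigma
                        * (np.linalg.norm(p_sigma) / chi_n - 1))
    return x, sigma
```

On the sphere function CSA shrinks σ roughly in proportion to the distance to the optimum; the paper's point is that this benign behaviour on the sphere alone does not certify a mechanism in other scenarios, e.g. on linear or ill-conditioned ellipsoid functions.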







Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Nikolaus Hansen (1)
  • Asma Atamna (1)
  • Anne Auger (1)
  1. Inria, LRI (UMR 8623), University of Paris-Sud (UPSud), France
