
Algorithmica, Volume 80, Issue 5, pp. 1658–1709

Optimal Static and Self-Adjusting Parameter Choices for the \((1+(\lambda ,\lambda ))\) Genetic Algorithm

  • Benjamin Doerr
  • Carola Doerr
Article
Part of the topical collection: Special Issue on Genetic and Evolutionary Computation

Abstract

The \((1+(\lambda ,\lambda ))\) genetic algorithm proposed in Doerr et al. (Theor Comput Sci 567:87–104, 2015) is one of the few examples for which a super-constant speed-up of the expected optimization time through the use of crossover could be rigorously demonstrated. It was proven that the expected optimization time of this algorithm on OneMax is \(O(\max \{n \log (n) / \lambda , \lambda n\})\) for any offspring population size \(\lambda \in \{1,\ldots ,n\}\) (and the other parameters suitably dependent on \(\lambda \)) and it was shown experimentally that a self-adjusting choice of \(\lambda \) leads to a better, most likely linear, runtime. In this work, we study more precisely how the optimization time depends on the parameter choices, leading to the following results on how to optimally choose the population size, the mutation probability, and the crossover bias both in a static and a dynamic fashion. For the mutation probability and the crossover bias depending on \(\lambda \) as in Doerr et al. (2015), we improve the previous runtime bound to \(O(\max \{n \log (n) / \lambda , n \lambda \log \log (\lambda )/\log (\lambda )\})\). This expression is minimized by a value of \(\lambda \) slightly larger than what the previous result suggested and gives an expected optimization time of \(O\left( n \sqrt{\log (n) \log \log \log (n) / \log \log (n)}\right) \). We show that no static choice in the three-dimensional parameter space of offspring population, mutation probability, and crossover bias gives an asymptotically better runtime. We also prove that the self-adjusting parameter choice suggested in Doerr et al. (2015) outperforms all static choices and yields the conjectured linear expected runtime. This is asymptotically optimal among all possible parameter choices.
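The algorithm and the self-adjusting parameter choice discussed in the abstract can be sketched in code. The following is a minimal illustrative implementation of the \((1+(\lambda ,\lambda ))\) GA on OneMax with a one-fifth-style update of \(\lambda \); the function names, the update factor F = 1.5, and the fixed random seed are illustrative choices of this sketch, not taken from the paper:

```python
import random

def one_max(x):
    """OneMax fitness: the number of one-bits in the bit string."""
    return sum(x)

def one_plus_ll_ga(n, F=1.5, seed=0):
    """Sketch of the (1+(lambda,lambda)) GA on OneMax with a
    one-fifth-style self-adjusting offspring population size.
    Returns the number of fitness evaluations until the optimum."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    lam, evals = 1.0, 0
    while one_max(x) < n:
        k = max(1, round(lam))     # integer offspring population size
        p, c = k / n, 1.0 / k      # mutation prob. p = lambda/n, crossover bias c = 1/lambda
        # Mutation phase: draw ell ~ Bin(n, p) once, then flip exactly
        # ell random bits in each of the k mutants; keep the fittest.
        ell = sum(rng.random() < p for _ in range(n))
        best_mut = x
        for _ in range(k):
            y = x[:]
            for i in rng.sample(range(n), ell):
                y[i] = 1 - y[i]
            evals += 1
            if one_max(y) >= one_max(best_mut):
                best_mut = y
        # Crossover phase: k biased uniform crossovers between x and the
        # best mutant, taking each bit from the mutant with probability c.
        best_cross = x
        for _ in range(k):
            z = [m if rng.random() < c else xi
                 for xi, m in zip(x, best_mut)]
            evals += 1
            if one_max(z) >= one_max(best_cross):
                best_cross = z
        # Elitist selection plus the one-fifth-style update of lambda:
        # shrink lambda after a strict improvement, grow it otherwise.
        if one_max(best_cross) > one_max(x):
            x, lam = best_cross, max(lam / F, 1.0)
        else:
            if one_max(best_cross) == one_max(x):
                x = best_cross
            lam = min(lam * F ** 0.25, n)
    return evals
```

On moderate problem sizes this sketch reaches the optimum quickly, and tracking `lam` over the run shows the adaptation at work: \(\lambda \) stays small while improvements are easy and grows as they become rare.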

Keywords

Theory of randomized search heuristics · Runtime analysis · Genetic algorithms · Parameter choice · Parameter control

Notes

Acknowledgements

This work was supported by a public grant as part of the Investissement d'avenir project, reference ANR-11-LABX-0056-LMH, LabEx LMH, in a joint call with the Programme Gaspard Monge en Optimisation et Recherche Opérationnelle. It was also supported by a second project within the "FMJH Program Gaspard Monge in optimization and operations research" and by the support given to this program by EDF (Électricité de France).

References

  1. Anil, G., Wiegand, R.P.: Black-box search by elimination of fitness functions. In: Proceedings of Foundations of Genetic Algorithms (FOGA'09), pp. 67–78. ACM (2009)
  2. Auger, A.: Benchmarking the (1+1) evolution strategy with one-fifth success rule on the BBOB-2009 function testbed. In: Proceedings of Genetic and Evolutionary Computation Conference (GECCO'09) (Companion), pp. 2447–2452. ACM (2009)
  3. Auger, A., Hansen, N.: Linear convergence on positively homogeneous functions of a comparison based step-size adaptive randomized search: the (1+1) ES with generalized one-fifth success rule (2013). http://arxiv.org/abs/1310.8397
  4. Badkobeh, G., Lehre, P.K., Sudholt, D.: Unbiased black-box complexity of parallel search. In: Proceedings of Parallel Problem Solving from Nature (PPSN'14), Lecture Notes in Computer Science, vol. 8672, pp. 892–901. Springer (2014)
  5. Böttcher, S., Doerr, B., Neumann, F.: Optimal fixed and adaptive mutation rates for the LeadingOnes problem. In: Proceedings of Parallel Problem Solving from Nature (PPSN'10), Lecture Notes in Computer Science, vol. 6238, pp. 1–10. Springer (2010)
  6. Dang, D., Friedrich, T., Kötzing, T., Krejca, M.S., Lehre, P.K., Oliveto, P.S., Sudholt, D., Sutton, A.M.: Emergence of diversity and its benefits for crossover in genetic algorithms. In: Proceedings of Parallel Problem Solving from Nature (PPSN'16), Lecture Notes in Computer Science, vol. 9921, pp. 890–900. Springer (2016)
  7. Dang, D., Friedrich, T., Kötzing, T., Krejca, M.S., Lehre, P.K., Oliveto, P.S., Sudholt, D., Sutton, A.M.: Escaping local optima with diversity mechanisms and crossover. In: Proceedings of Genetic and Evolutionary Computation Conference (GECCO'16), pp. 645–652. ACM (2016)
  8. Dang, D.C., Lehre, P.K.: Self-adaptation of mutation rates in non-elitist populations. In: Proceedings of Parallel Problem Solving from Nature (PPSN'16), Lecture Notes in Computer Science, vol. 9921, pp. 803–813. Springer (2016)
  9. De Jong, K.A.: An analysis of the behavior of a class of genetic adaptive systems. Ph.D. thesis, Ann Arbor, MI, USA (1975)
  10. Devroye, L.: The compound random search. Ph.D. thesis, Purdue University, West Lafayette, IN (1972)
  11. Doerr, B.: Analyzing randomized search heuristics: tools from probability theory. In: Auger, A., Doerr, B. (eds.) Theory of Randomized Search Heuristics, pp. 1–20. World Scientific Publishing (2011). http://www.worldscientific.com/doi/suppl/10.1142/7438/suppl_file/7438_chap01.pdf
  12. Doerr, B.: Optimal parameter settings for the \((1+(\lambda , \lambda ))\) genetic algorithm. In: Proceedings of Genetic and Evolutionary Computation Conference (GECCO'16), pp. 1107–1114. ACM (2016)
  13. Doerr, B., Doerr, C.: Optimal parameter choices through self-adjustment: applying the 1/5-th rule in discrete settings. In: Proceedings of Genetic and Evolutionary Computation Conference (GECCO'15), pp. 1335–1342. ACM (2015)
  14. Doerr, B., Doerr, C.: A tight runtime analysis of the \((1+(\lambda ,\lambda ))\) genetic algorithm on OneMax. In: Proceedings of Genetic and Evolutionary Computation Conference (GECCO'15), pp. 1423–1430. ACM (2015)
  15. Doerr, B., Doerr, C., Ebel, F.: Lessons from the black-box: fast crossover-based genetic algorithms. In: Proceedings of Genetic and Evolutionary Computation Conference (GECCO'13), pp. 781–788. ACM (2013)
  16. Doerr, B., Doerr, C., Ebel, F.: From black-box complexity to designing new genetic algorithms. Theor. Comput. Sci. 567, 87–104 (2015)
  17. Doerr, B., Doerr, C., Kötzing, T.: Provably optimal self-adjusting step sizes for multi-valued decision variables. In: Proceedings of Parallel Problem Solving from Nature (PPSN'16), Lecture Notes in Computer Science, vol. 9921, pp. 782–791. Springer (2016)
  18. Doerr, B., Doerr, C., Kötzing, T.: The right mutation strength for multi-valued decision variables. In: Proceedings of Genetic and Evolutionary Computation Conference (GECCO'16), pp. 1115–1122. ACM (2016). http://arxiv.org/abs/1604.03277
  19. Doerr, B., Doerr, C., Yang, J.: \(k\)-bit mutation with self-adjusting \(k\) outperforms standard bit mutation. In: Proceedings of Parallel Problem Solving from Nature (PPSN'16), Lecture Notes in Computer Science, vol. 9921, pp. 824–834. Springer (2016)
  20. Doerr, B., Goldberg, L.A.: Drift analysis with tail bounds. In: Proceedings of Parallel Problem Solving from Nature (PPSN'10), Lecture Notes in Computer Science, vol. 6238, pp. 174–183. Springer (2010)
  21. Doerr, B., Goldberg, L.A.: Adaptive drift analysis. Algorithmica 65, 224–250 (2013)
  22. Doerr, B., Happ, E., Klein, C.: Crossover can provably be useful in evolutionary computation. Theor. Comput. Sci. 425, 17–33 (2012)
  23. Doerr, B., Jansen, T., Sudholt, D., Winzen, C., Zarges, C.: Mutation rate matters even when optimizing monotonic functions. Evol. Comput. 21, 1–27 (2013)
  24. Doerr, B., Johannsen, D., Kötzing, T., Lehre, P.K., Wagner, M., Winzen, C.: Faster black-box algorithms through higher arity operators. In: Proceedings of Foundations of Genetic Algorithms (FOGA'11), pp. 163–172. ACM (2011)
  25. Doerr, B., Johannsen, D., Kötzing, T., Neumann, F., Theile, M.: More effective crossover operators for the all-pairs shortest path problem. Theor. Comput. Sci. 471, 12–26 (2013)
  26. Doerr, B., Johannsen, D., Winzen, C.: Multiplicative drift analysis. Algorithmica 64, 673–697 (2012)
  27. Doerr, B., Künnemann, M.: Optimizing linear functions with the (1+\(\lambda \)) evolutionary algorithm: different asymptotic runtimes for different instances. Theor. Comput. Sci. 561, 3–23 (2015)
  28. Doerr, B., Theile, M.: Improved analysis methods for crossover-based algorithms. In: Proceedings of Genetic and Evolutionary Computation Conference (GECCO'09), pp. 247–254. ACM (2009)
  29. Droste, S., Jansen, T., Wegener, I.: On the analysis of the (1+1) evolutionary algorithm. Theor. Comput. Sci. 276, 51–81 (2002)
  30. Droste, S., Jansen, T., Wegener, I.: Upper and lower bounds for randomized search heuristics in black-box optimization. Theory Comput. Syst. 39, 525–544 (2006)
  31. Eiben, A.E., Hinterding, R., Michalewicz, Z.: Parameter control in evolutionary algorithms. IEEE Trans. Evol. Comput. 3, 124–141 (1999)
  32. Eiben, A.E., Smith, J.E.: Introduction to Evolutionary Computing. Springer, Berlin (2003)
  33. Erdős, P., Rényi, A.: On two problems of information theory. Magyar Tudományos Akadémia Matematikai Kutató Intézet Közleményei 8, 229–243 (1963)
  34. Fischer, S., Wegener, I.: The Ising model on the ring: mutation versus recombination. In: Proceedings of Genetic and Evolutionary Computation Conference (GECCO'04), Lecture Notes in Computer Science, vol. 3102, pp. 1113–1124. Springer (2004)
  35. Gießen, C., Witt, C.: Population size vs. mutation strength for the (1+\(\lambda \)) EA on OneMax. In: Proceedings of Genetic and Evolutionary Computation Conference (GECCO'15), pp. 1439–1446. ACM (2015)
  36. Goldman, B.W., Punch, W.F.: Parameter-less population pyramid. In: Proceedings of Genetic and Evolutionary Computation Conference (GECCO'14), pp. 785–792. ACM (2014)
  37. Hansen, N., Gawelczyk, A., Ostermeier, A.: Sizing the population with respect to the local progress in (1,\(\lambda \))-evolution strategies: a theoretical analysis. In: Proceedings of the IEEE Congress on Evolutionary Computation (CEC'95), pp. 80–85. IEEE (1995)
  38. He, J., Yao, X.: Drift analysis and average time complexity of evolutionary algorithms. Artif. Intell. 127, 57–85 (2001)
  39. Jägersküpper, J.: Rigorous runtime analysis of the (1+1) ES: 1/5-rule and ellipsoidal fitness landscapes. In: Proceedings of Foundations of Genetic Algorithms (FOGA'05), Lecture Notes in Computer Science, vol. 3469, pp. 260–281. Springer (2005)
  40. Jansen, T.: Analyzing Evolutionary Algorithms: The Computer Science Perspective. Springer, Berlin (2013)
  41. Jansen, T., De Jong, K.A., Wegener, I.: On the choice of the offspring population size in evolutionary algorithms. Evol. Comput. 13, 413–440 (2005)
  42. Jansen, T., Wegener, I.: The analysis of evolutionary algorithms: a proof that crossover really can help. Algorithmica 34, 47–66 (2002)
  43. Jansen, T., Wegener, I.: On the analysis of a dynamic evolutionary algorithm. J. Discrete Algorithms 4, 181–199 (2006)
  44. Karafotias, G., Hoogendoorn, M., Eiben, A.: Parameter control in evolutionary algorithms: trends and challenges. IEEE Trans. Evol. Comput. 19, 167–187 (2015)
  45. Kern, S., Müller, S.D., Hansen, N., Büche, D., Ocenasek, J., Koumoutsakos, P.: Learning probability distributions in continuous evolutionary algorithms: a comparative review. Nat. Comput. 3, 77–112 (2004)
  46. Kötzing, T.: Concentration of first hitting times under additive drift. In: Proceedings of Genetic and Evolutionary Computation Conference (GECCO'14), pp. 1391–1398. ACM (2014)
  47. Lässig, J., Sudholt, D.: Adaptive population models for offspring populations and parallel evolutionary algorithms. In: Proceedings of Foundations of Genetic Algorithms (FOGA'11), pp. 181–192. ACM (2011)
  48. Lehre, P.K., Witt, C.: Black-box search by unbiased variation. Algorithmica 64, 623–642 (2012)
  49. Mironovich, V., Buzdalov, M.: Hard test generation for maximum flow algorithms with the fast crossover-based evolutionary algorithm. In: Proceedings of Genetic and Evolutionary Computation Conference (GECCO'15) (Companion), pp. 1229–1232. ACM (2015)
  50. Mitchell, M., Holland, J.H., Forrest, S.: When will a genetic algorithm outperform hill climbing? In: Proceedings of the 7th Neural Information Processing Systems Conference (NIPS'93), Advances in Neural Information Processing Systems, vol. 6, pp. 51–58. Morgan Kaufmann (1993)
  51. Oliveto, P.S., Lehre, P.K., Neumann, F.: Theoretical analysis of rank-based mutation: combining exploration and exploitation. In: Proceedings of Congress on Evolutionary Computation (CEC'09), pp. 1455–1462. IEEE (2009)
  52. Oliveto, P.S., Yao, X.: Runtime analysis of evolutionary algorithms for discrete optimization. In: Auger, A., Doerr, B. (eds.) Theory of Randomized Search Heuristics, pp. 21–52. World Scientific Publishing, Singapore (2011)
  53. Raab, M., Steger, A.: "Balls into bins": a simple and tight analysis. In: Proceedings of Randomization and Approximation Techniques in Computer Science (RANDOM'98), Lecture Notes in Computer Science, vol. 1518, pp. 159–170. Springer (1998)
  54. Rechenberg, I.: Evolutionsstrategie. Friedrich Fromman Verlag (Günther Holzboog KG), Stuttgart (1973)
  55. Schumer, M.A., Steiglitz, K.: Adaptive step size random search. IEEE Trans. Autom. Control 13, 270–276 (1968)
  56. Sudholt, D.: Crossover is provably essential for the Ising model on trees. In: Proceedings of Genetic and Evolutionary Computation Conference (GECCO'05), pp. 1161–1167. ACM (2005)
  57. Sudholt, D.: Crossover speeds up building-block assembly. In: Proceedings of Genetic and Evolutionary Computation Conference (GECCO'12), pp. 689–702. ACM (2012)
  58. Wegener, I.: Theoretical aspects of evolutionary algorithms. In: Orejas, F., Spirakis, P.G., van Leeuwen, J. (eds.) Proceedings of the 28th International Colloquium on Automata, Languages and Programming (ICALP'01), Lecture Notes in Computer Science, vol. 2076, pp. 64–78. Springer (2001)
  59. Wegener, I.: Methods for the analysis of evolutionary algorithms on pseudo-Boolean functions. In: Sarker, R., Mohammadian, M., Yao, X. (eds.) Evolutionary Optimization, pp. 349–369. Kluwer, Berlin (2002)
  60. Wegener, I.: Simulated annealing beats Metropolis in combinatorial optimization. In: Proceedings of International Colloquium on Automata, Languages and Programming (ICALP'05), Lecture Notes in Computer Science, vol. 3580, pp. 589–601. Springer (2005)
  61. Witt, C.: Tight bounds on the optimization time of a randomized search heuristic on linear functions. Comb. Probab. Comput. 22, 294–318 (2013)
  62. Witt, C.: Fitness levels with tail bounds for the analysis of randomized search heuristics. Inf. Process. Lett. 114, 38–41 (2014)
  63. Zarges, C.: Rigorous runtime analysis of inversely fitness proportional mutation rates. In: Proceedings of Parallel Problem Solving from Nature (PPSN'08), Lecture Notes in Computer Science, vol. 5199, pp. 112–122. Springer (2008)
  64. Zarges, C.: On the utility of the population size for inversely fitness proportional mutation rates. In: Proceedings of Foundations of Genetic Algorithms (FOGA'09), pp. 39–46. ACM (2009)

Copyright information

© Springer Science+Business Media, LLC 2017

Authors and Affiliations

  1. École Polytechnique, LIX, UMR 7161, Palaiseau, France
  2. Sorbonne Universités, UPMC Univ Paris 06, CNRS, LIP6 UMR 7606, Paris, France
