Does Comma Selection Help to Cope with Local Optima?

Abstract

One hope when using non-elitism in evolutionary computation is that the ability to abandon the current-best solution aids leaving local optima. To improve our understanding of this mechanism, we perform a rigorous runtime analysis of a basic non-elitist evolutionary algorithm (EA), the \((\mu ,\lambda )\) EA, on the most basic benchmark function with a local optimum, the jump function. We prove that for all reasonable values of the parameters and the problem, the expected runtime of the \((\mu ,\lambda )\) EA is, apart from lower order terms, at least as large as the expected runtime of its elitist counterpart, the \((\mu +\lambda )\) EA (for which we conduct the first runtime analysis on jump functions to allow this comparison). Consequently, the ability of the \((\mu ,\lambda )\) EA to leave local optima to inferior solutions does not lead to a runtime advantage. We complement this lower bound with an upper bound that, for broad ranges of the parameters, is identical to our lower bound apart from lower order terms. This is the first runtime result for a non-elitist algorithm on a multi-modal problem that is tight apart from lower order terms.
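
To make the setting concrete, the following sketch (in Python, with illustrative function names and an illustrative generation cap; it is not the precise formulation used in the proofs) shows the jump benchmark in the standard formulation of [47] together with the non-elitist \((\mu ,\lambda )\) EA; the elitist \((\mu +\lambda )\) EA is obtained by the one-line change noted at the end.

    import random

    # Jump function with gap size k on bit strings of length n, in the
    # formulation of [47]: the fitness is k plus the number of ones, except
    # for strings with between n-k+1 and n-1 ones, whose fitness is the
    # number of zeros.
    def jump(x, k):
        n, ones = len(x), sum(x)
        if ones <= n - k or ones == n:
            return k + ones
        return n - ones

    # Standard bit mutation: flip each bit independently with rate 1/n.
    def mutate(x):
        n = len(x)
        return [b ^ 1 if random.random() < 1.0 / n else b for b in x]

    # Non-elitist (mu, lambda) EA (assumes lam >= mu): each generation creates
    # lam offspring from uniformly chosen parents and keeps only the mu best
    # offspring; all parents, including a current-best solution, are discarded.
    def comma_ea(n, k, mu, lam, max_gens=100_000):
        pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(mu)]
        for gen in range(1, max_gens + 1):
            offspring = [mutate(random.choice(pop)) for _ in range(lam)]
            pop = sorted(offspring, key=lambda y: jump(y, k), reverse=True)[:mu]
            if any(sum(y) == n for y in pop):
                return gen  # number of generations until the optimum is found
        return None

    # The elitist (mu + lambda) EA differs only in the selection step:
    #     pop = sorted(pop + offspring, key=lambda y: jump(y, k), reverse=True)[:mu]

The only difference between the two algorithms is whether the parent population takes part in the selection step; the result stated above is that giving up this elitism does not reduce the expected runtime on jump functions.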

Notes

  1. We use the term precise to denote runtime estimates that are asymptotically tight including the leading constant, that is, where the estimated runtime \({\tilde{T}}(n)\) and the true runtime T(n) for problem size n satisfy \(\lim \limits _{n \rightarrow \infty } {\tilde{T}}(n) / T(n) = 1\).

  2. To ease the presentation, we only consider the standard mutation rate 1/n, but we are confident that our results hold in an analogous fashion for general mutation rates \(\chi /n\) with \(\chi \) a constant (a sketch of standard bit mutation with such a general rate follows these notes). Previous works have shown that the constant \(\chi \) influences (again by constant factors) where the boundary between the “imitating elitism” and “no efficient progress” regimes is located. Since our result is that the \({(\mu ,\lambda )}~\mathrm{EA}\) does not beat the \((\mu +\lambda )~\mathrm{EA}\) for any realistic parameter setting, we do not expect a constant-factor change of the mutation rate to lead to substantially different findings.

  3. More precisely, Cn could be replaced by \(t_0\) from (6).

  4. This argument (ignoring, however, the initial search points) was already made in [15] to show this runtime bound for the \((1 + 1)~\mathrm{EA}\).
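
The general mutation rate mentioned in Note 2 corresponds to the following minor variant of standard bit mutation (a sketch with an illustrative function name; \(\chi = 1\) recovers the rate 1/n used throughout the paper).

    import random

    # Standard bit mutation with a general rate chi/n, chi a constant;
    # chi = 1 gives the rate 1/n used in the analysis above.
    def mutate_general(x, chi=1.0):
        n = len(x)
        return [b ^ 1 if random.random() < chi / n else b for b in x]

    # Example: mutate an all-zeros string of length 20 with rate 2/n.
    print(mutate_general([0] * 20, chi=2.0))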

References

  1. Antipov, D., Buzdalov, M., Doerr, B.: Fast mutation in crossover-based algorithms. In: Genetic and Evolutionary Computation Conference, GECCO 2020, pp. 1268–1276. ACM (2020)

  2. Auger, A., Doerr, B. (eds.): Theory of Randomized Search Heuristics. World Scientific Publishing, Singapore (2011)

  3. Antipov, D., Doerr, B.: Precise runtime analysis for plateaus. In: Parallel Problem Solving From Nature, PPSN 2018, Part II, pp. 117–128. Springer (2018)

  4. Antipov, D., Doerr, B., Fang, J., Hetet, T.: Runtime analysis for the \({(\mu +\lambda )}\) EA optimizing OneMax. In: Genetic and Evolutionary Computation Conference, GECCO 2018, pp. 1459–1466. ACM (2018)

  5. Antipov, D., Doerr, B., Karavaev, V.: The \((1 + (\lambda ,\lambda ))\) GA is even faster on multimodal problems. In: Genetic and Evolutionary Computation Conference, GECCO 2020, pp. 1259–1267. ACM (2020)

  6. Antipov, D., Doerr, B., Yang, Q.: The efficiency threshold for the offspring population size of the \({(\mu ,\lambda )}\) EA. In: Genetic and Evolutionary Computation Conference, GECCO 2019, pp. 1461–1469. ACM (2019)

  7. Alanazi, F., Lehre, P.K.: Runtime analysis of selection hyper-heuristics with classical learning mechanisms. In: Congress on Evolutionary Computation, CEC 2014, pp. 2515–2523. IEEE (2014)

  8. Böttcher, S., Doerr, B., Neumann, F.: Optimal fixed and adaptive mutation rates for the LeadingOnes problem. In: Parallel Problem Solving from Nature, PPSN 2010, pp. 1–10. Springer (2010)

  9. Corus, D., Dang, D.-C., Eremeev, A.V., Lehre, P.K.: Level-based analysis of genetic algorithms and other search processes. IEEE Trans. Evol. Comput. 22, 707–719 (2018)

  10. Corus, D., Oliveto, P.S., Yazdani, D.: On the runtime analysis of the Opt-IA artificial immune system. In: Genetic and Evolutionary Computation Conference, GECCO 2017, pp. 83–90. ACM (2017)

  11. Corus, D., Oliveto, P.S., Yazdani, D.: Fast artificial immune systems. In: Parallel Problem Solving from Nature, PPSN 2018, Part II, pp. 67–78. Springer (2018)

  12. Dang, D.-C., Friedrich, T., Kötzing, T., Krejca, M.S., Lehre, P.K., Oliveto, P.S., Sudholt, D., Sutton, A.M.: Escaping local optima with diversity mechanisms and crossover. In: Genetic and Evolutionary Computation Conference, GECCO 2016, pp. 645–652. ACM (2016)

  13. Dang, D.-C., Friedrich, T., Kötzing, T., Krejca, M.S., Lehre, P.K., Oliveto, P.S., Sudholt, D., Sutton, A.M.: Escaping local optima using crossover with emergent diversity. IEEE Trans. Evol. Comput. 22, 484–497 (2018)

  14. Doerr, B., Goldberg, L.A.: Adaptive drift analysis. Algorithmica 65, 224–250 (2013)

  15. Droste, S., Jansen, T., Wegener, I.: On the analysis of the (1+1) evolutionary algorithm. Theoret. Comput. Sci. 276, 51–81 (2002)

  16. Doerr, B., Künnemann, M.: Optimizing linear functions with the \((1+\lambda )\) evolutionary algorithm - different asymptotic runtimes for different instances. Theoret. Comput. Sci. 561, 3–23 (2015)

  17. Doerr, B., Kötzing, T.: Multiplicative up-drift. Algorithmica (2021). https://doi.org/10.1007/s00453-020-00775-7

  18. Dang, D.-C., Lehre, P.K.: Runtime analysis of non-elitist populations: from classical optimisation to partial information. Algorithmica 75, 428–461 (2016)

  19. Dang, D.-C., Lehre, P.K.: Self-adaptation of mutation rates in non-elitist populations. In: Parallel Problem Solving from Nature, PPSN 2016, pp. 803–813. Springer (2016)

  20. Doerr, B., Le, H.P., Makhmara, R., Nguyen, T.D.: Fast genetic algorithms. In: Genetic and Evolutionary Computation Conference, GECCO 2017, pp. 777–784. ACM (2017)

  21. Dang, D.-C., Lehre, P.K., Nguyen, P.T.H.: Level-based analysis of the univariate marginal distribution algorithm. Algorithmica 81, 668–702 (2019)

  22. Doerr, B., Lissovoi, A., Oliveto, P.S., Warwicker, J.A.: On the runtime analysis of selection hyper-heuristics with adaptive learning periods. In: Genetic and Evolutionary Computation Conference, GECCO 2018, pp. 1015–1022. ACM (2018)

  23. Doerr, B., Neumann, F. (eds.): Theory of Evolutionary Computation: Recent Developments in Discrete Optimization. Springer (2020). https://cs.adelaide.edu.au/~frank/papers/TheoryBook2019-selfarchived.pdf

  24. Doerr, B.: Analyzing randomized search heuristics via stochastic domination. Theoret. Comput. Sci. 773, 115–137 (2019)

  25. Doerr, B.: An exponential lower bound for the runtime of the compact genetic algorithm on jump functions. In: Foundations of genetic algorithms, FOGA 2019, pp. 25–33. ACM (2019)

  26. Doerr, B.: A tight runtime analysis for the cGA on jump functions: EDAs can cross fitness valleys at no extra cost. In: Genetic and Evolutionary Computation Conference, GECCO 2019, pp. 1488–1496. ACM (2019)

  27. Doerr, B.: Does comma selection help to cope with local optima? In: Genetic and Evolutionary Computation Conference, GECCO 2020, pp. 1304–1313. ACM (2020)

  28. Doerr, B.: Lower bounds for non-elitist evolutionary algorithms via negative multiplicative drift. In: Parallel Problem Solving From Nature, PPSN 2020, Part II, pp. 604–618. Springer (2020)

  29. Doerr, B.: Probabilistic tools for the analysis of randomized optimization heuristics. In: Doerr, B., Neumann, F. (eds.), Theory of Evolutionary Computation: Recent Developments in Discrete Optimization, pp. 1–87. Springer (2020). arXiv:1801.06733

  30. Doerr, B., Witt, C., Yang, J.: Runtime analysis for self-adaptive mutation rates. Algorithmica 83, 1012–1053 (2021)

  31. Eremeev, A.V.: Modeling and analysis of genetic algorithm with tournament selection. In: Artificial Evolution, AE 1999, pp. 84–95. Springer (1999)

  32. Friedrich, T., Göbel, A., Quinzan, F., Wagner, M.: Evolutionary algorithms and submodular functions: benefits of heavy-tailed mutations. CoRR, arXiv:1805.10902 (2018)

  33. Friedrich, T., Göbel, A., Quinzan, F., Wagner, M.: Heavy-tailed mutation operators in single-objective combinatorial optimization. In: Parallel Problem Solving from Nature, PPSN 2018, Part I, pp. 134–145. Springer (2018)

  34. Friedrich, T., Kötzing, T., Krejca, M.S., Nallaperuma, S., Neumann, F., Schirneck, M.: Fast building block assembly by majority vote crossover. In: Genetic and Evolutionary Computation Conference, GECCO 2016, pp. 661–668. ACM (2016)

  35. Friedrich, T., Quinzan, F., Wagner, M.: Escaping large deceptive basins of attraction with heavy-tailed mutation operators. In: Genetic and Evolutionary Computation Conference, GECCO 2018, pp. 293–300. ACM (2018)

  36. Garnier, J., Kallel, L., Schoenauer, M.: Rigorous hitting times for binary mutations. Evol. Comput. 7, 173–203 (1999)

  37. Gießen, C., Witt, C.: The interplay of population size and mutation probability in the \({(1 + \lambda )}\) EA on OneMax. Algorithmica 78, 587–609 (2017)

  38. Hajek, B.: Hitting-time and occupation-time bounds implied by drift analysis with applications. Adv. Appl. Probab. 13, 502–525 (1982)

  39. Happ, E., Johannsen, D., Klein, C., Neumann, F.: Rigorous analyses of fitness-proportional selection for optimizing linear functions. In: Genetic and Evolutionary Computation Conference, GECCO 2008, pp. 953–960. ACM (2008)

  40. Hoeffding, W.: Probability inequalities for sums of bounded random variables. J. Am. Stat. Assoc. 58, 13–30 (1963)

  41. Hasenöhrl, V., Sutton, A.M.: On the runtime dynamics of the compact genetic algorithm on jump functions. In: Genetic and Evolutionary Computation Conference, GECCO 2018, pp. 967–974. ACM (2018)

  42. He, J., Yao, X.: Drift analysis and average time complexity of evolutionary algorithms. Artif. Intell. 127, 51–81 (2001)

  43. Jansen, T.: A comparison of simulated annealing with a simple evolutionary algorithm. In: Foundations of Genetic Algorithms, FOGA 2005, pp. 37–57. Springer (2005)

  44. Jansen, T.: Analyzing Evolutionary Algorithms - The Computer Science Perspective. Springer, Berlin (2013)

  45. Jansen, T., De Jong, K.A., Wegener, I.: On the choice of the offspring population size in evolutionary algorithms. Evol. Comput. 13, 413–440 (2005)

  46. Jägersküpper, J., Storch, T.: When the plus strategy outperforms the comma strategy and when not. In: Foundations of Computational Intelligence, FOCI 2007, pp. 25–32. IEEE (2007)

  47. Jansen, T., Wegener, I.: The analysis of evolutionary algorithms - a proof that crossover really can help. Algorithmica 34, 47–66 (2002)

  48. Jansen, T., Wegener, I.: A comparison of simulated annealing with a simple evolutionary algorithm on pseudo-Boolean functions of unitation. Theoret. Comput. Sci. 386, 73–93 (2007)

  49. Kötzing, T.: Concentration of first hitting times under additive drift. Algorithmica 75, 490–506 (2016)

  50. Lehre, P.K.: Negative drift in populations. In: Parallel Problem Solving from Nature, PPSN 2010, pp. 244–253. Springer (2010)

  51. Lehre, P.K.: Fitness-levels for non-elitist populations. In: Genetic and Evolutionary Computation Conference, GECCO 2011, pp. 2075–2082. ACM (2011)

  52. Lengler, J.: Drift analysis. In: Doerr, B., Neumann, F. (eds.), Theory of Evolutionary Computation: Recent Developments in Discrete Optimization, pp. 89–131. Springer (2020). arXiv:1712.00964

  53. Lissovoi, A., Oliveto, P.S., Warwicker, J.A.: On the runtime analysis of generalised selection hyper-heuristics for pseudo-Boolean optimisation. In: Genetic and Evolutionary Computation Conference, GECCO 2017, pp. 849–856. ACM (2017)

  54. Lissovoi, A., Oliveto, P.S., Warwicker, J.A.: On the time complexity of algorithm selection hyper-heuristics for multimodal optimisation. In: Conference on Artificial Intelligence, AAAI 2019, pp. 2322–2329. AAAI Press (2019)

  55. Lengler, J., Steger, A.: Drift analysis and evolutionary algorithms revisited. Combin. Probab. Comput. 27, 643–666 (2018)

  56. Neumann, F., Oliveto, P.S., Witt, C.: Theoretical analysis of fitness-proportional selection: landscapes and efficiency. In: Genetic and Evolutionary Computation Conference, GECCO 2009, pp. 835–842. ACM (2009)

  57. Nguyen, P.T.H., Sudholt, D.: Memetic algorithms outperform evolutionary algorithms in multimodal optimisation. Artif. Intell. 287, 103345 (2020)

  58. Neumann, F., Witt, C.: Bioinspired Computation in Combinatorial Optimization – Algorithms and Their Computational Complexity. Springer (2010)

  59. Oliveto, P.S., Paixão, T., Heredia, J.P., Sudholt, D., Trubenová, B.: How to escape local optima in black box optimisation: when non-elitism outperforms elitism. Algorithmica 80, 1604–1633 (2018)

  60. Oliveto, P.S., Witt, C.: Simplified drift analysis for proving lower bounds in evolutionary computation. Algorithmica 59, 369–386 (2011)

  61. Oliveto, P.S., Witt, C.: Erratum: simplified drift analysis for proving lower bounds in evolutionary computation. CoRR, arXiv:1211.7184 (2012)

  62. Oliveto, P.S., Witt, C.: Improved time complexity analysis of the simple genetic algorithm. Theoret. Comput. Sci. 605, 21–41 (2015)

  63. Paixão, T., Heredia, J. P., Sudholt, D., Trubenová, B.: Towards a runtime comparison of natural and artificial evolution. Algorithmica 78, 681–713 (2017)

  64. Rowe, J.E., Aishwaryaprajna: The benefits and limitations of voting mechanisms in evolutionary optimisation. In: Foundations of Genetic Algorithms, FOGA 2019, pp. 34–42. ACM (2019)

  65. Rowe, J.E., Sudholt, D.: The choice of the offspring population size in the (1, \(\lambda \)) evolutionary algorithm. Theoret. Comput. Sci. 545, 20–38 (2014)

  66. Sudholt, D.: The impact of parametrization in memetic evolutionary algorithms. Theoret. Comput. Sci. 410, 2511–2528 (2009)

  67. Sudholt, D.: A new method for lower bounds on the running time of evolutionary algorithms. IEEE Trans. Evol. Comput. 17, 418–435 (2013)

  68. Wegener, I.: Simulated annealing beats Metropolis in combinatorial optimization. In: Automata, Languages and Programming, ICALP 2005, pp. 589–601. Springer (2005)

  69. Witt, C.: Runtime analysis of the (\(\mu \) + 1) EA on simple pseudo-Boolean functions. Evol. Comput. 14, 65–86 (2006)

  70. Witt, C.: Tight bounds on the optimization time of a randomized search heuristic on linear functions. Combin. Probab. Comput. 22, 294–318 (2013)

  71. Witt, C.: Upper bounds on the running time of the univariate marginal distribution algorithm on OneMax. Algorithmica 81, 632–667 (2019)

  72. Wu, M., Qian, C., Tang, K.: Dynamic mutation based Pareto optimization for subset selection. In: Intelligent Computing Methodologies, ICIC 2018, Part III, pp. 25–35. Springer (2018)

  73. Whitley, D., Varadarajan, S., Hirsch, R., Mukhopadhyay, A.: Exploration and exploitation without mutation: solving the jump function in \({\Theta (n)}\) time. In: Parallel Problem Solving from Nature, PPSN 2018, Part II, pp. 55–66. Springer (2018)

  74. Wang, S., Zheng, W., Doerr, B.: Choosing the right algorithm with hints from complexity theory. In: International Joint Conference on Artificial Intelligence, IJCAI 2021, pp. 1697–1703. ijcai.org (2021)

Acknowledgements

This work was supported by a public grant as part of the Investissements d’avenir project, reference ANR-11-LABX-0056-LMH, LabEx LMH.

Author information

Corresponding author

Correspondence to Benjamin Doerr.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This is the full version of a paper [27] that appeared at GECCO 2020. It contains all proofs and additional information that had to be omitted from the conference version for reasons of space.

About this article

Cite this article

Doerr, B. Does Comma Selection Help to Cope with Local Optima? Algorithmica 84, 1659–1693 (2022). https://doi.org/10.1007/s00453-021-00896-7
