
An Extended Jump Functions Benchmark for the Analysis of Randomized Search Heuristics

Published in: Algorithmica

Abstract

Jump functions are the most-studied non-unimodal benchmark in the theory of randomized search heuristics, in particular, evolutionary algorithms (EAs). They have significantly improved our understanding of how EAs escape from local optima. However, their particular structure—to leave the local optimum one can only jump directly to the global optimum—raises the question of how representative such results are. For this reason, we propose an extended class \(\textsc {Jump}_{k,\delta }\) of jump functions that contain a valley of low fitness of width \(\delta \) starting at distance k from the global optimum. We prove that several previous results extend to this more general class: for all \(k \le \frac{n^{1/3}}{\ln {n}}\) and \(\delta < k\), the optimal mutation rate for the \((1+1)\) EA is \(\frac{\delta }{n}\), and the fast \((1+1)\) EA runs faster than the classical \((1+1)\) EA by a factor super-exponential in \(\delta \). However, we also observe that some known results do not generalize: the randomized local search algorithm with stagnation detection, which is faster than the fast \((1+1)\) EA by a factor polynomial in k on \(\textsc {Jump}_k\), is slower by a factor polynomial in n on some \(\textsc {Jump}_{k,\delta }\) instances. Computationally, the new class allows experiments with wider fitness valleys, especially when they lie further away from the global optimum.
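The benchmark class described above can be sketched in a few lines. The following Python fitness function is an illustrative assumption based only on the abstract's description, not the paper's exact definition (the offset constants may differ): outside the valley it behaves like OneMax shifted by k, and for numbers of one-bits strictly between n-k and n-k+δ the fitness drops, so that from the local optimum with n-k one-bits the shortest improving jump flips exactly δ bits, matching the stated optimal mutation rate δ/n.

```python
def extended_jump(x, k, delta):
    """Sketch of the extended jump function Jump_{k,delta} described in the
    abstract. The exact offset constants are an assumption here, chosen so
    that delta = k recovers the classical Jump_k function.

    x     : bit string (sequence of 0s and 1s)
    k     : distance of the valley's start from the global optimum 1^n
    delta : width of the low-fitness valley, delta <= k
    """
    n = len(x)
    ones = sum(x)
    if ones <= n - k or ones >= n - k + delta:
        # Outside the valley: OneMax-like, shifted by k so that every
        # valley point has lower fitness than the local optimum.
        return k + ones
    # Inside the valley: fitness decreases in the number of one-bits,
    # creating the region the search heuristic must jump across.
    return n - ones
```

For example, with n = 10, k = 4, δ = 2 this sketch gives fitness 10 at the local optimum (six one-bits), 3 in the valley (seven one-bits), and the global maximum 14 at the all-ones string; setting δ = k recovers a classical jump function with gap k.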

Figs. 1–7 (figures available in the full article)

References

  1. Antipov, D., Buzdalov, M., Doerr, B.: Fast mutation in crossover-based algorithms. In: Genetic and Evolutionary Computation Conference, GECCO 2020, pp. 1268–1276. ACM (2020)

  2. Antipov, D., Buzdalov, M., Doerr, B.: First steps towards a runtime analysis when starting with a good solution. In: Parallel Problem Solving From Nature, PPSN 2020, Part II, pp. 560–573. Springer (2020)

  3. Antipov, D., Buzdalov, M., Doerr, B.: Lazy parameter tuning and control: choosing all parameters randomly from a power-law distribution. In: Genetic and Evolutionary Computation Conference, GECCO 2021, pp. 1115–1123. ACM (2021)

  4. Auger, A., Doerr, B. (eds.): Theory of Randomized Search Heuristics. World Scientific Publishing (2011)

  5. Antipov, D., Doerr, B.: Runtime analysis of a heavy-tailed \((1+(\lambda , \lambda ))\) genetic algorithm on jump functions. In: Parallel Problem Solving From Nature, PPSN 2020, Part II, pp. 545–559. Springer (2020)

  6. Antipov, D., Doerr, B., Karavaev, V.: The \((1 + (\lambda ,\lambda ))\) GA is even faster on multimodal problems. In: Genetic and Evolutionary Computation Conference, GECCO 2020, pp. 1259–1267. ACM (2020)

  7. Bambury, H., Bultel, A., Doerr, B.: Generalized jump functions. In: Genetic and Evolutionary Computation Conference, GECCO 2021, pp. 1124–1132. ACM (2021)

  8. Benbaki, R., Benomar, Z., Doerr, B.: A rigorous runtime analysis of the 2-MMAS\(_{\rm ib}\) on jump functions: ant colony optimizers can cope well with local optima. In: Genetic and Evolutionary Computation Conference, GECCO 2021, pp. 4–13. ACM (2021)

  9. Böttcher, S., Doerr, B., Neumann, F.: Optimal fixed and adaptive mutation rates for the LeadingOnes problem. In: Parallel Problem Solving from Nature, PPSN 2010, pp. 1–10. Springer (2010)

  10. Corus, D., Oliveto, P.S.: Standard steady state genetic algorithms can hillclimb faster than mutation-only evolutionary algorithms. IEEE Trans. Evol. Comput. 22, 720–732 (2018)

  11. Corus, D., Oliveto, P.S.: On the benefits of populations for the exploitation speed of standard steady-state genetic algorithms. Algorithmica 82, 3676–3706 (2020)

  12. Corus, D., Oliveto, P.S., Yazdani, D.: On the runtime analysis of the Opt-IA artificial immune system. In: Genetic and Evolutionary Computation Conference, GECCO 2017, pp. 83–90. ACM (2017)

  13. Corus, D., Oliveto, P.S., Yazdani, D.: Fast artificial immune systems. In: Parallel Problem Solving from Nature, PPSN 2018, Part II, pp. 67–78. Springer (2018)

  14. Doerr, B., Doerr, C., Ebel, F.: From black-box complexity to designing new genetic algorithms. Theor. Comput. Sci. 567, 87–104 (2015)

  15. Doerr, B., Doerr, C., Kötzing, T.: Static and self-adjusting mutation strengths for multi-valued decision variables. Algorithmica 80, 1732–1768 (2018)

  16. Doerr, B., Doerr, C., Kötzing, T.: Solving problems with unknown solution length at almost no extra cost. Algorithmica 81, 703–748 (2019)

  17. Dang, D.C., Friedrich, T., Kötzing, T., Krejca, M.S., Lehre, P.K., Oliveto, P.S., Sudholt, D., Sutton, A.M.: Escaping local optima with diversity mechanisms and crossover. In: Genetic and Evolutionary Computation Conference, GECCO 2016, pp. 645–652. ACM (2016)

  18. Dang, D.C., Friedrich, T., Kötzing, T., Krejca, M.S., Lehre, P.K., Oliveto, P.S., Sudholt, D., Sutton, A.M.: Escaping local optima using crossover with emergent diversity. IEEE Trans. Evol. Comput. 22, 484–497 (2018)

  19. Doerr, B., Happ, E., Klein, C.: Crossover can provably be useful in evolutionary computation. Theor. Comput. Sci. 425, 17–33 (2012)

  20. Doerr, B., Johannsen, D., Kötzing, T., Lehre, P.K., Wagner, M., Winzen, C.: Faster black-box algorithms through higher arity operators. In: Foundations of Genetic Algorithms, FOGA 2011, pp. 163–172. ACM (2011)

  21. Doerr, B., Johannsen, D., Kötzing, T., Neumann, F., Theile, M.: More effective crossover operators for the all-pairs shortest path problem. Theor. Comput. Sci. 471, 12–26 (2013)

  22. Droste, S., Jansen, T., Wegener, I.: On the analysis of the (1+1) evolutionary algorithm. Theor. Comput. Sci. 276, 51–81 (2002)

  23. Doerr, B., Kötzing, T.: Lower bounds from fitness levels made easy. In: Genetic and Evolutionary Computation Conference, GECCO 2021, pp. 1142–1150. ACM (2021)

  24. Doerr, B., Krejca, M.S.: The univariate marginal distribution algorithm copes well with deception and epistasis. Evol. Comput. 29, 543–563 (2021)

  25. Doerr, B., Le, H.P., Makhmara, R., Nguyen, T.D.: Fast genetic algorithms. In: Genetic and Evolutionary Computation Conference, GECCO 2017, pp. 777–784. ACM (2017)

  26. Doerr, B., Neumann, F. (eds.): Theory of Evolutionary Computation—Recent Developments in Discrete Optimization. Springer (2020). https://cs.adelaide.edu.au/frank/papers/TheoryBook2019-selfarchived.pdf

  27. Doerr, B.: Analyzing randomized search heuristics via stochastic domination. Theor. Comput. Sci. 773, 115–137 (2019)

  28. Doerr, B.: Does comma selection help to cope with local optima? In: Genetic and Evolutionary Computation Conference, GECCO 2020, pp. 1304–1313. ACM (2020)

  29. Doerr, B.: The runtime of the compact genetic algorithm on Jump functions. Algorithmica 83, 3059–3107 (2021)

  30. Doerr, B., Rajabi, A.: Stagnation detection meets fast mutation. In: Evolutionary Computation in Combinatorial Optimization, EvoCOP 2022, pp. 191–207. Springer (2022)

  31. Doerr, B., Zheng, W.: Theoretical analyses of multi-objective evolutionary algorithms on multi-modal objectives. In: Conference on Artificial Intelligence, AAAI 2021, pp. 12293–12301. AAAI Press (2021)

  32. Friedrich, T., Göbel, A., Quinzan, F., Wagner, M.: Heavy-tailed mutation operators in single-objective combinatorial optimization. In: Parallel Problem Solving from Nature, PPSN 2018, Part I, pp. 134–145. Springer (2018)

  33. Friedrich, T., Kötzing, T., Krejca, M.S., Nallaperuma, S., Neumann, F., Schirneck, M.: Fast building block assembly by majority vote crossover. In: Genetic and Evolutionary Computation Conference, GECCO 2016, pp. 661–668. ACM (2016)

  34. Friedrich, T., Oliveto, P.S., Sudholt, D., Witt, C.: Analysis of diversity-preserving mechanisms for global exploration. Evol. Comput. 17, 455–476 (2009)

  35. Friedrich, T., Quinzan, F., Wagner, M.: Escaping large deceptive basins of attraction with heavy-tailed mutation operators. In: Genetic and Evolutionary Computation Conference, GECCO 2018, pp. 293–300. ACM (2018)

  36. Fischer, S., Wegener, I.: The Ising model on the ring: mutation versus recombination. In: Genetic and Evolutionary Computation, GECCO 2004, pp. 1113–1124. Springer (2004)

  37. Garnier, J., Kallel, L., Schoenauer, M.: Rigorous hitting times for binary mutations. Evol. Comput. 7, 173–203 (1999)

  38. Gießen, C., Witt, C.: The interplay of population size and mutation probability in the \({(1 + \lambda )}\) EA on OneMax. Algorithmica 78, 587–609 (2017)

  39. Hasenöhrl, V., Sutton, A.M.: On the runtime dynamics of the compact genetic algorithm on jump functions. In: Genetic and Evolutionary Computation Conference, GECCO 2018, pp. 967–974. ACM (2018)

  40. Jansen, T.: Analyzing Evolutionary Algorithms—The Computer Science Perspective. Springer (2013)

  41. Jansen, T.: On the black-box complexity of example functions: the real jump function. In: Foundations of Genetic Algorithms, FOGA 2015, pp. 16–24. ACM (2015)

  42. Jägersküpper, J., Storch, T.: When the plus strategy outperforms the comma strategy and when not. In: Foundations of Computational Intelligence, FOCI 2007, pp. 25–32. IEEE (2007)

  43. Jansen, T., Wegener, I.: The analysis of evolutionary algorithms—a proof that crossover really can help. Algorithmica 34, 47–66 (2002)

  44. Lehre, P.K.: Negative drift in populations. In: Parallel Problem Solving from Nature, PPSN 2010, pp. 244–253. Springer (2010)

  45. Lehre, P.K., Nguyen, P.T.H.: On the limitations of the univariate marginal distribution algorithm to deception and where bivariate EDAs might help. In: Foundations of Genetic Algorithms, FOGA 2019, pp. 154–168. ACM (2019)

  46. Lehre, P.K., Oliveto, P.S.: Theoretical analysis of stochastic search algorithms. In: Martí, R., Pardalos, P.M., Resende, M.G.C. (eds), Handbook of Heuristics, pp. 849–884. Springer (2018)

  47. Lissovoi, A., Oliveto, P.S., Warwicker, J.A.: On the time complexity of algorithm selection hyper-heuristics for multimodal optimisation. In: Conference on Artificial Intelligence, AAAI 2019, pp. 2322–2329. AAAI Press (2019)

  48. Lehre, P.K., Yao, X.: Crossover can be constructive when computing unique input-output sequences. Soft Comput. 15, 1675–1687 (2011)

  49. Mironovich, V., Buzdalov, M.: Evaluation of heavy-tailed mutation operator on maximum flow test generation problem. In: Genetic and Evolutionary Computation Conference, GECCO 2017, Companion Material, pp. 1423–1426. ACM (2017)

  50. Nguyen, P.T.H., Sudholt, D.: Memetic algorithms outperform evolutionary algorithms in multimodal optimisation. Artif. Intell. 287, 103345 (2020)

  51. Neumann, F., Witt, C.: Bioinspired Computation in Combinatorial Optimization—Algorithms and Their Computational Complexity. Springer (2010)

  52. Oliveto, P.S., Paixão, T., Pérez Heredia, J., Sudholt, D., Trubenová, B.: How to escape local optima in black box optimisation: when non-elitism outperforms elitism. Algorithmica 80, 1604–1633 (2018)

  53. Osuna, E.C., Sudholt, D.: Runtime analysis of crowding mechanisms for multimodal optimization. IEEE Trans. Evol. Comput. 24, 581–592 (2020)

  54. Paixão, T., Pérez Heredia, J., Sudholt, D., Trubenová, B.: Towards a runtime comparison of natural and artificial evolution. Algorithmica 78, 681–713 (2017)

  55. Prügel-Bennett, A.: When a genetic algorithm outperforms hill-climbing. Theor. Comput. Sci. 320, 135–153 (2004)

  56. Quinzan, F., Göbel, A., Wagner, M., Friedrich, T.: Evolutionary algorithms and submodular functions: benefits of heavy-tailed mutations. Nat. Comput. 20, 561–575 (2021)

  57. Rowe, J.E.: The benefits and limitations of voting mechanisms in evolutionary optimisation. In: Foundations of Genetic Algorithms, FOGA 2019, pp. 34–42. ACM (2019)

  58. Rajabi, A., Witt, C.: Self-adjusting evolutionary algorithms for multimodal optimization. In: Genetic and Evolutionary Computation Conference, GECCO 2020, pp. 1314–1322. ACM (2020)

  59. Rajabi, A., Witt, C.: Stagnation detection in highly multimodal fitness landscapes. In: Genetic and Evolutionary Computation Conference, GECCO 2021, pp. 1178–1186. ACM (2021)

  60. Rajabi, A., Witt, C.: Stagnation detection with randomized local search. In: Evolutionary Computation in Combinatorial Optimization, EvoCOP 2021, pp. 152–168. Springer (2021)

  61. Sudholt, D.: Crossover is provably essential for the Ising model on trees. In: Genetic and Evolutionary Computation Conference, GECCO 2005, pp. 1161–1167. ACM (2005)

  62. Sudholt, D.: A new method for lower bounds on the running time of evolutionary algorithms. IEEE Trans. Evol. Comput. 17, 418–435 (2013)

  63. Sudholt, D.: How crossover speeds up building block assembly in genetic algorithms. Evol. Comput. 25, 237–274 (2017)

  64. Wegener, I.: Theoretical aspects of evolutionary algorithms. In: Automata, Languages and Programming, ICALP 2001, pp. 64–78. Springer (2001)

  65. Witt, C.: Tight bounds on the optimization time of a randomized search heuristic on linear functions. Combin. Probab. Comput. 22, 294–318 (2013)

  66. Wu, M., Qian, C., Tang, K.: Dynamic mutation based Pareto optimization for subset selection. In: Intelligent Computing Methodologies, ICIC 2018, Part III, pp. 25–35. Springer (2018)

  67. Whitley, D., Varadarajan, S., Hirsch, R., Mukhopadhyay, A.: Exploration and exploitation without mutation: solving the jump function in \({\Theta (n)}\) time. In: Parallel Problem Solving from Nature, PPSN 2018, Part II, pp. 55–66. Springer (2018)

  68. Wang, S., Zheng, W., Doerr, B.: Choosing the right algorithm with hints from complexity theory. In: International Joint Conference on Artificial Intelligence, IJCAI 2021, pp. 1697–1703 (2021)

Acknowledgements

This work was supported by a public grant as part of the Investissements d’avenir project, reference ANR-11-LABX-0056-LMH, LabEx LMH.

Author information


Corresponding author

Correspondence to Benjamin Doerr.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Full version of the paper [7] that appeared in the proceedings of GECCO 2021.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Bambury, H., Bultel, A. & Doerr, B. An Extended Jump Functions Benchmark for the Analysis of Randomized Search Heuristics. Algorithmica 86, 1–32 (2024). https://doi.org/10.1007/s00453-022-00977-1
