
The “One-Fifth Rule” with Rollbacks for Self-Adjustment of the Population Size in the (1 + (λ, λ)) Genetic Algorithm

Published in Automatic Control and Computer Sciences

Abstract

Self-adjustment of parameters can significantly improve the performance of evolutionary algorithms. A notable example is the (1 + (λ, λ)) genetic algorithm, where self-adjustment of the population size helps to achieve a linear running time on the OneMax problem. However, on problems that violate the assumptions behind the self-adjustment procedure, its use can degrade performance. In particular, this is the case for the “one-fifth rule” on problems with weak fitness-distance correlation. We propose a modification of the “one-fifth rule” that reduces its negative impact on performance in the cases where the original rule is harmful. Our modification, while still yielding a provable linear runtime on OneMax, shows better results on linear functions with random weights, as well as on random satisfiable MAX-3SAT instances.
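To make the mechanism concrete, the sketch below shows the standard (1 + (λ, λ)) genetic algorithm on OneMax with one-fifth-rule self-adjustment of λ: after a strict fitness improvement λ is divided by an update factor F, otherwise it is multiplied by F^(1/4), so λ stays roughly constant when about one in five iterations succeeds. This is a minimal illustrative sketch, not code from the article; the function names, the update factor of 1.5, and the evaluation budget are assumptions, and the rollback modification proposed in the article is not reproduced here.

```python
import random

def onemax(x):
    """Fitness: the number of one-bits in the bit string."""
    return sum(x)

def one_plus_lambda_lambda_ga(n, budget=100_000, update_factor=1.5):
    """Minimal sketch of the (1 + (lambda, lambda)) GA on OneMax with the
    standard one-fifth-rule self-adjustment of lambda. Illustrative only;
    the rollback modification from the article is not implemented here."""
    x = [random.randint(0, 1) for _ in range(n)]
    fx = onemax(x)
    lam = 1.0
    evals = 0
    while fx < n and evals < budget:
        lam_int = max(1, round(lam))
        p = lam_int / n        # mutation probability lambda / n
        c = 1.0 / lam_int      # crossover bias towards the mutant

        # Mutation phase: sample ell ~ Bin(n, p) once, flip ell random bits in each offspring.
        ell = sum(random.random() < p for _ in range(n))
        best_mut, best_mut_f = None, -1
        for _ in range(lam_int):
            y = x[:]
            for i in random.sample(range(n), ell):
                y[i] ^= 1
            fy = onemax(y)
            evals += 1
            if fy > best_mut_f:
                best_mut, best_mut_f = y, fy

        # Crossover phase: biased uniform crossover between the parent and the best mutant.
        best_cross, best_cross_f = x, fx
        for _ in range(lam_int):
            y = [best_mut[i] if random.random() < c else x[i] for i in range(n)]
            fy = onemax(y)
            evals += 1
            if fy > best_cross_f:
                best_cross, best_cross_f = y, fy

        # Elitist selection and the one-fifth-rule update of lambda.
        if best_cross_f > fx:
            x, fx = best_cross, best_cross_f
            lam = max(1.0, lam / update_factor)                   # success: shrink lambda
        else:
            if best_cross_f == fx:
                x = best_cross                                    # accept equal fitness
            lam = min(float(n), lam * update_factor ** 0.25)      # failure: grow lambda slowly
    return evals
```

Under this rule λ stays small while improvements are frequent and grows when the search stagnates; the article's contribution is to limit how far this growth can hurt performance on problems with weak fitness-distance correlation.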



Funding

This research was supported by the Russian Science Foundation, agreement no. 17-71-20178.

Author information


Correspondence to A. O. Bassin or M. V. Buzdalov.

Ethics declarations

The authors declare that they have no conflicts of interest.

About this article


Cite this article

Bassin, A.O., Buzdalov, M.V. & Shalyto, A.A. The “One-Fifth Rule” with Rollbacks for Self-Adjustment of the Population Size in the (1 + (λ, λ)) Genetic Algorithm. Aut. Control Comp. Sci. 55, 885–902 (2021). https://doi.org/10.3103/S0146411621070208
