Automatic Adaption of Operator Probabilities in Genetic Algorithms with Offspring Selection
When offspring selection is applied in genetic algorithms, multiple crossover and mutation operators can easily be used together, because crossover and mutation results of insufficient quality are discarded in the additional selection step after new solutions are created. The a priori choice of appropriate crossover and mutation operators therefore becomes less critical; it has even been shown that using multiple operators reduces bias, broadens the search, and thus leads to higher final solution quality. However, crossover and mutation operators that often produce solutions failing the offspring selection criterion also increase the selection pressure and consequently the number of evaluated solutions.
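The offspring selection criterion mentioned above can be illustrated with a minimal sketch. It assumes a minimization problem and the commonly used formulation in which a comparison factor interpolates between the fitness of the worse and the better parent; the function name and parameter names are illustrative, not taken from the paper:

```python
def offspring_passes(child_fitness, parent1_fitness, parent2_fitness,
                     comparison_factor=1.0):
    """Illustrative offspring-selection check (minimization assumed).

    With comparison_factor = 1.0 the child must outperform the better
    parent; with 0.0 it only has to outperform the worse one. The
    threshold interpolates linearly between the two parent fitnesses.
    """
    worse = max(parent1_fitness, parent2_fitness)
    better = min(parent1_fitness, parent2_fitness)
    threshold = worse + comparison_factor * (better - worse)
    return child_fitness < threshold
```

Offspring failing this check are discarded, so operators that frequently produce failing children drive up the number of fitness evaluations needed per generation.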
Therefore, we present a new generic scheme for automatically tuning the selection probabilities of multiple crossover and mutation operators in genetic algorithms with offspring selection at runtime. Operators that produced good results in the previous generation are applied more frequently, which yields comparable solution quality while significantly decreasing the number of evaluated solutions.
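The adaptation scheme described above can be sketched as follows: each operator's selection probability is recomputed from the number of its offspring that passed the offspring selection criterion in the last generation. The probability floor, the proportional update rule, and all names are assumptions for illustration, not the paper's exact method:

```python
import random

def update_operator_probabilities(success_counts, min_prob=0.05):
    """Recompute operator probabilities from last generation's successes.

    success_counts maps operator name -> number of offspring that passed
    the offspring selection criterion. A small floor keeps every operator
    selectable so it can recover later in the run.
    """
    n = len(success_counts)
    total = sum(success_counts.values())
    if total == 0:
        # No operator succeeded: fall back to uniform probabilities.
        return {op: 1.0 / n for op in success_counts}
    probs = {op: max(min_prob, c / total) for op, c in success_counts.items()}
    norm = sum(probs.values())  # renormalize after applying the floor
    return {op: p / norm for op, p in probs.items()}

def pick_operator(probs):
    """Roulette-wheel choice of an operator by its current probability."""
    ops = list(probs)
    return random.choices(ops, weights=[probs[op] for op in ops], k=1)[0]
```

In a GA loop, `pick_operator` would be called once per offspring to choose the crossover (or mutation) operator, and `update_operator_probabilities` once per generation after offspring selection.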
Keywords: Offspring Selection · Genetic Algorithms (GAs) · Adaptive Operator Probabilities · Solution Quality · Multiple Crossover
The work described in this paper was done within the COMET Project Heuristic Optimization in Production and Logistics (HOPL), #843532 funded by the Austrian Research Promotion Agency (FFG).