Toward Step-Size Adaptation in Evolutionary Multiobjective Optimization

  • Simon Wessing
  • Rosa Pink
  • Kai Brandenbusch
  • Günter Rudolph
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10173)

Abstract

We give a definition of step-size optimality in multiobjective optimization and visualize the optimal step sizes for a few two-dimensional example constellations. After that, we try to engineer a step-size adaptation mechanism that also works in practice. For this mechanism, we employ the self-adaptation of the mutation strength, which is simple and well known from single-objective optimization. The resulting approach obtains better results than simulated binary crossover and polynomial mutation on the bi-objective BBOB testbed.
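To illustrate the self-adaptation mechanism the abstract refers to, the following is a minimal sketch of log-normal self-adaptation of the mutation strength as it is commonly used in single-objective evolution strategies. The function name, the single per-individual step size, and the learning-rate choice tau = 1/sqrt(n) are illustrative assumptions, not the exact operator configuration used in the paper.

```python
import numpy as np

def self_adaptive_mutation(x, sigma, tau=None, rng=None):
    """One log-normal self-adaptation step (sketch, not the paper's exact operator).

    The step size sigma is mutated first and then used to perturb x, so that
    successful step sizes are propagated implicitly through selection.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(x)
    tau = 1.0 / np.sqrt(n) if tau is None else tau  # common learning-rate heuristic
    new_sigma = sigma * np.exp(tau * rng.standard_normal())  # log-normal update
    new_x = x + new_sigma * rng.standard_normal(n)           # isotropic Gaussian mutation
    return new_x, new_sigma
```

In a multiobjective setting such as the one described here, each individual would carry its own sigma, and selection (e.g., hypervolume-based as in SMS-EMOA) decides which step sizes survive.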


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Simon Wessing¹
  • Rosa Pink¹
  • Kai Brandenbusch¹
  • Günter Rudolph¹

  1. Computer Science Department, Technische Universität Dortmund, Dortmund, Germany
