Parameter Tuning Boosts Performance of Variation Operators in Multiobjective Optimization

  • Simon Wessing
  • Nicola Beume
  • Günter Rudolph
  • Boris Naujoks
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6238)

Abstract

Typically, the variation operators deployed in evolutionary multiobjective optimization algorithms (EMOA) are either simulated binary crossover with polynomial mutation or differential evolution operators. This empirical study aims at developing a sound method for assessing which of these variation operators performs best in the multiobjective context. For the S-metric selection EMOA, our main findings are: (1) the tuned operators perform significantly better than their default parameterizations, (2) the two tuned variation operators perform very similarly, and (3) the optimized parameter configurations differ considerably across the considered problems.
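
The operators named in the abstract are standard in the literature, so a minimal, self-contained Python sketch of both families may help fix ideas. It is not the authors' implementation; the defaults shown for the distribution indices eta_c and eta_m (SBX and polynomial mutation) and for the DE parameters F and CR are common textbook choices, i.e. exactly the kind of settings the study tunes, not the tuned values themselves.

# Minimal sketch (not the paper's implementation) of the two operator
# families compared in the study.  Parameter defaults (eta_c, eta_m,
# p_m, F, CR) are common textbook choices -- the kind of settings the
# paper tunes, not the tuned values themselves.
import numpy as np

rng = np.random.default_rng()

def sbx_crossover(p1, p2, eta_c=15.0):
    """Simulated binary crossover (Deb & Agrawal); returns two children."""
    u = rng.random(p1.shape)
    beta = np.where(u <= 0.5,
                    (2.0 * u) ** (1.0 / (eta_c + 1.0)),
                    (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta_c + 1.0)))
    c1 = 0.5 * ((1.0 + beta) * p1 + (1.0 - beta) * p2)
    c2 = 0.5 * ((1.0 - beta) * p1 + (1.0 + beta) * p2)
    return c1, c2

def polynomial_mutation(x, lower, upper, eta_m=20.0, p_m=None):
    """Polynomial mutation; each variable mutates with probability p_m."""
    n = x.size
    p_m = 1.0 / n if p_m is None else p_m
    u = rng.random(n)
    delta = np.where(u < 0.5,
                     (2.0 * u) ** (1.0 / (eta_m + 1.0)) - 1.0,
                     1.0 - (2.0 * (1.0 - u)) ** (1.0 / (eta_m + 1.0)))
    mutate = rng.random(n) < p_m
    return np.clip(x + mutate * delta * (upper - lower), lower, upper)

def de_rand_1_bin(pop, i, F=0.5, CR=0.9):
    """DE/rand/1/bin (Storn & Price): build a trial vector for pop[i]."""
    a, b, c = pop[rng.choice([j for j in range(len(pop)) if j != i],
                             size=3, replace=False)]
    mutant = a + F * (b - c)
    cross = rng.random(pop[i].size) < CR
    cross[rng.integers(pop[i].size)] = True  # at least one gene from mutant
    return np.where(cross, mutant, pop[i])

In the steady-state SMS-EMOA, either sbx_crossover followed by polynomial_mutation, or de_rand_1_bin, generates the single offspring per generation that the hypervolume-based (S-metric) selection then accepts or discards.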

Keywords

parameter tuning · performance assessment · benchmarking · multiobjective variation operators · sequential parameter optimization

Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Simon Wessing¹
  • Nicola Beume¹
  • Günter Rudolph¹
  • Boris Naujoks¹

  1. Fakultät für Informatik, Technische Universität Dortmund, Germany
