Weighted Preferences in Evolutionary Multi-objective Optimization

  • Tobias Friedrich
  • Trent Kroeger
  • Frank Neumann
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7106)


Evolutionary algorithms have been widely used to tackle multi-objective optimization problems. Incorporating preference information into the search of evolutionary algorithms for multi-objective optimization is of great importance, as it allows the search to focus on interesting regions of the objective space. Zitzler et al. have shown how to use a weight distribution function on the objective space to incorporate preference information into hypervolume-based algorithms. We show that this weight information can easily be used in other popular EMO algorithms as well. Our results for NSGA-II and SPEA2 show that this yields results similar to the hypervolume approach while requiring less computational effort.
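The weighted hypervolume idea of Zitzler et al. can be illustrated with a small numerical sketch (this is not the authors' implementation; the function names and the grid-based integration are illustrative assumptions). For a minimization problem with two objectives, the indicator integrates a user-chosen weight function w(z) over the region of objective space that is dominated by the point set and bounded by a reference point:

```python
def weighted_hypervolume(points, reference, weight, steps=400):
    """Approximate the weighted hypervolume of `points` (minimization,
    two objectives) by midpoint Riemann summation on a regular grid
    bounded by `reference`. Illustrative sketch, not the paper's code."""
    x_max, y_max = reference
    x_min = min(p[0] for p in points)
    y_min = min(p[1] for p in points)
    dx = (x_max - x_min) / steps
    dy = (y_max - y_min) / steps
    total = 0.0
    for i in range(steps):
        for j in range(steps):
            # midpoint of the current grid cell
            z = (x_min + (i + 0.5) * dx, y_min + (j + 0.5) * dy)
            # the cell contributes iff it is dominated by some point
            if any(p[0] <= z[0] and p[1] <= z[1] for p in points):
                total += weight(z) * dx * dy
    return total

# With the constant weight w(z) = 1 this reduces to the ordinary
# hypervolume indicator; a weight concentrated near one region of the
# objective space biases the indicator, and hence any indicator-based
# search, toward that region.
front = [(0.2, 0.8), (0.5, 0.5), (0.8, 0.2)]
plain = weighted_hypervolume(front, (1.0, 1.0), lambda z: 1.0)
```

For this front the exact (unweighted) hypervolume with reference point (1, 1) is 0.37, which the grid approximation recovers closely; swapping in a non-constant weight shifts the indicator's mass toward the preferred region.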


Keywords: Pareto front · Multiobjective optimization · Objective space · Objective vector · Hypervolume indicator


References


  1. Allmendinger, R., Li, X., Branke, J.: Reference Point-Based Particle Swarm Optimization Using a Steady-State Approach. In: Li, X., Kirley, M., Zhang, M., Green, D., Ciesielski, V., Abbass, H.A., Michalewicz, Z., Hendtlass, T., Deb, K., Tan, K.C., Branke, J., Shi, Y. (eds.) SEAL 2008. LNCS, vol. 5361, pp. 200–209. Springer, Heidelberg (2008)
  2. Auger, A., Bader, J., Brockhoff, D., Zitzler, E.: Articulating user preferences in many-objective problems by sampling the weighted hypervolume. In: Proc. 11th Annual Conference on Genetic and Evolutionary Computation, pp. 555–562 (2009)
  3. Beume, N., Naujoks, B., Emmerich, M.T.M.: SMS-EMOA: Multiobjective selection based on dominated hypervolume. European Journal of Operational Research 181, 1653–1669 (2007)
  4. Bringmann, K., Friedrich, T.: Approximating the volume of unions and intersections of high-dimensional geometric objects. Computational Geometry: Theory and Applications 43, 601–610 (2010)
  5. Coello Coello, C.A., Van Veldhuizen, D.A., Lamont, G.B.: Evolutionary Algorithms for Solving Multi-Objective Problems. Kluwer Academic Publishers, New York (2002)
  6. Deb, K.: Multi-Objective Optimization Using Evolutionary Algorithms. Wiley, Chichester (2001)
  7. Deb, K., Sundar, J.: Reference point based multi-objective optimization using evolutionary algorithms. In: Proc. 8th Annual Conference on Genetic and Evolutionary Computation (GECCO 2006), pp. 635–642 (2006)
  8. Deb, K., Agrawal, S., Pratap, A., Meyarivan, T.: A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans. Evolutionary Computation 6(2), 182–197 (2002)
  9. Ho, S.-L., Yang, S., Ni, G.: Incorporating a priori preferences in a vector PSO algorithm to find arbitrary fractions of the Pareto front of multiobjective design problems. IEEE Trans. Magnetics 44, 1038–1041 (2008)
  10. Hu, Q., Xu, L., Goodman, E.D.: Non-even spread NSGA-II and its application to conflicting multi-objective compatible control. In: Proc. Genetic and Evolutionary Computation Conference Summit (GEC 2009), pp. 223–230 (2009)
  11. Igel, C., Hansen, N., Roth, S.: Covariance matrix adaptation for multi-objective optimization. Evolutionary Computation 15, 1–28 (2007)
  12. Suttorp, T., Hansen, N., Igel, C.: Efficient covariance matrix update for variable metric evolution strategies. Machine Learning 75, 167–197 (2009)
  13. Thiele, L., Miettinen, K., Korhonen, P.J., Luque, J.M.: A preference-based evolutionary algorithm for multi-objective optimization. Evolutionary Computation 17(3), 411–436 (2009)
  14. Wickramasinghe, U.K., Li, X.: Integrating user preferences with particle swarms for multi-objective optimization. In: Proc. 10th Annual Conference on Genetic and Evolutionary Computation, pp. 745–752 (2008)
  15. Wickramasinghe, U.K., Li, X.: Using a distance metric to guide PSO algorithms for many-objective optimization. In: Proc. 11th Annual Conference on Genetic and Evolutionary Computation, pp. 667–674 (2009)
  16. Zitzler, E., Deb, K., Thiele, L.: Comparison of multiobjective evolutionary algorithms: Empirical results. Evolutionary Computation 8(2), 173–195 (2000)
  17. Zitzler, E., Laumanns, M., Thiele, L.: SPEA2: Improving the strength Pareto evolutionary algorithm for multiobjective optimization. In: Proc. Evolutionary Methods for Design, Optimisation and Control with Application to Industrial Problems (EUROGEN 2001), pp. 95–100 (2002)
  18. Zitzler, E., Brockhoff, D., Thiele, L.: The Hypervolume Indicator Revisited: On the Design of Pareto-compliant Indicators via Weighted Integration. In: Obayashi, S., Deb, K., Poloni, C., Hiroyasu, T., Murata, T. (eds.) EMO 2007. LNCS, vol. 4403, pp. 862–876. Springer, Heidelberg (2007)

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Tobias Friedrich (1)
  • Trent Kroeger (2)
  • Frank Neumann (2)

  1. Max-Planck-Institut für Informatik, Saarbrücken, Germany
  2. School of Computer Science, University of Adelaide, Adelaide, Australia
