
Weighted preferences in evolutionary multi-objective optimization

  • Tobias Friedrich
  • Trent Kroeger
  • Frank Neumann
Original Article

Abstract

Evolutionary algorithms have been widely used to tackle multi-objective optimization problems. Incorporating preference information into the search of evolutionary algorithms for multi-objective optimization is of great importance, as it allows one to focus on interesting regions of the objective space. Zitzler et al. have shown how a weight distribution function on the objective space can be used to incorporate preference information into hypervolume-based algorithms. We show that this weighted information can easily be used in other popular evolutionary multi-objective optimization (EMO) algorithms as well. Our results for NSGA-II and SPEA2 show that this yields results similar to those of the hypervolume approach while requiring less computational effort.
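
To make the weighted-hypervolume idea mentioned above concrete, the following Python snippet is a minimal sketch (not the authors' implementation) that estimates a weighted hypervolume for a bi-objective minimization front by Monte Carlo sampling, in the spirit of Auger et al. [2]. The front, reference point, and weight function shown are made-up placeholders; the key point is that the weight function biases the indicator toward preferred regions of the objective space.

    import random

    def weighted_hypervolume(front, ref_point, weight, n_samples=100_000, seed=0):
        """Monte Carlo estimate of the weighted hypervolume (2 objectives, minimization).

        Integrates the user-supplied weight function over the region that is
        dominated by `front` and bounded above by `ref_point`.
        """
        rng = random.Random(seed)
        # Bounding box: from the component-wise minimum of the front to the reference point.
        lo = [min(p[i] for p in front) for i in range(2)]
        total = 0.0
        for _ in range(n_samples):
            z = [rng.uniform(lo[i], ref_point[i]) for i in range(2)]
            # A sample contributes its weight if some front point dominates it.
            if any(p[0] <= z[0] and p[1] <= z[1] for p in front):
                total += weight(z)
        box_volume = (ref_point[0] - lo[0]) * (ref_point[1] - lo[1])
        return box_volume * total / n_samples

    # Hypothetical example: put twice as much weight on solutions with a small first objective.
    front = [(1.0, 4.0), (2.0, 2.5), (3.5, 1.0)]
    print(weighted_hypervolume(front, ref_point=(5.0, 5.0),
                               weight=lambda z: 2.0 if z[0] < 2.0 else 1.0))

With a constant weight function the estimate reduces to the ordinary hypervolume; non-uniform weights express user preferences over regions of the objective front.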

Keywords

Evolutionary algorithms · Multi-objective optimization · User preferences

References

  1. Allmendinger R, Li X, Branke J (2008) Reference point-based particle swarm optimization using a steady-state approach. In: Proc. simulated evolution and learning (SEAL ’08). Lecture Notes in Computer Science, vol 5361, pp 200–209
  2. Auger A, Bader J, Brockhoff D, Zitzler E (2009) Articulating user preferences in many-objective problems by sampling the weighted hypervolume. In: Proc. 11th annual conference on genetic and evolutionary computation, pp 555–562
  3. Beume N, Naujoks B, Emmerich MTM (2007) SMS-EMOA: multiobjective selection based on dominated hypervolume. Eur J Oper Res 181:1653–1669
  4. Bringmann K, Friedrich T (2010) Approximating the volume of unions and intersections of high-dimensional geometric objects. Comput Geom Theory Appl 43:601–610
  5. Coello Coello CA, Van Veldhuizen DA, Lamont GB (2002) Evolutionary algorithms for solving multi-objective problems. Kluwer Academic Publishers, New York
  6. Deb K (2001) Multi-objective optimization using evolutionary algorithms. Wiley, Chichester
  7. Deb K, Agrawal S, Pratap A, Meyarivan T (2002) A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans Evol Comput 6(2):182–197
  8. Deb K, Sundar J (2006) Reference point based multi-objective optimization using evolutionary algorithms. In: Proc. 8th annual conference on genetic and evolutionary computation (GECCO ’06), pp 635–642
  9. Friedrich T, Kroeger T, Neumann F (2011) Weighted preferences in evolutionary multi-objective optimization. In: Wang D, Reynolds M (eds) Australasian conference on artificial intelligence. Lecture Notes in Computer Science, vol 7106, pp 291–300. Springer
  10. Ho S-L, Yang S, Ni G (2008) Incorporating a priori preferences in a vector PSO algorithm to find arbitrary fractions of the Pareto front of multiobjective design problems. IEEE Trans Magn 44:1038–1041
  11. Hu Q, Xu L, Goodman ED (2009) Non-even spread NSGA-II and its application to conflicting multi-objective compatible control. In: Proc. genetic and evolutionary computation conference summit (GEC ’09), pp 223–230
  12. Huband S, Barone L, While L, Hingston P (2005) A scalable multi-objective test problem toolkit. In: Evolutionary multi-criterion optimization, pp 280–295. Springer
  13. Igel C, Hansen N, Roth S (2007) Covariance matrix adaptation for multi-objective optimization. Evol Comput 15:1–28
  14. Suttorp T, Hansen N, Igel C (2009) Efficient covariance matrix update for variable metric evolution strategies. Mach Learn 75:167–197
  15. Thiele L, Miettinen K, Korhonen PJ, Luque JM (2009) A preference-based evolutionary algorithm for multi-objective optimization. Evol Comput 17(3):411–436
  16. Wickramasinghe UK, Li X (2008) Integrating user preferences with particle swarms for multi-objective optimization. In: Proc. 10th annual conference on genetic and evolutionary computation, pp 745–752
  17. Wickramasinghe UK, Li X (2009) Using a distance metric to guide PSO algorithms for many-objective optimization. In: Proc. 11th annual conference on genetic and evolutionary computation, pp 667–674
  18. Zitzler E, Brockhoff D, Thiele L (2007) The hypervolume indicator revisited: on the design of Pareto-compliant indicators via weighted integration. In: Proc. fourth international conference on evolutionary multi-criterion optimization (EMO ’07). Lecture Notes in Computer Science, vol 4403, pp 862–876. Springer
  19. Zitzler E, Deb K, Thiele L (2000) Comparison of multiobjective evolutionary algorithms: empirical results. Evol Comput 8(2):173–195
  20. Zitzler E, Laumanns M, Thiele L (2002) SPEA2: improving the strength Pareto evolutionary algorithm for multiobjective optimization. In: Proc. evolutionary methods for design, optimisation and control with application to industrial problems (EUROGEN 2001), pp 95–100

Copyright information

© Springer-Verlag 2012

Authors and Affiliations

  • Tobias Friedrich (1)
  • Trent Kroeger (2)
  • Frank Neumann (2)
  1. Max-Planck-Institut für Informatik, Saarbrücken, Germany
  2. School of Computer Science, The University of Adelaide, Adelaide, Australia
