Journal of Heuristics, Volume 22, Issue 3, pp 273–300

The hypervolume based directed search method for multi-objective optimization problems

  • Oliver Schütze
  • Víctor Adrián Sosa Hernández
  • Heike Trautmann
  • Günter Rudolph
Abstract

We present a new hybrid evolutionary algorithm for the effective hypervolume approximation of the Pareto front of a given differentiable multi-objective optimization problem. The starting point for the local search (LS) mechanism is a new division of the decision space: we argue that in each of the resulting regions a different LS strategy is most promising. For the LS in two of the three regions we utilize and adapt the Directed Search method, which is capable of steering the search in any direction given in objective space and is thus well suited to the problem at hand. We further integrate the resulting LS mechanism into SMS-EMOA, a state-of-the-art evolutionary algorithm for hypervolume approximations. Finally, we present numerical results on several benchmark problems with two and three objectives that indicate the strength and competitiveness of the novel hybrid.
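To make the target indicator concrete: for a bi-objective minimization problem, the hypervolume of a point set is the area it dominates, bounded by a reference point, and can be computed by a single sweep over the front sorted by the first objective. The sketch below is ours, not the paper's algorithm; the function name `hypervolume_2d` and the example data are illustrative assumptions.

```python
def hypervolume_2d(points, ref):
    """Area dominated by `points` and bounded above by the reference
    point `ref`, for a bi-objective minimization problem."""
    # Keep only points that strictly dominate the reference point;
    # others contribute no area.
    pts = [p for p in points if p[0] < ref[0] and p[1] < ref[1]]
    # Sort by the first objective; along a nondominated front the
    # second objective then decreases monotonically.
    pts.sort()
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        if f2 < prev_f2:  # skip points dominated by an earlier one
            hv += (ref[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return hv

# Three mutually nondominated points with reference point (4, 4):
print(hypervolume_2d([(1, 3), (2, 2), (3, 1)], (4, 4)))  # 6.0
```

Each iteration adds the horizontal slab between the previous and current second-objective values, so dominated points contribute nothing; this is the standard sweep for the two-objective case, while higher dimensions require more involved schemes (cf. Klee's-measure-problem formulations).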

Keywords

Multi-objective optimization · Evolutionary computation · Memetic algorithm · Directed search method · Hypervolume

Acknowledgments

Víctor Adrián Sosa Hernández acknowledges support from the Consejo Nacional de Ciencia y Tecnología (CONACYT). Heike Trautmann acknowledges support from the European Research Center for Information Systems (ERCIS). All authors acknowledge support from CONACYT Project no. 174443 and DFG Project no. TR 891/5-1.


Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  • Oliver Schütze (1)
  • Víctor Adrián Sosa Hernández (1)
  • Heike Trautmann (2)
  • Günter Rudolph (3)
  1. Computer Science Department, CINVESTAV-IPN, Col. San Pedro Zacatenco, Mexico
  2. Department of Information Systems, University of Münster, Münster, Germany
  3. Fakultät für Informatik, Technische Universität Dortmund, Dortmund, Germany
