
Journal of Global Optimization, Volume 60, Issue 3, pp 575–594

Fast calculation of multiobjective probability of improvement and expected improvement criteria for Pareto optimization


Abstract

The use of surrogate-based optimization (SBO) is widespread in engineering design to reduce the number of computationally expensive simulations. However, "real-world" problems often consist of multiple, conflicting objectives, leading to a set of competitive solutions (the Pareto front). The objectives are often aggregated into a single cost function to reduce the computational cost, though a better approach is to use multiobjective optimization methods to directly identify a set of Pareto-optimal solutions, which the designer can then use to make more informed design decisions (instead of weighting and aggregating the costs upfront). Most work on multiobjective optimization focuses on multiobjective evolutionary algorithms (MOEAs). While MOEAs are well suited to handle large, intractable design spaces, they typically require thousands of expensive simulations, which is prohibitively expensive for the problems under study. Therefore, the use of surrogate models in multiobjective optimization, denoted multiobjective surrogate-based optimization, may prove to be even more worthwhile than SBO methods for expediting the optimization of computationally expensive systems. In this paper, the authors propose the efficient multiobjective optimization (EMO) algorithm, which uses Kriging models and multiobjective versions of the probability of improvement and expected improvement criteria to identify the Pareto front with a minimal number of expensive simulations. The EMO algorithm is applied to multiple standard benchmark problems and compared against the well-known NSGA-II, SPEA2 and SMS-EMOA multiobjective optimization methods.
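The abstract outlines the general recipe: fit a Kriging (Gaussian process) model per objective, then select each new simulation point by a multiobjective statistical criterion evaluated against the current Pareto front. The sketch below illustrates that loop only; it is not the authors' EMO algorithm and, in particular, it replaces the paper's fast exact calculation of the multiobjective probability of improvement with a crude Monte Carlo estimate. The helper names (`zdt1`, `pareto_mask`, `mc_poi`), the use of scikit-learn's `GaussianProcessRegressor` as the Kriging model, the random initial design, and the random candidate screening are all illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a Kriging-assisted multiobjective loop (illustrative only).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern


def zdt1(x):
    """Two-objective ZDT1 benchmark (minimization), x in [0, 1]^d."""
    f1 = x[0]
    g = 1.0 + 9.0 * np.mean(x[1:])
    return np.array([f1, g * (1.0 - np.sqrt(f1 / g))])


def pareto_mask(F):
    """Boolean mask selecting the non-dominated rows of F (minimization)."""
    mask = np.ones(len(F), dtype=bool)
    for i in range(len(F)):
        if mask[i]:
            dominated = np.all(F[i] <= F, axis=1) & np.any(F[i] < F, axis=1)
            mask[dominated] = False
    return mask


def mc_poi(mu, sigma, front, n_samples=2000, rng=None):
    """Monte Carlo estimate of the multiobjective probability of improvement:
    the chance that a candidate with independent Gaussian predictive marginals
    (mu, sigma) per objective is not dominated by the current Pareto front."""
    rng = np.random.default_rng(rng)
    samples = rng.normal(mu, sigma, size=(n_samples, len(mu)))
    dominated = np.zeros(n_samples, dtype=bool)
    for p in front:
        dominated |= np.all(p <= samples, axis=1) & np.any(p < samples, axis=1)
    return np.mean(~dominated)


dim, n_init, budget, n_cand = 4, 11, 40, 500
rng = np.random.default_rng(0)
X = rng.random((n_init, dim))          # random initial design (the paper uses a space-filling design)
Y = np.array([zdt1(x) for x in X])

for _ in range(budget - n_init):
    # one Kriging/GP model per objective
    models = [GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, Y[:, j])
              for j in range(Y.shape[1])]
    front = Y[pareto_mask(Y)]

    # score random candidates; a real implementation would optimize the criterion
    cand = rng.random((n_cand, dim))
    preds = [m.predict(cand, return_std=True) for m in models]
    mu = np.column_stack([p[0] for p in preds])
    sd = np.column_stack([p[1] for p in preds])
    scores = [mc_poi(mu[i], sd[i], front, rng=rng) for i in range(n_cand)]

    x_next = cand[int(np.argmax(scores))]
    X = np.vstack([X, x_next])
    Y = np.vstack([Y, zdt1(x_next)])

print("Non-dominated points found:", int(pareto_mask(Y).sum()))
```

The Monte Carlo estimator is the slow, approximate route that the paper's exact hypervolume-based decomposition is designed to avoid; it is used here only to keep the sketch short and self-contained.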

Keywords

Multiobjective optimization · Expected improvement · Probability of improvement · Hypervolume · Kriging · Gaussian process


Acknowledgments

This work was supported by the Fund for Scientific Research in Flanders (FWO-Vlaanderen). Ivo Couckuyt and Dirk Deschrijver are post-doctoral research fellows of FWO-Vlaanderen. This research has been partially funded by the Interuniversity Attraction Poles Program BESTCOM, initiated by the Belgian Science Policy Office.


Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

1. Department of Information Technology (INTEC), Ghent University-iMinds, Ghent, Belgium
