Fast calculation of multiobjective probability of improvement and expected improvement criteria for Pareto optimization

Journal of Global Optimization

Abstract

The use of surrogate-based optimization (SBO) is widespread in engineering design to reduce the number of computationally expensive simulations. However, "real-world" problems often consist of multiple, conflicting objectives leading to a set of competitive solutions (the Pareto front). The objectives are often aggregated into a single cost function to reduce the computational cost, though a better approach is to use multiobjective optimization methods to directly identify a set of Pareto-optimal solutions, which can be used by the designer to make more efficient design decisions (instead of weighting and aggregating the costs upfront). Most of the work in multiobjective optimization is focused on multiobjective evolutionary algorithms (MOEAs). While MOEAs are well suited to handling large, intractable design spaces, they typically require thousands of simulations, which is prohibitively expensive for the problems under study. Therefore, the use of surrogate models in multiobjective optimization, denoted as multiobjective surrogate-based optimization, may prove to be even more worthwhile than SBO methods to expedite the optimization of computationally expensive systems. In this paper, the authors propose the efficient multiobjective optimization (EMO) algorithm, which uses Kriging models and multiobjective versions of the probability of improvement and expected improvement criteria to identify the Pareto front with a minimal number of expensive simulations. The EMO algorithm is applied to multiple standard benchmark problems and compared against the well-known NSGA-II, SPEA2 and SMS-EMOA multiobjective optimization methods.
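
For reference, the classical single-objective expected improvement criterion of Jones et al. (1998), which the multiobjective criteria considered here generalize, can be written for a minimization problem as follows (notation assumed here for illustration):

```latex
% Classical single-objective expected improvement (Jones et al., 1998) for
% minimization; \hat{y}(x) and s(x) denote the Kriging predictive mean and
% standard deviation at x, f_{\min} is the best objective value observed so far,
% and \Phi, \phi are the standard normal CDF and PDF.
\[
  \mathrm{EI}(x) = \bigl(f_{\min} - \hat{y}(x)\bigr)\,
    \Phi\!\left(\frac{f_{\min} - \hat{y}(x)}{s(x)}\right)
  + s(x)\,\phi\!\left(\frac{f_{\min} - \hat{y}(x)}{s(x)}\right),
  \qquad s(x) > 0 .
\]
```

Broadly speaking, the multiobjective versions of this criterion and of the probability of improvement extend the same idea by integrating the Kriging predictive distributions over the part of the objective space that improves on (i.e., is not dominated by) the current Pareto front.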


Notes

  1. Improving the overall accuracy of the surrogate model (space-filling).

  2. Enhancing the accuracy of the surrogate model solely in the region of the (current) optimum.

  3. Improving or augmenting the Pareto front (illustrated by the sketch below).
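
As an illustration of the third goal, the sketch below gives a minimal Monte Carlo estimate of a multiobjective probability-of-improvement style criterion: the probability that a candidate point's objective vector, drawn from independent Gaussian (Kriging) predictive distributions, is not dominated by the current Pareto front. This is not the authors' fast exact calculation; the function names, the independence assumption and the sampling scheme are illustrative assumptions only.

```python
# Minimal, illustrative Monte Carlo sketch of a multiobjective probability-of-
# improvement style criterion (hypothetical names; not the paper's exact method).
import numpy as np


def dominates(a, b):
    """Return True if objective vector a Pareto-dominates b (minimization)."""
    return bool(np.all(a <= b) and np.any(a < b))


def mc_probability_of_improvement(means, stds, pareto_front, n_samples=10000, seed=0):
    """Monte Carlo estimate of the probability that a candidate point improves
    on (is not dominated by) the current Pareto front.

    means, stds  : shape (n_objectives,) Kriging predictive mean / std at the candidate.
    pareto_front : shape (n_points, n_objectives) objective vectors of the current front.
    """
    rng = np.random.default_rng(seed)
    # Draw candidate objective vectors from independent Gaussian predictions.
    samples = rng.normal(means, stds, size=(n_samples, len(means)))
    # A sample counts as an improvement if no point of the front dominates it.
    improved = [not any(dominates(p, s) for p in pareto_front) for s in samples]
    return float(np.mean(improved))


# Hypothetical usage: two objectives, three points on the current Pareto front.
if __name__ == "__main__":
    means = np.array([0.4, 0.6])
    stds = np.array([0.1, 0.2])
    front = np.array([[0.2, 0.9], [0.5, 0.5], [0.9, 0.2]])
    print(mc_probability_of_improvement(means, stds, front))
```

The contribution of the paper is a fast, exact calculation of such multiobjective criteria rather than a sampling-based estimate of this kind.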


Acknowledgments

This work was supported by the Fund for Scientific Research in Flanders (FWO-Vlaanderen). Ivo Couckuyt and Dirk Deschrijver are post-doctoral research fellows of FWO-Vlaanderen. This research has been partially funded by the Interuniversity Attraction Poles Program BESTCOM, initiated by the Belgian Science Policy Office.

Author information

Correspondence to Ivo Couckuyt.


Cite this article

Couckuyt, I., Deschrijver, D. & Dhaene, T. Fast calculation of multiobjective probability of improvement and expected improvement criteria for Pareto optimization. J Glob Optim 60, 575–594 (2014). https://doi.org/10.1007/s10898-013-0118-2
