
Targeting solutions in Bayesian multi-objective optimization: sequential and batch versions

  • David Gaudrie
  • Rodolphe Le Riche
  • Victor Picheny
  • Benoît Enaux
  • Vincent Herbert

Abstract

Multi-objective optimization aims at finding trade-off solutions to conflicting objectives; these solutions constitute the Pareto optimal set. When the objective functions are expensive to evaluate, recovering the entire set is impractical and often uninformative. Since an end-user typically prefers a specific part of the objective space, we modify the Bayesian multi-objective optimization algorithm, which relies on Gaussian process surrogates and maximizes the Expected Hypervolume Improvement, so that the search focuses on the preferred region. The combined effects of the Gaussian processes and the targeting strategy lead to a particularly efficient convergence towards the desired part of the Pareto set. To take advantage of parallel computing, a multi-point extension of the targeting criterion is proposed and analyzed.
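To make the mechanism concrete, the following minimal sketch (Python, using numpy and scikit-learn) runs a Bayesian bi-objective loop in which one Gaussian process is fitted per objective and the next design is chosen by maximizing a Monte Carlo estimate of the Expected Hypervolume Improvement truncated to a user-preferred box of the objective space. It is only an illustrative stand-in for the paper's targeting strategy: the toy problem `objectives`, the preferred box `target_lower`/`target_upper`, the random candidate pool, and all parameter values are assumptions, and neither the closed-form EHVI nor the batch criterion analyzed in the article is implemented.

```python
# Minimal, self-contained sketch (NOT the authors' implementation) of a Bayesian
# bi-objective loop: one Gaussian process per objective; the next evaluation is
# chosen by maximizing a Monte Carlo estimate of the Expected Hypervolume
# Improvement (EHVI) truncated to a user-preferred box of the objective space.
# The toy problem, the box [target_lower, target_upper] and all settings are
# illustrative assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def objectives(x):
    """Toy bi-objective problem (minimization): squared distances to two anchors."""
    a, b = np.array([0.2, 0.2]), np.array([0.8, 0.8])
    return np.array([np.sum((x - a) ** 2), np.sum((x - b) ** 2)])

def pareto_front(Y):
    """Non-dominated subset of Y (minimization, two objectives)."""
    keep = [i for i, y in enumerate(Y)
            if not any(np.all(z <= y) and np.any(z < y) for z in Y)]
    return Y[keep]

def hypervolume_2d(points, lower, upper):
    """Hypervolume dominated by `points`, restricted to the target box [lower, upper]."""
    pts = np.maximum(points, lower)             # clip to the preferred region
    pts = pts[np.all(pts < upper, axis=1)]      # discard points outside the box
    if len(pts) == 0:
        return 0.0
    pts = pareto_front(pts)
    pts = pts[np.argsort(pts[:, 0])]            # f1 ascending, hence f2 descending
    f1 = np.append(pts[:, 0], upper[0])
    return sum((f1[i + 1] - f1[i]) * (upper[1] - pts[i, 1]) for i in range(len(pts)))

def targeted_ehvi_mc(mu, sd, front, lower, upper, n_samples=64):
    """Monte Carlo estimate of the box-truncated EHVI at one candidate point."""
    base = hypervolume_2d(front, lower, upper)
    samples = rng.normal(mu, sd, size=(n_samples, 2))
    return np.mean([hypervolume_2d(np.vstack([front, s]), lower, upper) - base
                    for s in samples])

dim = 2
X = rng.uniform(size=(8, dim))                  # initial design
Y = np.array([objectives(x) for x in X])
target_lower = np.array([0.0, 0.0])             # preferred (target) region of the
target_upper = np.array([0.6, 0.6])             # objective space -- an assumption

for _ in range(10):
    # One independent GP surrogate per objective.
    gps = [GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
           .fit(X, Y[:, j]) for j in range(2)]
    front = pareto_front(Y)
    cand = rng.uniform(size=(256, dim))         # random candidate pool (no inner optimizer)
    mus, sds = zip(*(gp.predict(cand, return_std=True) for gp in gps))
    mu, sd = np.column_stack(mus), np.column_stack(sds)
    scores = [targeted_ehvi_mc(mu[i], sd[i], front, target_lower, target_upper)
              for i in range(len(cand))]
    x_next = cand[int(np.argmax(scores))]       # candidate with highest targeted EHVI
    X = np.vstack([X, x_next])
    Y = np.vstack([Y, objectives(x_next)])

final_front = pareto_front(Y)
print("Non-dominated points found inside the target box:")
print(final_front[np.all(final_front < target_upper, axis=1)])
```

A batch (multi-point) variant in the spirit of the article would score several candidates jointly, for instance by sampling the joint posterior over the whole batch or by a kriging-believer heuristic, instead of ranking candidates one at a time.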

Keywords

Gaussian processes · Bayesian optimization · Computer experiments · Preference-based optimization · Parallel optimization

Mathematics Subject Classification (2010)

65Kxx 



Acknowledgments

This research was performed within the framework of a CIFRE grant (convention #2016/0690) established between the ANRT and the Groupe PSA for the doctoral work of David Gaudrie.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Groupe PSA, Vélizy-Villacoublay, France
  2. CNRS LIMOS, École Nationale Supérieure des Mines de Saint-Étienne, Saint-Étienne, France
  3. Prowler.io, Cambridge, UK
