Journal of Global Optimization, Volume 75, Issue 4, pp 1079–1109

Surrogate-assisted Bounding-Box approach for optimization problems with tunable objectives fidelity

  • M. Rivier
  • P. M. Congedo
Article

Abstract

In this work, we present a novel framework for multi-objective optimization with expensive objective functions computed at tunable fidelity. This situation is typical of many engineering optimization problems, for example when the simulator relies on Monte Carlo sampling or on an iterative solver. The objectives can only be estimated, with an accuracy that depends on the computational resources allocated by the user. We propose a heuristic for allocating these resources efficiently, so as to recover an accurate Pareto front at low computational cost. The approach is independent of the choice of optimizer and overall very flexible for the user. The framework is based on the concept of Bounding-Box, in which the estimation error is represented as an interval (in one-dimensional problems) or a product of intervals (in multi-dimensional problems) around the estimated value, naturally allowing the computation of an approximated Pareto front. This approach is then supplemented by the construction of a surrogate model on the estimated objective values. We first study the convergence of the approximated Pareto front toward the true continuous one under certain hypotheses. Second, a numerical algorithm is proposed and tested on several numerical test cases.
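The Bounding-Box comparison underlying the abstract can be sketched in a few lines. The sketch below is an illustrative assumption, not the paper's exact definition: it uses a conservative (worst-case) dominance rule for minimization, where one box dominates another only when its upper bound is at least as good as the other's lower bound in every objective. Boxes with overlapping intervals remain incomparable and both survive into the approximated Pareto front.

```python
from typing import List, Tuple

# A box is one (estimated value, error half-width) pair per objective.
Box = List[Tuple[float, float]]

def box_dominates(a: Box, b: Box) -> bool:
    """Worst-case dominance for minimization: a dominates b only if
    a's upper bound is no worse than b's lower bound in every objective,
    and strictly better in at least one."""
    no_worse = all(ca + ea <= cb - eb for (ca, ea), (cb, eb) in zip(a, b))
    strictly = any(ca + ea < cb - eb for (ca, ea), (cb, eb) in zip(a, b))
    return no_worse and strictly

def approximate_pareto(boxes: List[Box]) -> List[Box]:
    """Keep every box that no other box dominates."""
    return [b for b in boxes
            if not any(box_dominates(a, b) for a in boxes if a is not b)]
```

With tight boxes this rule reduces to ordinary Pareto dominance; as the error intervals widen, fewer pairs are comparable, which is what motivates spending more computational resources (tighter boxes) only on the candidates near the front.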

Keywords

Multi-objective optimization · Uncertainty-based optimization · Error Bounding-Boxes · Tunable fidelity · Surrogate-assisting strategy


Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. Team DeFI, Inria Saclay Île-de-France, Palaiseau, France
  2. ArianeGroup, Le Haillan Cedex, France
