Journal of Global Optimization, Volume 75, Issue 4, pp 1079–1109

Surrogate-assisted Bounding-Box approach for optimization problems with tunable objectives fidelity

  • M. Rivier
  • P. M. Congedo


In this work, we present a novel framework for multi-objective optimization with expensive objective functions computed at tunable fidelity. This setting is typical of many engineering optimization problems, for example those relying on Monte Carlo simulators or on iterative solvers: the objectives can only be estimated, with an accuracy that depends on the computational resources allocated by the user. We propose a heuristic for allocating these resources efficiently, so as to recover an accurate Pareto front at low computational cost. The approach is independent of the choice of the optimizer and, overall, very flexible for the user. The framework is based on the concept of Bounding-Box, where the estimation error is represented as an interval (in one-dimensional problems) or a product of intervals (in multi-dimensional problems) around the estimated value, naturally allowing the computation of an approximated Pareto front. This approach is then supplemented by the construction of a surrogate model on the estimated objective values. We first study the convergence of the approximated Pareto front toward the true continuous one under some hypotheses. Second, a numerical algorithm is proposed and tested on several numerical test cases.
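The Bounding-Box idea described above can be illustrated with a short sketch. Each design is represented by its estimated objective values together with per-objective error half-widths, so every objective lies in an interval [f − e, f + e]. One conservative dominance rule (an illustrative choice for a minimization problem, not necessarily the paper's exact definition) declares that box A dominates box B only when A's upper bound does not exceed B's lower bound in every objective:

```python
def dominates(a, b):
    """Conservative box dominance for minimization.

    a, b: lists of (estimate, half_width) pairs, one per objective.
    A dominates B when A's interval upper bound is <= B's interval
    lower bound in all objectives, strictly in at least one.
    """
    no_worse = all(fa + ea <= fb - eb for (fa, ea), (fb, eb) in zip(a, b))
    strictly = any(fa + ea < fb - eb for (fa, ea), (fb, eb) in zip(a, b))
    return no_worse and strictly


def box_pareto_front(points):
    """Approximated Pareto front: keep every box not dominated by another."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]
```

With this rule, tightening the boxes (i.e., spending more computational resources on a design) can only resolve more dominance relations, which is what allows the approximated front to converge toward the true one as the error intervals shrink.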


Keywords: Multi-objective optimization · Uncertainty-based optimization · Error Bounding-Boxes · Tunable fidelity · Surrogate-assisting strategy




Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. Team DeFI, Inria Saclay Île-de-France, Palaiseau, France
  2. ArianeGroup, Le Haillan Cedex, France
