
DMulti-MADS: mesh adaptive direct multisearch for bound-constrained blackbox multiobjective optimization

Published in: Computational Optimization and Applications

Abstract

The context of this research is multiobjective optimization, where conflicting objectives are present. In this work, these objectives are available only as the outputs of a blackbox for which no derivative information is available. This work proposes a new extension of the mesh adaptive direct search (MADS) algorithm to multiobjective derivative-free optimization with bound constraints. This method does not aggregate objectives and keeps a list of nondominated points which converges to a (local) Pareto set as the algorithm unfolds. As in the single-objective MADS algorithm, this method is built around a search step and a poll step. Under classical direct search assumptions, it is proved that the so-called DMulti-MADS algorithm generates multiple subsequences of iterates which converge to a set of local Pareto stationary points. Finally, computational experiments suggest that this approach is competitive with state-of-the-art algorithms for multiobjective blackbox optimization.
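The list of nondominated points at the heart of the method can be illustrated with a minimal Pareto filter. This is a sketch, not the paper's implementation: the function names and list representation are illustrative, and minimization of all objectives is assumed.

```python
from typing import List

def dominates(f: List[float], g: List[float]) -> bool:
    """f Pareto-dominates g: f is no worse in every objective
    and strictly better in at least one (minimization assumed)."""
    return all(a <= b for a, b in zip(f, g)) and any(a < b for a, b in zip(f, g))

def nondominated(points: List[List[float]]) -> List[List[float]]:
    """Filter a list of objective vectors, keeping only those
    not dominated by any other point in the list."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]
```

In an algorithm of this kind, newly evaluated points would be merged into such a list at each iteration, so that the list converges towards a (local) Pareto set.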



References

  1. Abramson, M.A., Audet, C., Dennis, J.E., Jr., Le Digabel, S.: OrthoMADS: a deterministic MADS instance with orthogonal directions. SIAM J. Optim. 20(2), 948–966 (2009)

  2. Audet, C., Bigeon, J., Cartier, D., Le Digabel, S., Salomon, L.: Performance indicators in multiobjective optimization. Eur. J. Oper. Res. 292(2), 397–422 (2021). https://doi.org/10.1016/j.ejor.2020.11.016

  3. Audet, C., Dennis, J.E., Jr.: Analysis of generalized pattern searches. SIAM J. Optim. 13(3), 889–903 (2003)

  4. Audet, C., Dennis, J.E., Jr.: Mesh adaptive direct search algorithms for constrained optimization. SIAM J. Optim. 17(1), 188–217 (2006)

  5. Audet, C., Dennis, J.E., Jr.: A progressive barrier for derivative-free nonlinear programming. SIAM J. Optim. 20(1), 445–472 (2009)

  6. Audet, C., Hare, W.: Derivative-free and blackbox optimization. Springer Series in Operations Research and Financial Engineering. Springer International Publishing, Cham, Switzerland (2017)

  7. Audet, C., Ianni, A., Le Digabel, S., Tribes, C.: Reducing the number of function evaluations in mesh adaptive direct search algorithms. SIAM J. Optim. 24(2), 621–642 (2014)

  8. Audet, C., Le Digabel, S., Tribes, C.: Dynamic scaling in the mesh adaptive direct search algorithm for blackbox optimization. Optim. Eng. 17(2), 333–358 (2016)

  9. Audet, C., Le Digabel, S., Tribes, C.: The mesh adaptive direct search algorithm for granular and discrete variables. SIAM J. Optim. 29(2), 1164–1189 (2019)

  10. Audet, C., Savard, G., Zghal, W.: Multiobjective optimization through a series of single-objective formulations. SIAM J. Optim. 19(1), 188–210 (2008)

  11. Audet, C., Savard, G., Zghal, W.: A mesh adaptive direct search algorithm for multiobjective optimization. Eur. J. Oper. Res. 204(3), 545–556 (2010)

  12. Audet, C., Tribes, C.: Mesh-based Nelder-Mead algorithm for inequality constrained optimization. Comput. Optim. Appl. 71(2), 331–352 (2018)

  13. Blank, J., Deb, K.: PYMOO - multi-objective optimization in python. IEEE Access 8, 89497–89509 (2020)

  14. Brás, C.P., Custódio, A.L.: On the use of polynomial models in multiobjective directional direct search. Comput. Optim. Appl. 77, 897–918 (2020)

  15. Brockhoff, D., Tran, T.D., Hansen, N.: Benchmarking numerical multiobjective optimizers revisited. In: Proceedings of the 2015 Annual Conference on Genetic and Evolutionary Computation, GECCO '15, pp. 639–646. ACM, New York (2015)

  16. Clarke, F.H.: Optimization and Nonsmooth Analysis. John Wiley & Sons, New York, 1983. Reissued in 1990 by SIAM Publications, Philadelphia, as Vol. 5 in the series Classics in Applied Mathematics

  17. Cocchi, G., Liuzzi, G., Papini, A., Sciandrone, M.: An implicit filtering algorithm for derivative-free multiobjective optimization with box constraints. Comput. Optim. Appl. 69(2), 267–296 (2018)

  18. Collette, Y., Siarry, P.: Optimisation multiobjectif. Eyrolles, (2002)

  19. Conn, A.R., Le Digabel, S.: Use of quadratic models with mesh-adaptive direct search for constrained black box optimization. Optim. Methods Softw. 28(1), 139–158 (2013)

  20. Conn, A.R., Scheinberg, K., Vicente, L.N.: Introduction to derivative-free optimization. MOS-SIAM Series on Optimization. SIAM, Philadelphia (2009)

  21. Custódio, A.L., Madeira, J.F.A.: Multiglods: global and local multiobjective optimization using direct search. J. Glob. Optim. 72(2), 323–345 (2018)

  22. Custódio, A.L., Madeira, J.F.A., Vaz, A.I.F., Vicente, L.N.: Direct multisearch for multiobjective optimization. SIAM J. Optim. 21(3), 1109–1140 (2011)

  23. Deb, K., Miettinen, K.: Multiobjective optimization: interactive and evolutionary approaches, volume 5252. Springer Science & Business Media, (2008)

  24. Deb, K., Pratap, A., Agarwal, S., Meyarivan, T.: A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans. Evolut. Comput. 6(2), 182–197 (2002)

  25. Ehrgott, M.: Multicriteria Optimization. Lecture notes in economics and mathematical systems, vol. 491, 2nd edn. Springer, Berlin (2005)

  26. Fonseca, C.M., Paquete, L., Lopez-Ibanez, M.: An improved dimension-sweep algorithm for the hypervolume indicator. In: 2006 IEEE International Conference on Evolutionary Computation, pp. 1157–1163. IEEE (2006)

  27. Gould, N., Scott, J.: A note on performance profiles for benchmarking software. ACM Trans. Math. Softw. 43(2), 1–5 (2016)

  28. Hasanoglu, M.S., Dolen, M.: Multi-objective feasibility enhanced particle swarm optimization. Eng. Optim. 50(12), 2013–2037 (2018)

  29. Knowles, J.: ParEGO: a hybrid algorithm with on-line landscape approximation for expensive multiobjective optimization problems. IEEE Trans. Evolut. Comput. 10(1), 50–66 (2006)

  30. Li, M., Yao, X.: Quality evaluation of solution sets in multiobjective optimisation: a survey. ACM Comput. Surv. 52(2), 26:1–26:38 (2019)

  31. Liuzzi, G., Lucidi, S., Rinaldi, F.: A derivative-free approach to constrained multiobjective nonsmooth optimization. SIAM J. Optim. 26(4), 2744–2774 (2016)

  32. Moré, J.J., Wild, S.M.: Benchmarking derivative-free optimization algorithms. SIAM J. Optim. 20(1), 172–191 (2009)

  33. Müller, J.: SOCEMO: surrogate optimization of computationally expensive multiobjective problems. Informs J. Comput. 29(4), 581–596 (2017)

  34. Regis, R.G.: Multi-objective constrained black-box optimization using radial basis function surrogates. J. Comput. Sci. 16, 140–155 (2016)

  35. Ryu, J., Kim, S.: A derivative-free trust-region method for biobjective optimization. SIAM J. Optim. 24(1), 334–362 (2014)

  36. Sayın, S.: Measuring the quality of discrete representations of efficient sets in multiple objective mathematical programming. Math. Program. 87(3), 543–560 (2000)

  37. Sergeyev, Y.D., Kvasov, D.E., Mukhametzhanov, M.S.: Operational zones for comparing metaheuristic and deterministic one-dimensional global optimization algorithms. Math. Comput. Simul. 141, 96–109 (2017)

  38. Sergeyev, Y.D., Kvasov, D.E., Mukhametzhanov, M.S.: On the efficiency of nature-inspired metaheuristics in expensive global optimization with limited budget. Sci. Rep. 8(1), 453 (2018)

  39. Torczon, V.: On the convergence of pattern search algorithms. SIAM J. Optim. 7(1), 1–25 (1997)

  40. Zitzler, E., Thiele, L.: Multiobjective optimization using evolutionary algorithms — a comparative case study. In Agoston E. Eiben, Thomas Bäck, Marc Schoenauer, and Hans-Paul Schwefel, editors, Parallel Problem Solving from Nature — PPSN V, pages 292–301, Berlin, Heidelberg, 1998. Springer Berlin Heidelberg

  41. Zitzler, E., Thiele, L., Laumanns, M., Fonseca, C.M., da Fonseca, V.G.: Performance assessment of multiobjective optimizers: an analysis and review. IEEE Trans. Evolut. Comput. 7(2), 117–132 (2003)

Acknowledgements

The authors would like to thank Professor Ana Luísa Custódio (Universidade Nova de Lisboa) for interesting discussions on derivative-free multiobjective optimization.

Author information

Corresponding author

Correspondence to Sébastien Le Digabel.

Additional information


This work is partly supported by the NSERC CRD RDCPJ 490744-15 grant and by an InnovÉÉ grant, both in collaboration with Hydro-Québec and Rio Tinto.

Appendices

Appendix A: A study of the influence of the integer parameter \(w^+\) on the performance of the DMulti-MADS algorithm

This appendix presents the experiments which led to the choice of the value of the integer parameter \(w^+\). For readability, only the DMulti-MADS variants without opportunistic polling and with the spread strategy are considered. For the set of \(w^+\) values presented here, observations similar to those in Subsection 6.3 can be made concerning the influence of opportunistic polling and the spread strategy on the practical performance of the DMulti-MADS algorithm. Thus, the variants compared here are the best ones for each considered \(w^+\) value. They use the same settings as those described at the beginning of Subsection 6.3.

Fig. 15
Data profiles obtained on \(10\) replications from \(100\) multiobjective optimization problems from [22] for DMulti-MADS variants with strict success strategy without opportunistic polling and with spread strategy for tolerance \(\varepsilon _\tau \in \{10^{-2}, 5 \times 10^{-2}, 10^{-1}\}\)

Figure 15 presents data profiles of the DMulti-MADS strict success strategy variants for several integer values of \(w^+\). Fixing \(w^+\) to \(0\) means that only points with the maximum frame size parameter in the current Pareto front approximation can be selected as current poll centers. When \(w^+\) is high (for example \(w^+ \in \{10,14,20\}\)), the poll selection of the algorithm is similar to that of the DMS algorithm: all points of the current incumbent list can be chosen as poll centers. From Fig. 15, one can note that allowing only points with maximum frame size parameters to be selected as poll centers greatly decreases the performance of DMulti-MADS for all considered tolerances. However, removing all restrictions on the choice of the current incumbent, as long as it belongs to the current incumbent list, is not the best-performing option either (for example with \(w^+ \in \{10,14,20\}\)). Indeed, for the lowest tolerance (see Fig. 15a), the data profiles reveal that strict success strategy variants with high values of \(w^+ \in \{10,14,20\}\) solve slightly fewer problems than the choice \(w^+ = 5\). For higher tolerances, a value \(w^+ \in \{3,5\}\) is preferable, as shown in Fig. 15b, c.
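The role of \(w^+\) described above can be sketched as a simple filter on the incumbent list. This is only one plausible reading, assuming each incumbent carries an integer mesh index \(l\) (a smaller index meaning a larger frame size parameter); the function name and tuple representation are illustrative, not the paper's code.

```python
from typing import List, Tuple

def candidate_poll_centers(incumbents: List[Tuple[str, int]],
                           w_plus: int) -> List[str]:
    """Select candidate poll centers from the current incumbent list.

    Each incumbent is a (label, mesh_index) pair, where a smaller mesh
    index corresponds to a larger frame size parameter.  With w_plus = 0,
    only points sharing the largest frame size qualify; a sufficiently
    large w_plus admits the whole list, mimicking the DMS selection.
    """
    l_min = min(l for _, l in incumbents)
    return [name for name, l in incumbents if l <= l_min + w_plus]
```

With this reading, the experiments above amount to tuning how far below the largest frame size a point may sit while still being polled.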

Fig. 16
Data profiles obtained on \(10\) replications from \(100\) multiobjective optimization problems from [22] for DMulti-MADS variants with DMS strategy with tolerance \(\varepsilon _\tau \in \{10^{-2}, 5 \times 10^{-2}, 10^{-1}\}\)

One can make similar observations when comparing DMulti-MADS variants with the DMS success strategy for different \(w^+\) values, as shown in Fig. 16. For the DMS success strategy, an integer value of \(w^+\) between \(3\) and \(5\) yields better performance.

Appendix B: Comparing DMulti-MADS with other algorithms via performance profiles

This appendix reports comparison results between the two best DMulti-MADS variants and the other multiobjective solvers BiMADS, DMS, MOIF and NSGA-II in terms of the purity metric and the \(\Gamma\) and \(\Delta\) spread metrics, as described and used in [22]. All settings for running the solvers are the same as those described in Subsection 6.4. Specifically, the maximum allowed number of evaluations for each problem and each solver is fixed at \(30,000\). Algorithms are compared in pairs (see [22, 27] for an explanation).

Fig. 17
Purity performance profiles using NOMAD (BiMADS), DMS, DMulti-MADS, MOIF and NSGA-II obtained on \(100\) multiobjective optimization problems (\(69\) with \(m=2\), \(29\) with \(m=3\) and \(2\) with \(m=4\)) from [22] with \(50\) different runs of NSGA-II

Looking at Fig. 17, one can observe that the two variants of DMulti-MADS are more efficient in terms of purity than DMS and MOIF. On the contrary, they are less efficient than BiMADS (with and without models) and NSGA-II (worst and best versions). This can be explained by the fact that BiMADS generates more points in the Pareto front reference, due to its scalarization approach, whereas DMulti-MADS generates points that are close to the Pareto front reference but not part of it. Concerning NSGA-II, a closer look at the runs shows that all deterministic solvers can stop before exhausting the whole budget of evaluations (because the solver reaches a stopping threshold), which can prevent them from exploring potentially interesting areas of the objective space. NSGA-II always exploits its full budget of evaluations, which allows it to generate more points in the Pareto front reference, and consequently to obtain a better purity metric.
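The purity metric discussed above can be sketched as follows, following the idea in [22] that a solver is scored by the fraction of its generated points that belong to the combined reference Pareto front; the function name and tuple representation are illustrative, not the exact benchmarking code.

```python
from typing import List, Tuple

def purity(approx: List[Tuple[float, ...]],
           reference: List[Tuple[float, ...]]) -> float:
    """Purity of a solver's Pareto front approximation: the fraction of
    its points that also appear in the reference front.  Higher is
    better; 1.0 means every generated point lies on the reference."""
    ref = set(reference)
    return sum(1 for p in approx if p in ref) / len(approx)
```

This makes the explanation above concrete: a solver that generates many points close to, but not on, the reference front is penalized even if its approximation is visually good.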

Fig. 18
\(\Delta\) spread performance profiles using NOMAD (BiMADS), DMS, DMulti-MADS, MOIF and NSGA-II obtained on \(100\) multiobjective optimization problems (\(69\) with \(m=2\), \(29\) with \(m=3\) and \(2\) with \(m=4\)) from [22] with \(50\) different runs of NSGA-II

In terms of the \(\Delta\) spread metric results reported in Fig. 18, one can see that the two DMulti-MADS variants perform slightly worse than the other algorithms. For DMS and MOIF, the use of coordinate directions seems to play a role in the distribution of their generated points on this set of problems. However, the use of dense directions enables DMulti-MADS, contrary to DMS and MOIF, to find new nondominated points, as shown in Fig. 17.

Fig. 19
\(\Gamma\) spread performance profiles using NOMAD (BiMADS), DMS, DMulti-MADS with DMS strategy, MOIF and NSGA-II obtained on \(100\) multiobjective optimization problems (\(69\) with \(m=2\), \(29\) with \(m=3\) and \(2\) with \(m=4\)) from [22] with \(50\) different runs of NSGA-II

In terms of the \(\Gamma\) spread metric, both variants of DMulti-MADS generate less dense Pareto front approximations than BiMADS and NSGA-II, as shown in Fig. 19. The DMulti-MADS variant with the DMS strategy compares better against MOIF and DMS than the variant with the strict success strategy. Thus, for a large budget of evaluations, the DMulti-MADS variant with the DMS strategy generates denser Pareto front approximations.
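As a rough illustration of what the \(\Gamma\) spread metric measures, the following sketch computes, for a biobjective front, the largest gap between consecutive points along each objective. This is a simplified stand-in for the metric of [22], with an illustrative name, not the exact definition used in the experiments.

```python
from typing import List, Tuple

def gamma_spread(front: List[Tuple[float, float]]) -> float:
    """Largest hole in a biobjective Pareto front approximation: for
    each objective, sort the points and take the largest distance
    between consecutive values; return the maximum over the two
    objectives.  Smaller values indicate a denser front."""
    gaps = []
    for j in range(2):
        vals = sorted(p[j] for p in front)
        gaps.append(max(b - a for a, b in zip(vals, vals[1:])))
    return max(gaps)
```

Under this measure, a front with evenly spaced points scores better than one of the same size with a large unexplored region, which is what "denser Pareto front approximations" refers to above.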

About this article

Cite this article

Bigeon, J., Le Digabel, S. & Salomon, L. DMulti-MADS: mesh adaptive direct multisearch for bound-constrained blackbox multiobjective optimization. Comput Optim Appl 79, 301–338 (2021). https://doi.org/10.1007/s10589-021-00272-9

