A variance-based method to rank input variables of the Mesh Adaptive Direct Search algorithm

  • Original Paper
  • Published in Optimization Letters

Abstract

The Mesh Adaptive Direct Search (Mads) algorithm is designed for nonsmooth blackbox optimization problems in which the functions defining the problem are expensive to evaluate. Mads is not designed for problems with a large number of variables. The present paper uses a statistical tool based on variance decomposition to rank the relative importance of the variables. This statistical method is then coupled with Mads so that the optimization is performed either in the entire space of variables or in subspaces associated with the statistically important variables. The resulting algorithm, called Stats-Mads, is tested on bound-constrained test problems with up to 500 variables. The numerical results show a significant improvement in the objective function value after a fixed budget of function evaluations.
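
As a rough illustration of the idea summarized above, and not the authors' Stats-Mads implementation, the sketch below ranks variables by an estimated first-order variance contribution (a McKay-style nonparametric variance decomposition) and then alternates a search restricted to the top-ranked variables with a short full-space pass. The helper names (variance_ranking, subspace_search), the toy objective, the sampling and bin counts, the keep_ratio, and the use of SciPy's Nelder-Mead as a stand-in for a MADS instance are all illustrative assumptions.

```python
# Minimal sketch, assuming a bound-constrained blackbox f over [lb, ub]:
# rank variables by an estimated share of output variance V(E[f|x_i])/V(f),
# then alternate subspace and full-space minimization. Nelder-Mead is a
# stand-in for a MADS instance (bounds require SciPy >= 1.7).
import numpy as np
from scipy.optimize import minimize


def variance_ranking(f, lb, ub, n_samples=200, n_bins=10, seed=0):
    """Estimate each variable's share of output variance from random samples."""
    rng = np.random.default_rng(seed)
    n = len(lb)
    X = rng.uniform(lb, ub, size=(n_samples, n))
    y = np.array([f(x) for x in X])
    total_var = y.var()
    scores = np.zeros(n)
    for i in range(n):
        # Bin the i-th coordinate and measure the variance of conditional means.
        edges = np.linspace(lb[i], ub[i], n_bins + 1)
        bins = np.clip(np.digitize(X[:, i], edges) - 1, 0, n_bins - 1)
        cond_means = [y[bins == b].mean() for b in range(n_bins) if np.any(bins == b)]
        scores[i] = np.var(cond_means) / total_var if total_var > 0 else 0.0
    return scores


def subspace_search(f, x0, lb, ub, keep_ratio=0.1, outer_iters=3):
    """Alternate minimization over the top-ranked variables with a brief
    full-space pass, mimicking the two kinds of phases described above."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(outer_iters):
        scores = variance_ranking(f, lb, ub)
        k = max(1, int(keep_ratio * len(x)))
        idx = np.argsort(scores)[-k:]  # indices of the most influential variables

        def f_sub(z, x=x.copy(), idx=idx):
            x[idx] = z  # vary only the selected coordinates
            return f(x)

        res = minimize(f_sub, x[idx], method="Nelder-Mead",
                       bounds=[(lb[i], ub[i]) for i in idx])
        x[idx] = res.x

        # Short full-space pass over all variables.
        res_full = minimize(f, x, method="Nelder-Mead",
                            bounds=list(zip(lb, ub)),
                            options={"maxfev": 20 * len(x)})
        x = res_full.x
    return x, f(x)


if __name__ == "__main__":
    # Toy bound-constrained problem: only the first 5 of 50 variables matter much.
    n = 50
    f = lambda x: float(np.sum(x[:5] ** 2) + 1e-3 * np.sum(np.cos(x[5:])))
    lb, ub = -5.0 * np.ones(n), 5.0 * np.ones(n)
    x_best, f_best = subspace_search(f, np.full(n, 3.0), lb, ub)
    print(f_best)
```

On problems where a small group of coordinates drives most of the output variance, concentrating the evaluation budget on that group is what motivates the subspace phases; the occasional full-space pass guards against discarding variables whose influence was underestimated.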

Acknowledgments

Work of C. Audet was partially supported by NSERC Discovery Grant 239436 and AFOSR grants FA9550-09-1-0160 and FA9550-12-1-0198.

Author information

Corresponding author

Correspondence to Charles Audet.


About this article

Cite this article

Adjengue, L., Audet, C. & Ben Yahia, I. A variance-based method to rank input variables of the Mesh Adaptive Direct Search algorithm. Optim Lett 8, 1599–1610 (2014). https://doi.org/10.1007/s11590-013-0688-4
