Mesh adaptive direct search with simplicial Hessian update

  • Published in: Computational Optimization and Applications

Abstract

Recently, a second directional derivative-based Hessian updating formula was used for Hessian approximation in mesh adaptive direct search (MADS). Combined with a quadratic program solver, the approach significantly improves the performance of MADS. Unfortunately, it imposes strict requirements on the positions of the points and the order in which they are evaluated. This paper introduces a Hessian update formula that uses points from the neighborhood of the incumbent solution without imposing such strict restrictions. The resulting approximate Hessian can then be used to construct a quadratic model of the objective and the constraints. The proposed algorithm was compared with the reference implementation of MADS (NOMAD) on four sets of test problems and outperformed NOMAD on all but one of them, with the largest performance difference observed on constrained problems. To validate the algorithm further, the approach was tested on several real-world optimization problems arising from yield approximation and worst-case analysis in integrated circuit design. On all tested problems the proposed approach outperformed NOMAD.
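The paper's simplicial Hessian update formula itself is given in the full text. As a rough illustration of the general idea of building a quadratic model of a function from points sampled near the incumbent solution, the following is a minimal least-squares sketch; it is not the authors' update formula, and the function name and interface are hypothetical.

```python
import numpy as np

def fit_quadratic_model(X, f):
    """Least-squares fit of q(x) = c + g @ x + 0.5 * x @ H @ x to samples.

    X : (m, n) array of points near the incumbent; f : (m,) function values.
    Returns (c, g, H) with H symmetric. Needs m >= 1 + n + n*(n+1)/2 points
    for the fit to be determined.
    """
    m, n = X.shape
    cols = [np.ones(m)]                      # constant term
    cols += [X[:, i] for i in range(n)]      # linear terms
    for i in range(n):                       # quadratic terms x_i * x_j, i <= j
        for j in range(i, n):
            scale = 0.5 if i == j else 1.0   # matches the 0.5 * x @ H @ x convention
            cols.append(scale * X[:, i] * X[:, j])
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, f, rcond=None)
    c, g = coef[0], coef[1:1 + n]
    H = np.zeros((n, n))
    k = 1 + n
    for i in range(n):                       # unpack upper triangle into symmetric H
        for j in range(i, n):
            H[i, j] = H[j, i] = coef[k]
            k += 1
    return c, g, H
```

Given enough well-spread sample points, the recovered symmetric matrix `H` can serve as an approximate Hessian for a quadratic subproblem; the paper's contribution is an update that achieves this without the strict constraints on point placement and evaluation order required by the earlier second directional derivative-based formula.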




Acknowledgements

The authors acknowledge the financial support from the Slovenian Research Agency (research core funding No. P2-0246-ICT4QoL—Information and Communications Technologies for Quality of Life). The authors would also like to thank two anonymous reviewers for their constructive remarks and comments.

Author information


Corresponding author

Correspondence to Árpád Bűrmen.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Bűrmen, Á., Fajfar, I. Mesh adaptive direct search with simplicial Hessian update. Comput Optim Appl 74, 645–667 (2019). https://doi.org/10.1007/s10589-019-00133-6

