Abstract
Recently, a second-directional-derivative-based Hessian update formula was used for Hessian approximation in mesh adaptive direct search (MADS). Combined with a quadratic program solver, this approach significantly improves the performance of MADS. Unfortunately, it imposes strict requirements on the positions of points and on the order in which they are evaluated. This paper introduces a Hessian update formula that utilizes points from the neighborhood of the incumbent solution without imposing such restrictions. The resulting approximate Hessian can then be used to construct a quadratic model of the objective and the constraints. The proposed algorithm was compared with the reference implementation of MADS (NOMAD) on four sets of test problems; on all but one of them it outperformed NOMAD, with the largest performance difference observed on constrained problems. To validate the algorithm further, the approach was tested on several real-world optimization problems arising from yield approximation and worst-case analysis in integrated circuit design. On all tested problems the proposed approach outperformed NOMAD.
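To make the two ingredients mentioned above concrete, the following is a minimal generic sketch (not the paper's simplicial update formula): a central difference along a direction d estimates the second directional derivative dᵀ∇²f d, and an approximate Hessian H together with a gradient estimate g defines the local quadratic model q(s) = f(x) + gᵀs + ½ sᵀHs that a QP solver can then minimize. The function names and the test function are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def second_directional_derivative(f, x, d, h=1e-4):
    """Central-difference estimate of d^T (Hess f(x)) d along direction d."""
    return (f(x + h * d) - 2.0 * f(x) + f(x - h * d)) / h**2

def quadratic_model(f0, g, H):
    """Return the local quadratic model q(s) = f0 + g^T s + 0.5 s^T H s."""
    def q(s):
        s = np.asarray(s, dtype=float)
        return f0 + g @ s + 0.5 * s @ (H @ s)
    return q

# Illustrative test function f(x) = x0^2 + 3*x1^2 with Hessian diag(2, 6).
f = lambda x: x[0]**2 + 3.0 * x[1]**2
x = np.zeros(2)

# Curvature along the first coordinate direction; exact value is 2.
est = second_directional_derivative(f, x, np.array([1.0, 0.0]))

# Model built from the exact gradient (zero at the origin) and Hessian.
q = quadratic_model(f(x), np.zeros(2), np.diag([2.0, 6.0]))
```

For a quadratic objective the central difference is exact up to rounding, so `est` matches the true curvature 2; in general the model is only a local approximation whose quality depends on the accuracy of H.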
Acknowledgements
The authors acknowledge the financial support from the Slovenian Research Agency (research core funding No. P2-0246-ICT4QoL—Information and Communications Technologies for Quality of Life). The authors would also like to thank two anonymous reviewers for their constructive remarks and comments.
Cite this article
Bűrmen, Á., Fajfar, I. Mesh adaptive direct search with simplicial Hessian update. Comput Optim Appl 74, 645–667 (2019). https://doi.org/10.1007/s10589-019-00133-6