
Computational Optimization and Applications, Volume 72, Issue 3, pp. 525–559

Direct search based on probabilistic feasible descent for bound and linearly constrained problems

  • S. Gratton
  • C. W. Royer
  • L. N. Vicente
  • Z. Zhang

Abstract

Direct search is a methodology for derivative-free optimization whose iterations are characterized by the evaluation of the objective function along a set of polling directions. In deterministic direct search applied to smooth objectives, these directions must conform to the geometry of the feasible region, and they typically consist of positive generators of approximate tangent cones (which renders the corresponding methods globally convergent in the linearly constrained case). It is known from the unconstrained case, however, that randomly generating the polling directions leads to better complexity bounds as well as to gains in numerical efficiency, and it is then natural to consider random generation in the presence of constraints as well. In this paper, we study a class of direct-search methods based on sufficient decrease for solving smooth linearly constrained problems, where the polling directions are randomly generated in approximate tangent cones. The random polling directions must satisfy a probabilistic feasible descent property, a concept that reduces to probabilistic descent in the absence of constraints. Such a property is instrumental in establishing almost-sure global convergence and worst-case complexity bounds that hold with overwhelming probability. Numerical results show that randomizing the polling directions can be beneficial compared with standard approaches with deterministic guarantees, as suggested by the respective worst-case complexity bounds.
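
To make the methodology concrete, the sketch below implements a basic sufficient-decrease direct-search loop on a bound-constrained problem, with polling directions drawn at random on the unit sphere. It is an illustrative simplification of the class of methods studied in the paper, not the authors' algorithm: infeasible trial points are simply rejected rather than polling along positive generators of an approximate tangent cone, and the forcing function, the number of directions per iteration, and the step-size update factors are arbitrary illustrative choices.

```python
import numpy as np

def random_poll_directions(n, m, rng):
    """Draw m directions uniformly on the unit sphere in R^n (illustrative choice)."""
    d = rng.standard_normal((m, n))
    return d / np.linalg.norm(d, axis=1, keepdims=True)

def direct_search_bounds(f, x0, lb, ub, alpha0=1.0, m=2, c=1e-4,
                         alpha_min=1e-8, max_evals=2000, seed=0):
    """Sufficient-decrease direct search with randomly generated polling directions.

    Trial points violating the bounds lb <= x <= ub are discarded, which keeps
    all iterates feasible (a simplification of tangent-cone-based polling).
    """
    rng = np.random.default_rng(seed)
    x, alpha = np.clip(x0, lb, ub), alpha0
    fx, evals = f(x), 1
    while alpha > alpha_min and evals < max_evals:
        success = False
        for d in random_poll_directions(len(x), m, rng):
            y = x + alpha * d
            if np.any(y < lb) or np.any(y > ub):
                continue                       # reject infeasible trial points
            fy = f(y); evals += 1
            if fy < fx - c * alpha**2:         # sufficient (forcing-function) decrease
                x, fx, success = y, fy, True
                break
        alpha = 2.0 * alpha if success else 0.5 * alpha   # expand or contract the step size
    return x, fx

# Example: minimize a separable quadratic over the box [0, 2]^5
if __name__ == "__main__":
    n = 5
    x_star, f_star = direct_search_bounds(
        lambda x: np.sum((x - 1.5) ** 2), x0=np.zeros(n),
        lb=np.zeros(n), ub=2.0 * np.ones(n))
    print(x_star, f_star)
```

In this sketch, randomization enters only through the choice of polling directions; an iteration is declared successful as soon as one direction yields sufficient decrease, after which the step size is expanded, and otherwise it is contracted.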

Keywords

Derivative-free optimization · Direct-search methods · Bound constraints · Linear constraints · Feasible descent · Probabilistic feasible descent · Worst-case complexity

Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. IRIT, University of Toulouse, Toulouse Cedex 7, France
  2. Wisconsin Institute for Discovery, University of Wisconsin-Madison, Madison, USA
  3. CMUC, Department of Mathematics, University of Coimbra, Coimbra, Portugal
  4. Department of Applied Mathematics, The Hong Kong Polytechnic University, Kowloon, China