
A derivative-free approximate gradient sampling algorithm for finite minimax problems

Published in: Computational Optimization and Applications

Abstract

In this paper we present a derivative-free optimization algorithm for finite minimax problems. The algorithm calculates an approximate gradient for each of the active functions of the finite max function and uses these to generate an approximate subdifferential. The negative of the projection of 0 onto this set is used as a descent direction in an Armijo-like line search. We also present a robust version of the algorithm, which uses the ‘almost active’ functions of the finite max function in the calculation of the approximate subdifferential. Convergence results are presented for both algorithms, showing that either f(x^k) → −∞ or every cluster point is a Clarke stationary point. Theoretical and numerical results are presented for three specific approximate gradients: the simplex gradient, the centered simplex gradient, and the Gupal estimate of the gradient of the Steklov averaged function. A performance comparison is made between the regular and robust algorithms, the three approximate gradients, and a regular and robust stopping condition.
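The core loop described above — an approximate gradient for each active sub-function, a descent direction from the projection of 0 onto their convex hull, and an Armijo-like backtracking line search — can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the names `simplex_gradient`, `project_zero_onto_hull`, and `ags_step` are hypothetical, the parameter values are placeholders, and a Frank-Wolfe inner iteration stands in for an exact solve of the projection subproblem.

```python
import numpy as np

def simplex_gradient(f, x0, delta=1e-6):
    # Forward-difference simplex gradient of f at x0, built from function
    # values at the simplex {x0, x0 + delta*e_1, ..., x0 + delta*e_n}.
    n = len(x0)
    f0 = f(x0)
    return np.array([(f(x0 + delta * np.eye(n)[i]) - f0) / delta
                     for i in range(n)])

def project_zero_onto_hull(G, iters=200):
    # Projection of the origin onto conv{rows of G}, computed by
    # Frank-Wolfe on the convex weights lam (a simple stand-in for an
    # exact quadratic-programming solve of min ||G^T lam||^2 over the simplex).
    m = G.shape[0]
    lam = np.full(m, 1.0 / m)
    for k in range(iters):
        grad = 2.0 * G @ (G.T @ lam)   # gradient of ||G^T lam||^2 w.r.t. lam
        i = int(np.argmin(grad))       # best vertex of the simplex
        gamma = 2.0 / (k + 2.0)        # standard Frank-Wolfe step size
        lam = (1.0 - gamma) * lam
        lam[i] += gamma
    return G.T @ lam

def ags_step(fs, x, delta=1e-6, tol_active=1e-6, c=1e-4):
    # One iteration: approximate gradients of the active sub-functions,
    # descent direction from the projected origin, Armijo-like backtracking
    # (using -||d||^2 as a surrogate directional derivative).
    F = lambda y: max(fi(y) for fi in fs)
    fx = np.array([fi(x) for fi in fs])
    active = np.where(fx.max() - fx <= tol_active)[0]
    G = np.vstack([simplex_gradient(fs[i], x, delta) for i in active])
    d = -project_zero_onto_hull(G)
    t = 1.0
    while F(x + t * d) > F(x) - c * t * np.dot(d, d) and t > 1e-12:
        t *= 0.5
    return x + t * d
```

On a toy problem such as max{x₁² + x₂², (x₁ − 2)² + x₂²}, one such step from a non-stationary point reduces the max function. The robust (RAGS) variant described in the abstract would widen `tol_active` so that ‘almost active’ sub-functions also contribute rows to `G`.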



Acknowledgements

The authors would like to express their gratitude to Dr. C. Sagastizábal for inspirational conversations regarding the Goldstein subdifferential. This research was partially funded by the NSERC DG program and by UBC IRF.

Author information


Corresponding author

Correspondence to W. Hare.

Appendix

Table 2 Test set summary: problem name and number, problem dimension (N), and number of sub-functions (M)
Table 3 Average accuracy for 25 trials obtained by the AGS and RAGS algorithms for the simplex gradient
Table 4 Average accuracy for 25 trials obtained by the AGS and RAGS algorithms for the centered simplex gradient
Table 5 Average accuracy for 25 trials obtained by the AGS and RAGS algorithms for the Gupal estimate of the gradient of the Steklov averaged function

Cite this article

Hare, W., Nutini, J. A derivative-free approximate gradient sampling algorithm for finite minimax problems. Comput Optim Appl 56, 1–38 (2013). https://doi.org/10.1007/s10589-013-9547-6
