
A Discussion on Variational Analysis in Derivative-Free Optimization

Published in: Set-Valued and Variational Analysis

Abstract

Variational Analysis studies mathematical objects under small variations. In optimization, these objects are typified by representations of first-order or second-order information (gradients, subgradients, Hessians, etc.). Derivative-Free Optimization (DFO), on the other hand, studies algorithms for continuous optimization that do not use first-order information. As such, researchers might conclude that Variational Analysis plays a limited role in Derivative-Free Optimization research. In this paper, we argue the contrary by showing that many successful DFO algorithms rely heavily on tools and results from Variational Analysis.
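To illustrate the kind of algorithm DFO studies, the following is a minimal sketch (not taken from the paper, and far simpler than the mesh adaptive direct search or model-based methods it discusses) of a basic coordinate-search method: it polls the objective along each signed coordinate direction and halves the step size whenever no poll point improves on the incumbent. The function name and parameters are illustrative choices.

```python
def coordinate_search(f, x0, step=1.0, tol=1e-6, max_iter=1000):
    """Minimize f using only function values (no derivatives).

    Polls f along +/- each coordinate direction; on an unsuccessful
    poll (no improvement found), the step size is halved, mimicking
    the mesh refinement of direct-search methods.
    """
    x = list(x0)
    fx = f(x)
    n = len(x)
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        for i in range(n):
            for s in (+step, -step):
                y = x.copy()
                y[i] += s
                fy = f(y)
                if fy < fx:       # accept any improving poll point
                    x, fx = y, fy
                    improved = True
        if not improved:
            step *= 0.5           # unsuccessful poll: refine the step
    return x, fx

if __name__ == "__main__":
    # Smooth convex test problem with minimizer at (1, -2).
    f = lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2
    x, fx = coordinate_search(f, [5.0, 5.0])
    print(x, fx)
```

Even this toy method hints at why variational tools matter: its convergence analysis on nonsmooth objectives is typically phrased in terms of directional derivatives and Clarke subdifferentials, objects belonging squarely to Variational Analysis.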




Corresponding author

Correspondence to Warren Hare.


Hare’s research is partially supported by Natural Sciences and Engineering Research Council of Canada (NSERC) Discovery Grant #2018-03865.


Cite this article

Hare, W. A Discussion on Variational Analysis in Derivative-Free Optimization. Set-Valued Var. Anal 28, 643–659 (2020). https://doi.org/10.1007/s11228-020-00556-y

