
Compositions of convex functions and fully linear models

  • Original Paper
  • Published in Optimization Letters

Abstract

Derivative-free optimization (DFO) is the mathematical study of optimization algorithms that do not use derivatives. One branch of DFO focuses on model-based methods, in which an approximation of the objective function is used to guide the optimization algorithm. Proofs of convergence for such methods often rely on the assumption that the approximations form fully linear models, an assumption that requires the true objective function to be smooth. However, some recent methods have relaxed this assumption and instead work with functions that are compositions of smooth functions with simple convex functions (the max-function or the \(\ell_1\) norm). In this paper, we examine the error bounds that result from composing a convex lower semi-continuous function with a smooth vector-valued function when fully linear models are available for each component of the vector-valued function. We derive error bounds for the resulting function values and subgradient vectors.
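For concreteness, the following sketch records the standard notion of a fully linear model from the model-based DFO literature, together with the composite structure studied here; the constants \(\kappa_f, \kappa_g\) and this particular notation are illustrative assumptions rather than necessarily the paper's own. A model \(\tilde{f}\) is a fully linear model of a smooth function \(f\) on the ball \(B(x;\Delta)\) if, for all \(y \in B(x;\Delta)\),

\[ |f(y) - \tilde{f}(y)| \le \kappa_f \, \Delta^2 \quad\text{and}\quad \|\nabla f(y) - \nabla \tilde{f}(y)\| \le \kappa_g \, \Delta. \]

The composite objective then takes the form

\[ h(x) = g\big(F(x)\big), \qquad F = (f_1, \dots, f_m), \]

with \(g\) convex and lower semi-continuous, \(F\) smooth, and each component \(f_i\) admitting a fully linear model \(\tilde{f}_i\). The error bounds of the paper control \(|h(y) - g(\tilde{F}(y))|\) and the discrepancy between the subdifferentials of \(h\) and of \(g \circ \tilde{F}\) in terms of \(\Delta\).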



Acknowledgements

This research was partially funded by the Natural Sciences and Engineering Research Council (NSERC) of Canada, Discovery Grant #355571-2013.

Author information

Correspondence to W. Hare.


Cite this article

Hare, W. Compositions of convex functions and fully linear models. Optim Lett 11, 1217–1227 (2017). https://doi.org/10.1007/s11590-017-1117-x

