
A subgradient method with non-monotone line search

Computational Optimization and Applications

Abstract

In this paper, we present a subgradient method with a non-monotone line search for the minimization of convex functions with simple convex constraints. Unlike the standard subgradient method with prespecified step sizes, the new method selects the step sizes adaptively. Under mild conditions, asymptotic convergence results and iteration-complexity bounds are obtained. Preliminary numerical results illustrate the relative efficiency of the proposed method.
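To illustrate the general idea, the sketch below implements a projected subgradient iteration whose backtracking test compares the trial value against the maximum of the last few objective values (a max-type non-monotone test) instead of the current value alone. This is only a minimal Python sketch: the function names, the acceptance test, the parameter choices, and the toy problem are illustrative assumptions and are not the authors' exact algorithm.

import numpy as np


def nonmonotone_projected_subgradient(f, subgrad, project, x0,
                                      alpha0=1.0, beta=0.5, sigma=1e-4,
                                      memory=10, max_iter=200):
    # Projected subgradient iteration with a max-type non-monotone
    # acceptance test (illustrative only, not the paper's exact scheme).
    x = project(np.asarray(x0, dtype=float))
    f_hist = [f(x)]                       # recent objective values
    for _ in range(max_iter):
        g = subgrad(x)
        g_norm = np.linalg.norm(g)
        if g_norm < 1e-12:                # subgradient numerically zero
            break
        f_ref = max(f_hist[-memory:])     # non-monotone reference value
        alpha = alpha0
        while True:                       # backtrack on the step size
            x_trial = project(x - (alpha / g_norm) * g)
            # accept on sufficient decrease relative to f_ref, or when the
            # step is tiny (escape clause to guarantee termination)
            if f(x_trial) <= f_ref - sigma * alpha * g_norm or alpha < 1e-12:
                break
            alpha *= beta
        x = x_trial
        f_hist.append(f(x))
    return x


# Toy example: minimize ||x - c||_1 over the Euclidean unit ball.
c = np.array([2.0, -1.5, 0.5])
f = lambda x: np.sum(np.abs(x - c))
subgrad = lambda x: np.sign(x - c)                    # a subgradient of f
project = lambda x: x / max(1.0, np.linalg.norm(x))   # projection onto the ball
x_best = nonmonotone_projected_subgradient(f, subgrad, project, np.zeros(3))
print(x_best, f(x_best))

Comparing the trial value with the maximum of a short history of objective values, rather than with the current value alone, allows occasional increases of the objective; this is natural for subgradient methods, where the negative subgradient need not be a descent direction at the current iterate.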


Data availability

The data that support the findings of this study are available from the corresponding author upon request.

Code availability

The code that supports the findings of this study is available from the corresponding author upon request.

Notes

  1. An extensive numerical comparison between the proposed method and other non-monotone subgradient methods is beyond the scope of the present paper and is left for future work. The aim of our numerical experiments is simply to illustrate the proposed method and its properties.

  2. The latitude/longitude coordinates of the Brazilian cities can be found, for instance, at ftp://geoftp.ibge.gov.br/Organizacao/Localidades.

  3. This data set can be found at http://archive.ics.uci.edu/ml.



Acknowledgements

We would like to thank the referees for their constructive remarks, which allowed us to improve our work. O. P. Ferreira was partially supported by CNPq - Brazil Grant 304666/2021-1, G. N. Grapiglia was partially supported by CNPq - Brazil Grant 312777/2020-5, and J.C.O. Souza was supported in part by CNPq Grant 313901/2020-1. The project leading to this publication has received funding from the French government under the "France 2030" investment plan managed by the French National Research Agency (reference: ANR-17-EURE-0020) and from the Excellence Initiative of Aix-Marseille University - A*MIDEX.

Author information


Corresponding author

Correspondence to O. P. Ferreira.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Ferreira, O.P., Grapiglia, G.N., Santos, E.M. et al. A subgradient method with non-monotone line search. Comput Optim Appl 84, 397–420 (2023). https://doi.org/10.1007/s10589-022-00438-z



