
A Non-monotone Conjugate Subgradient Type Method for Minimization of Convex Functions

Published in: Journal of Optimization Theory and Applications

Abstract

We propose a conjugate subgradient type method without line search for minimizing convex non-differentiable functions. Unlike the customary methods of this class, it does not require a monotone decrease of the goal function and substantially reduces the implementation cost of each iteration. At the same time, its step-size procedure takes into account the behavior of the method along the iteration points. Preliminary computational experiments confirm the efficiency of the proposed modification.
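To illustrate the class of methods the abstract refers to, here is a minimal sketch of a *generic* conjugate-subgradient-type iteration without line search, using a simple diminishing step size. This is not the paper's specific step-size rule (which adapts to the method's behavior along the iterates); the test function `f`, the momentum parameter `beta`, and the step schedule `tau0/(k+1)` are illustrative assumptions.

```python
import numpy as np

def f(x):
    # Nonsmooth convex test function: f(x) = |x1| + 2|x2|, minimized at the origin
    return abs(x[0]) + 2.0 * abs(x[1])

def subgrad(x):
    # One subgradient of f at x (sign(0) = 0 is a valid choice here)
    return np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])

def conj_subgrad(x0, iters=500, tau0=1.0, beta=0.5):
    # Generic conjugate-subgradient iteration without line search:
    #   d_k = -g_k + beta * d_{k-1},   x_{k+1} = x_k + tau_k * d_k / ||d_k||,
    # with diminishing steps tau_k = tau0 / (k+1). The function values need
    # not decrease monotonically, so we record the best point seen so far.
    x = np.asarray(x0, dtype=float)
    d = np.zeros_like(x)
    best_x, best_f = x.copy(), f(x)
    for k in range(iters):
        g = subgrad(x)
        d = -g + beta * d
        nd = np.linalg.norm(d)
        if nd == 0.0:  # zero direction: stop
            break
        x = x + (tau0 / (k + 1)) * d / nd
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    return best_x, best_f
```

For example, `conj_subgrad([3.0, -2.0])` drives the best recorded value of `f` well below its initial value of 7 without ever evaluating `f` inside a line search, which is the implementation saving the abstract points to.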



Acknowledgements

The results of this work were obtained within the state assignment of the Ministry of Science and Education of Russia, Project No. 1.460.2016/1.4. This work was supported by Russian Foundation for Basic Research, Project No. 19-01-00431.

Corresponding author

Correspondence to Igor Konnov.

Communicated by Amir Beck.



Cite this article

Konnov, I. A Non-monotone Conjugate Subgradient Type Method for Minimization of Convex Functions. J Optim Theory Appl 184, 534–546 (2020). https://doi.org/10.1007/s10957-019-01589-6

