
Adaptive Mirror Descent Algorithms for Convex and Strongly Convex Optimization Problems with Functional Constraints

Journal of Applied and Industrial Mathematics

Abstract

Under consideration are some adaptive mirror descent algorithms for minimizing a convex objective functional under several convex Lipschitz (generally, nonsmooth) functional constraints. It is demonstrated that the methods apply to objective functionals of various levels of smoothness: the Lipschitz condition holds either for the objective functional itself or for its gradient or Hessian (while the functional itself may fail to satisfy the Lipschitz condition). The main idea is the adaptive adjustment of the method with respect to the Lipschitz constant of the objective functional (or of its gradient or Hessian), as well as the Lipschitz constant of the constraints. Two types of methods are considered: adaptive (requiring knowledge of the Lipschitz constants neither for the objective functional nor for the constraints) and partially adaptive (requiring knowledge of the Lipschitz constant for the constraints). Using the restart technique, some methods are proposed for strongly convex minimization problems. Estimates of the rate of convergence are obtained for all algorithms under consideration, depending on the level of smoothness of the objective functional. Numerical experiments illustrating the advantages of the proposed methods are presented for some examples.
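The adaptive idea described above — alternating "productive" steps along the objective's subgradient when the constraint is nearly satisfied with "non-productive" steps along the constraint's subgradient otherwise, with step sizes computed from observed subgradient norms rather than from known Lipschitz constants — can be sketched as follows. This is a minimal illustrative sketch in the simplest (Euclidean) prox setup, not the authors' exact algorithms; the function names, the specific step rule `h = eps / ||grad||^2`, and the toy problem are all assumptions made for illustration.

```python
import numpy as np

def adaptive_mirror_descent(f_grad, g, g_grad, x0, eps, max_iter=20000):
    """Sketch of an adaptive subgradient scheme for min f(x) s.t. g(x) <= 0.

    No Lipschitz constants are supplied: each step size is computed from
    the norm of the subgradient actually observed at the current point.
    """
    x = np.asarray(x0, dtype=float)
    points, weights = [], []
    for _ in range(max_iter):
        if g(x) <= eps:                    # "productive" step: use the objective
            grad, productive = f_grad(x), True
        else:                              # "non-productive" step: reduce g
            grad, productive = g_grad(x), False
        h = eps / max(grad @ grad, 1e-16)  # adaptive step, no Lipschitz constant
        if productive:
            points.append(x.copy())
            weights.append(h)
        x = x - h * grad                   # Euclidean mirror (subgradient) step
    # output: step-size-weighted average of the productive points
    return np.average(points, axis=0, weights=weights) if points else x

# Toy problem: minimize ||x - (2, 0)||^2 subject to x1 + x2 - 1 <= 0;
# the exact solution is (1.5, -0.5).
c = np.array([2.0, 0.0])
x_hat = adaptive_mirror_descent(
    f_grad=lambda x: 2.0 * (x - c),
    g=lambda x: x[0] + x[1] - 1.0,
    g_grad=lambda x: np.array([1.0, 1.0]),
    x0=np.zeros(2),
    eps=0.05,
)
```

Returning a weighted average of the productive points (rather than the last iterate) is what yields the accuracy guarantee in schemes of this kind; the accuracy and the constraint violation of the output are both of order `eps`.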



Acknowledgements

The authors are very grateful to Yu. E. Nesterov, A. V. Gasnikov and P. E. Dvurechensky for fruitful discussions and comments.

Funding

F. S. Stonyakin (analysis of Algorithms 1 and 3) was supported by the Russian Foundation for Basic Research (project no. 18-31-00219 mol_a).

Author information

Correspondence to F. S. Stonyakin, M. Alkousa, A. N. Stepanov or A. A. Titov.

Additional information

Russian Text © The Author(s), 2019, published in Diskretnyi Analiz i Issledovanie Operatsii, 2019, Vol. 26, No. 3, pp. 60–86.

About this article


Cite this article

Stonyakin, F.S., Alkousa, M., Stepanov, A.N. et al. Adaptive Mirror Descent Algorithms for Convex and Strongly Convex Optimization Problems with Functional Constraints. J. Appl. Ind. Math. 13, 557–574 (2019). https://doi.org/10.1134/S1990478919030165
