
Numerical Methods for Some Classes of Variational Inequalities with Relatively Strongly Monotone Operators


Abstract

The paper deals with a significant extension of the recently proposed class of relatively strongly convex optimization problems in high-dimensional spaces. We introduce an analog of the concept of relative strong convexity for variational inequalities (relative strong monotonicity) and study convergence rate estimates for some first-order numerical methods for problems of this type. Two classes of variational inequalities are discussed, depending on the smoothness conditions imposed on the operator: the first contains relatively bounded operators, and the second contains operators satisfying an analog of the Lipschitz condition known as relative smoothness. For variational inequalities with relatively bounded and relatively strongly monotone operators, a version of the subgradient method is studied and an optimal convergence rate estimate is justified. For problems with relatively smooth and relatively strongly monotone operators, we prove a linear rate of convergence for an algorithm that equips a mirror prox method for variational inequalities with monotone operators with a specially organized restart procedure.
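For orientation, one common formalization in the relative-smoothness literature is as follows: given a prox-function \(d\) with Bregman divergence \(V(y,x) = d(y) - d(x) - \langle \nabla d(x), y - x \rangle\), an operator \(g\) is said to be \(\mu\)-strongly monotone relative to \(d\) if \(\langle g(y) - g(x), y - x \rangle \geq \mu \, (V(y,x) + V(x,y))\) for all admissible \(x, y\). The restart idea behind the linear rate can then be sketched compactly. The Python fragment below is a minimal illustration under stated assumptions, not the authors' exact algorithm: the entropic prox setup on the probability simplex, the per-restart iteration count, and the constants L, mu, and v0 are hypothetical choices made for readability.

    import numpy as np

    def mirror_step(x, g, gamma):
        # Entropic prox-mapping on the probability simplex:
        # argmin_u { gamma*<g, u> + V(u, x) }, where V is the KL divergence.
        w = x * np.exp(-gamma * g)
        return w / w.sum()

    def restarted_mirror_prox(oracle, x0, L, mu, v0, eps):
        # oracle(x): value of the operator g at x.
        # L, mu: relative smoothness / relative strong monotonicity constants.
        # v0: upper bound on V(x_*, x0); eps: target accuracy for that bound.
        x, r = x0, v0
        while r > eps:
            n = int(np.ceil(2.0 * L / mu))  # iterations per restart (illustrative)
            z, acc = x, np.zeros_like(x)
            for _ in range(n):
                y = mirror_step(z, oracle(z), 1.0 / L)  # extrapolation step
                z = mirror_step(z, oracle(y), 1.0 / L)  # correction step
                acc += y
            x = acc / n  # restart from the averaged iterate
            r /= 2.0     # the distance bound halves after each restart
        return x

    # Toy usage: g(x) = B @ x with B positive definite, hence strongly monotone.
    rng = np.random.default_rng(0)
    M = rng.standard_normal((5, 5))
    B = M @ M.T + np.eye(5)
    x_star = restarted_mirror_prox(lambda u: B @ u, x0=np.full(5, 0.2),
                                   L=float(np.linalg.norm(B, 2)), mu=1.0,
                                   v0=float(np.log(5)), eps=1e-6)

In the actual method, each restart provably reduces the Bregman distance to the solution by a constant factor, which is exactly what yields the linear (geometric) rate; the subgradient-type scheme for the relatively bounded case proceeds analogously but with decreasing step sizes and a sublinear, provably optimal rate.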



Funding

The research of F. S. Stonyakin in Sec. 2 and the work of A. A. Titov on the proof of Lemma 2 and Theorem 2 were supported by the Russian Science Foundation under grant 21-71-30005.

Author information


Corresponding author

Correspondence to F. S. Stonyakin.

Additional information

Translated from Matematicheskie Zametki, 2022, Vol. 112, pp. 879–894, https://doi.org/10.4213/mzm13357.


About this article


Cite this article

Stonyakin, F.S., Titov, A.A., Makarenko, D.V. et al. Numerical Methods for Some Classes of Variational Inequalities with Relatively Strongly Monotone Operators. Math Notes 112, 965–977 (2022). https://doi.org/10.1134/S000143462211030X

