Abstract
This paper significantly extends the recently proposed class of relatively strongly convex optimization problems in high-dimensional spaces. We introduce an analog of relative strong convexity for variational inequalities (relative strong monotonicity) and derive convergence-rate estimates for several first-order numerical methods applied to problems of this type. Two classes of variational inequalities are considered, distinguished by the smoothness conditions imposed on the operator: the first contains relatively bounded operators, and the second contains operators satisfying an analog of the Lipschitz condition known as relative smoothness. For variational inequalities with relatively bounded, relatively strongly monotone operators, we study a variant of the subgradient method and establish an optimal convergence-rate estimate. For problems with relatively smooth, relatively strongly monotone operators, we prove a linear rate of convergence for an algorithm based on a specially organized restart procedure applied to the mirror prox method for variational inequalities with monotone operators.
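To make the underlying iteration concrete, the following is a minimal illustrative sketch of the mirror prox (extragradient) step that the restarted algorithm builds on, specialized to the Euclidean prox-function and an unconstrained, strongly monotone affine operator. This is not the authors' method as stated in the paper: the operator `F`, the step size, and the Euclidean setting are assumptions made purely for illustration.

```python
import numpy as np

def extragradient(F, x0, step, iters):
    """Euclidean mirror prox (extragradient) for the VI: find x* with
    <F(x*), x - x*> >= 0 for all feasible x (here the whole space,
    so the solution satisfies F(x*) = 0)."""
    x = x0.copy()
    for _ in range(iters):
        y = x - step * F(x)   # extrapolation step at the current point
        x = x - step * F(y)   # correction step using the operator at y
    return x

# Strongly monotone affine operator F(x) = A x - b with A symmetric
# positive definite; the unconstrained VI solution is x* = A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
F = lambda x: A @ x - b
x_star = np.linalg.solve(A, b)
x = extragradient(F, np.zeros(2), step=0.2, iters=200)
```

For this strongly monotone example the iterates contract linearly toward `x_star`; the restart strategy discussed in the paper exploits exactly this kind of linear behavior in the relatively smooth, relatively strongly monotone setting.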
Translated from Matematicheskie Zametki, 2022, Vol. 112, pp. 879–894 https://doi.org/10.4213/mzm13357.
Cite this article
Stonyakin, F.S., Titov, A.A., Makarenko, D.V. et al. Numerical Methods for Some Classes of Variational Inequalities with Relatively Strongly Monotone Operators. Math Notes 112, 965–977 (2022). https://doi.org/10.1134/S000143462211030X