Abstract
A projected extrapolated gradient method is designed for solving monotone variational inequalities in Hilbert spaces. Requiring only local Lipschitz continuity of the operator, the proposed method improves the value of the extrapolation parameter and admits larger step sizes, which are predicted based on local information about the involved operator and corrected by bounding the distance between each pair of successive iterates. The correction is carried out only when this distance exceeds a given constant, and its main cost is the computation of one projection onto the feasible set. In particular, when the operator is the gradient of a convex function, the correction step is unnecessary. We establish convergence and an ergodic convergence rate under this larger range of parameters. Numerical experiments illustrate the gains in efficiency obtained from the larger step sizes.
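To make the prediction-correction mechanism described above concrete, here is a minimal sketch of a projected gradient scheme with extrapolation, a step size predicted from a local Lipschitz estimate, and a distance-capping correction. It is an illustration under our own assumptions (the update rule, the parameter names such as `rho` and `dist_cap`, and the default values are ours), not a reproduction of the paper's Algorithm 3.1:

```python
import numpy as np

def projected_extrapolated_gradient(F, proj, x0, lam0=0.5, delta=0.6,
                                    rho=1.3, dist_cap=10.0,
                                    max_iter=1000, tol=1e-9):
    """Illustrative sketch for VI(F, C): find x in C with <F(x), z - x> >= 0
    for all z in C.  F is the (monotone, locally Lipschitz) operator and
    proj is the Euclidean projection onto the feasible set C."""
    x_old = np.asarray(x0, dtype=float)
    x = proj(x_old - lam0 * F(x_old))    # one plain projected-gradient step
    lam_old, lam = lam0, lam0
    x_new = x
    for _ in range(max_iter):
        # extrapolated point built from the two most recent iterates
        y = x + (lam / lam_old) * (x - x_old)
        Fy = F(y)
        # prediction: step size from a local Lipschitz estimate of F,
        # L_loc ~ ||F(y) - F(x_old)|| / ||y - x_old||
        den = np.linalg.norm(Fy - F(x_old))
        num = np.linalg.norm(y - x_old)
        lam_new = min(rho * lam, delta * num / den) if den > 0 else rho * lam
        x_new = proj(x - lam_new * Fy)
        # correction: cap the distance between successive iterates;
        # its main cost is one extra projection onto the feasible set
        step = np.linalg.norm(x_new - x)
        if step > dist_cap:
            x_new = proj(x + (dist_cap / step) * (x_new - x))
        if np.linalg.norm(x_new - x) < tol:
            break
        x_old, x = x, x_new
        lam_old, lam = lam, lam_new
    return x_new
```

For instance, with an affine operator \(F(x)=Ax+b\) for a skew-symmetric matrix \(A\) and the nonnegative orthant as feasible set, one may pass `proj=lambda z: np.maximum(z, 0.0)`.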
Notes
All codes are available at https://github.com/cxk9369010/PEG.
Acknowledgements
The research of Xiaokai Chang was supported by the Innovation Ability Improvement Project of Gansu (Grant No. 2020A022) and the Hongliu Foundation of First-class Disciplines of Lanzhou University of Technology, China. The research of Jianchao Bai was supported by the National Natural Science Foundation of China (Grant No. 12001430) and the China Postdoctoral Science Foundation (Grant No. 2020M683545).
Communicated by Regina S. Burachik.
The Details on Remark 3.4
For the case of \(\delta \in (\frac{\sqrt{5}-1}{2},1]\) presented in Remark 3.4, by Fact 2.4 with \(\varepsilon _1>0\), we have
Meanwhile, for any \(\varepsilon _2>0\) we deduce
Hence, Lemmas 3.3 and 3.4 can be improved to the following lemmas.
Lemma A.1
Let \(\{x_n\}\), \(\{y_n\}\) be two sequences generated by Algorithm 3.1 and \({\bar{x}}\in {{\mathcal {S}}}\). Then, for any \(\varepsilon _1,\varepsilon _2>0\), we have
where \(\varPhi (\cdot ,\cdot )\) is defined as in (7).
Lemma A.2
Let \(\{x_n\}\), \(\{y_n\}\) be two sequences generated by Algorithm 3.1 and \({\bar{x}}\in {{\mathcal {S}}}\). Then, for any \(\varepsilon _1,\varepsilon _2>0\), we have
where
For any \(\delta \in (\frac{\sqrt{5}-1}{2},1]\), we have \(\delta ^2+\delta -1>0\), and we define the function \(\kappa (\delta )\) as
Noting the structure of (38) and that \(\kappa (\delta )\) is a maximum value, we obtain
which together with \(a=\frac{\delta ^2}{\delta ^2+\delta -1}\) and \(\varepsilon _1>0\) yields
and then
It follows from the first-order optimality condition of problem (39) that
when \(\varepsilon _1=\sqrt{a+1}\) and \(\varepsilon _2=\sqrt{a+1}-1\).
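For concreteness, the following minimal check evaluates these quantities numerically. It uses only the formulas quoted above (the closed form of \(\kappa (\delta )\) itself is not reproduced here), and the function name is ours:

```python
import math

def optimal_eps(delta):
    """Evaluate a = delta^2 / (delta^2 + delta - 1) together with the values
    eps1 = sqrt(a + 1) and eps2 = sqrt(a + 1) - 1 obtained from the
    first-order optimality condition of problem (39)."""
    # valid range: delta in ((sqrt(5) - 1) / 2, 1], so delta^2 + delta - 1 > 0
    assert (math.sqrt(5) - 1) / 2 < delta <= 1
    a = delta**2 / (delta**2 + delta - 1)
    return a, math.sqrt(a + 1), math.sqrt(a + 1) - 1

# delta = 1 gives a = 1, eps1 = sqrt(2) ~ 1.4142, eps2 = sqrt(2) - 1 ~ 0.4142
print(optimal_eps(1.0))
```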
Consequently, we obtain the following convergence result for Algorithm 3.1 with \(\delta \in (\frac{\sqrt{5}-1}{2},1]\) and \(\alpha \in (0,\kappa (\delta ))\).
Theorem A.1
Let \(\{x_n\}\) be the sequence generated by Algorithm 3.1 with \(\delta \in (\frac{\sqrt{5}-1}{2},1]\) and \(\alpha \in (0,\kappa (\delta ))\). Then, \(\{x_n\}\) converges weakly to a solution of problem (1).
Proof
Firstly, \(\delta \in (\frac{\sqrt{5}-1}{2},1]\) gives \(\frac{1}{\delta }\ge \frac{\delta ^2 +\delta -1}{\delta ^3}>0\). Since \(\lim \limits _{n\rightarrow \infty }\frac{\lambda _n}{\lambda _{n-1}}=1\), taking the limit and using \(\alpha <\kappa (\delta )\) together with the definition of \(\kappa (\delta )\) in (30), we have
for any \(\delta \in (\frac{\sqrt{5}-1}{2},1]\). Thus, there exists an integer \(N>2,\) such that for any \(n>N\),
Consequently, the result can be proved by the same argument as in the proof of Theorem 3.1; we omit the details. \(\square \)
Cite this article
Chang, X., Bai, J. A Projected Extrapolated Gradient Method with Larger Step Size for Monotone Variational Inequalities. J Optim Theory Appl 190, 602–627 (2021). https://doi.org/10.1007/s10957-021-01902-2