On the O(1/t) convergence rate of the projection and contraction methods for variational inequalities with Lipschitz continuous monotone operators

Abstract

Nemirovski’s analysis (SIAM J. Optim. 15:229–251, 2005) shows that the extragradient method achieves an O(1/t) convergence rate for variational inequalities with Lipschitz continuous monotone operators. For the same class of problems, a family of Fejér monotone projection and contraction methods has been developed over the last decades. Until now, only convergence results have been available for these projection and contraction methods, even though numerical experiments indicate that they consistently outperform the extragradient method; the advantage comes from the ‘optimal’ step size in the contraction sense. In this paper, we prove the O(1/t) convergence rate under a unified conceptual framework that includes the projection and contraction methods as special cases, thereby completing the convergence-rate theory of the existing projection and contraction methods. Preliminary numerical results demonstrate that the projection and contraction methods converge about twice as fast as the extragradient method.
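To make the comparison in the abstract concrete, the following is a minimal numerical sketch (in Python/NumPy), not the paper’s own implementation: it contrasts Korpelevich’s extragradient iteration [12] with a projection and contraction iteration in the spirit of [5, 6], applied to a small monotone affine variational inequality over the nonnegative orthant. The search direction d, the ‘optimal’ step size alpha, and all parameter values are assumptions based on the projection and contraction literature.

```python
import numpy as np

def extragradient(F, project, u0, tau, iters=500):
    """Korpelevich's extragradient method [12] for VI(Omega, F);
    tau < 1/L for an L-Lipschitz continuous monotone F."""
    u = u0
    for _ in range(iters):
        u_tilde = project(u - tau * F(u))       # prediction
        u = project(u - tau * F(u_tilde))       # correction with F(u_tilde)
    return u

def projection_contraction(F, project, u0, beta, gamma=1.9, iters=500):
    """A projection and contraction iteration in the spirit of [5, 6]
    (an assumption, not a transcription of this paper): the same
    prediction as the extragradient method, followed by a correction
    along d with the 'optimal' step size in the contraction sense."""
    u = u0
    for _ in range(iters):
        u_tilde = project(u - beta * F(u))                  # prediction
        d = (u - u_tilde) - beta * (F(u) - F(u_tilde))      # search direction
        dd = float(np.dot(d, d))
        if dd < 1e-20:                                      # u_tilde solves the VI
            return u_tilde
        alpha = float(np.dot(u - u_tilde, d)) / dd          # 'optimal' step size
        u = u - gamma * alpha * d                           # contraction step
    return u

# A small monotone affine test operator F(u) = M u + q on Omega = R^n_+.
rng = np.random.default_rng(0)
n = 50
A = rng.standard_normal((n, n))
M = A.T @ A + np.eye(n)                 # positive definite => F monotone
q = rng.standard_normal(n)
F = lambda u: M @ u + q
project = lambda v: np.maximum(v, 0.0)  # projection onto R^n_+
L = np.linalg.norm(M, 2)                # Lipschitz constant of F
u_eg = extragradient(F, project, np.zeros(n), tau=0.9 / L)
u_pc = projection_contraction(F, project, np.zeros(n), beta=0.9 / L)
```

Both iterations perform the same prediction; the projection and contraction method spends its extra work on the step size alpha, which is the source of the speedup the abstract attributes to these methods.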


Notes

  1. For convenience, we consider the distance function only in the Euclidean norm. All the results in this paper can easily be extended to the contraction of the distance function in the G-norm, where G is a positive definite matrix.

  2. A similar type of (small) problem was tested in [21], where the components of the nonlinear mapping D(u) are D_j(u) = c⋅arctan(u_j).

  3. In the paper by Harker and Pang [4], the matrix M = A^T A + B + D, where A and B are the same matrices as those used here, and D is a diagonal matrix with uniformly distributed random entries d_jj ∈ (0.0, 0.3); one possible construction is sketched in the code after these notes.

  4. In [4], problems similar to those in the first set are called easy problems, while those in the second set are called hard problems.
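To make notes 2 and 3 concrete, here is a hedged sketch of how such a test problem could be generated (the two notes describe ingredients of related test sets, combined here into one illustrative mapping). The notes fix only the diagonal range d_jj ∈ (0.0, 0.3) and the arctan form; the distributions of A, B and q and the value of c are assumptions, following the usual description of the Harker–Pang test set [4].

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100

# Linear part M = A^T A + B + D as in note 3. Only the diagonal
# entries d_jj in (0.0, 0.3) are specified there; entries of A
# uniform in (-5, 5) and a skew-symmetric B are assumptions.
A = rng.uniform(-5.0, 5.0, size=(n, n))
C = rng.uniform(-5.0, 5.0, size=(n, n))
B = C - C.T                                      # skew-symmetric part
D_diag = np.diag(rng.uniform(0.0, 0.3, size=n))  # d_jj in (0.0, 0.3)
M = A.T @ A + B + D_diag

# Nonlinear components D_j(u) = c * arctan(u_j) as in note 2;
# the value of c is an assumption.
c = 1.0
D_nonlinear = lambda u: c * np.arctan(u)

# Mapping of the resulting variational inequality; the range of q
# is an assumption.
q = rng.uniform(-500.0, 500.0, size=n)
F = lambda u: M @ u + D_nonlinear(u) + q
```

Since A^T A is positive semidefinite, B is skew-symmetric, and both D parts are monotone, F is monotone; it is Lipschitz continuous because arctan has a bounded derivative.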

References

  1. Bertsekas, D.P., Tsitsiklis, J.N.: Parallel and Distributed Computation, Numerical Methods. Prentice-Hall, Englewood Cliffs (1989)

  2. Blum, E., Oettli, W.: Mathematische Optimierung: Grundlagen und Verfahren. Ökonometrie und Unternehmensforschung. Springer, Berlin (1975)

  3. Facchinei, F., Pang, J.S.: Finite-Dimensional Variational Inequalities and Complementarity Problems, Vols. I and II. Springer Series in Operations Research. Springer, New York (2003)

  4. Harker, P.T., Pang, J.S.: A damped-Newton method for the linear complementarity problem. Lect. Appl. Math. 26, 265–284 (1990)

  5. He, B.S.: A class of projection and contraction methods for monotone variational inequalities. Appl. Math. Optim. 35, 69–76 (1997)

  6. He, B.S., Liao, L.-Z.: Improvements of some projection methods for monotone nonlinear variational inequalities. J. Optim. Theory Appl. 112, 111–128 (2002)

  7. He, B.S., Xu, M.-H.: A general framework of contraction methods for monotone variational inequalities. Pac. J. Optim. 4, 195–212 (2008)

  8. He, B.S., Yuan, X.M., Zhang, J.J.Z.: Comparison of two kinds of prediction-correction methods for monotone variational inequalities. Comput. Optim. Appl. 27, 247–267 (2004)

  9. He, B.S., Liao, L.-Z., Wang, X.: Proximal-like contraction methods for monotone variational inequalities in a unified framework I: effective quadruplet and primary methods. Comput. Optim. Appl. 51, 649–679 (2012)

  10. He, B.S., Liao, L.-Z., Wang, X.: Proximal-like contraction methods for monotone variational inequalities in a unified framework II: general methods and numerical experiments. Comput. Optim. Appl. 51, 681–708 (2012)

  11. Howard, A.G.: Large margin transformation learning. PhD Thesis, Graduate School of Arts and Science, Columbia University (2009)

  12. Korpelevich, G.M.: The extragradient method for finding saddle points and other problems. Ekon. Mat. Metod. 12, 747–756 (1976)

  13. Khobotov, E.N.: Modification of the extragradient method for solving variational inequalities and certain optimization problems. USSR Comput. Math. Math. Phys. 27, 120–127 (1987)

  14. Lacoste-Julien, S.: Discriminative machine learning with structure. PhD Thesis, Computer Science, University of California, Berkeley (2009)

  15. Nemirovski, A.: Prox-method with rate of convergence O(1/t) for variational inequality with Lipschitz continuous monotone operators and smooth convex-concave saddle point problems. SIAM J. Optim. 15, 229–251 (2005)

  16. Pan, Y.: A game theoretical approach to constrained OSNR optimization problems in optical networks. PhD Thesis, Electrical and Computer Engineering, University of Toronto (2009)

  17. Pan, Y., Pavel, L.: Games with coupled propagated constraints in optical networks with multi-link topologies. Automatica 45, 871–880 (2009)

  18. Sha, F.: Large margin training of acoustic models for speech recognition. PhD Thesis, Computer and Information Science, University of Pennsylvania (2007)

  19. Solodov, M.V., Tseng, P.: Modified projection-type methods for monotone variational inequalities. SIAM J. Control Optim. 34, 1814–1830 (1996)

  20. Sun, D.: A class of iterative methods for solving nonlinear projection equations. J. Optim. Theory Appl. 91, 123–140 (1996)

  21. Taji, K., Fukushima, M., Ibaraki, T.: A globally convergent Newton method for solving strongly monotone variational inequalities. Math. Program. 58, 369–383 (1993)

  22. Taskar, B., Lacoste-Julien, S., Jordan, M.I.: Structured prediction, dual extragradient and Bregman projections. J. Mach. Learn. Res. 7, 1627–1653 (2006)

  23. Taskar, B., Lacoste-Julien, S., Jordan, M.I.: Structured prediction via the extragradient method. In: Weiss, Y., Schölkopf, B., Platt, J. (eds.) Advances in Neural Information Processing Systems (NIPS), vol. 18 (2006)

  24. Tseng, P.: On accelerated proximal gradient methods for convex-concave optimization. Manuscript, Department of Mathematics, University of Washington, Seattle (2008)

  25. Xue, G.L., Ye, Y.Y.: An efficient algorithm for minimizing a sum of Euclidean norms with applications. SIAM J. Optim. 7, 1017–1036 (1997)


Acknowledgements

The authors thank X.-L. Fu, M. Li, M. Tao and X.-M. Yuan for their discussions and valuable suggestions.

Author information

Correspondence to Bingsheng He.

Additional information

X. Cai was supported by the MOEC fund 20110091110004. G. Gu was supported by the NSFC grants 11001124 and 91130007. B. He was supported by the NSFC grant 91130007 and the MOEC fund 20110091110004.


Cite this article

Cai, X., Gu, G. & He, B. On the O(1/t) convergence rate of the projection and contraction methods for variational inequalities with Lipschitz continuous monotone operators. Comput Optim Appl 57, 339–363 (2014). https://doi.org/10.1007/s10589-013-9599-7

