Set-Valued Analysis, Volume 16, Issue 7–8, pp 899–912

Strong Convergence of Projected Subgradient Methods for Nonsmooth and Nonstrictly Convex Minimization

  • Paul-Emile Maingé

Abstract

In this paper, we establish a strong convergence theorem for a regularized variant of the projected subgradient method applied to nonsmooth, nonstrictly convex minimization in real Hilbert spaces. Only one projection step is needed per iteration, and the stepsizes are controlled so that the algorithm is of practical interest. To this end, we develop new analysis techniques that can be adapted to many other non-Fejérian methods.
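
The abstract describes the method only at a high level, so the following is a minimal illustrative sketch rather than the paper's exact algorithm: a projected subgradient step combined with a vanishing anchor-type (viscosity/Halpern-style) regularization term, using a single projection per iteration as emphasized above. The function names, the stepsize rule lam_k = 1/sqrt(k+1), the regularization weight alpha_k = 1/(k+2), and the choice of anchor point are all assumptions made for the sake of a concrete example.

```python
import numpy as np


def projected_subgradient_regularized(subgrad, project, x0, anchor, n_iters=5000):
    """Hypothetical regularized projected subgradient iteration (not the
    paper's exact scheme):

        x_{k+1} = P_C( x_k - lam_k * g_k - alpha_k * (x_k - anchor) ),

    with diminishing, non-summable stepsizes lam_k and a vanishing
    regularization weight alpha_k (Halpern/viscosity-style anchoring).
    """
    x = np.asarray(x0, dtype=float)
    for k in range(n_iters):
        g = subgrad(x)                      # any subgradient of f at x
        lam = 1.0 / np.sqrt(k + 1)          # diminishing, non-summable stepsize (assumed rule)
        alpha = 1.0 / (k + 2)               # vanishing regularization weight (assumed rule)
        # a single projection per iteration
        x = project(x - lam * g - alpha * (x - anchor))
    return x


if __name__ == "__main__":
    # Toy problem: minimize f(x) = ||x||_1 over the box C = [1, 2]^5.
    subgrad = lambda x: np.sign(x)             # subgradient of the 1-norm
    project = lambda x: np.clip(x, 1.0, 2.0)   # Euclidean projection onto the box
    x_final = projected_subgradient_regularized(
        subgrad, project, x0=np.full(5, 2.0), anchor=np.zeros(5))
    print(x_final)  # tends toward the minimizer [1, 1, 1, 1, 1]
```

In this sketch the anchor term pulls the iterates toward a fixed reference point, which is one standard way to obtain strong (norm) convergence in Hilbert spaces; the paper's own stepsize and regularization conditions are not reproduced here.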

Keywords

Convex minimization · Projected gradient method · Nonsmooth optimization · Viscosity method

Mathematics Subject Classifications (2000)

90C25 · 90C30 · 65C25

Copyright information

© Springer Science+Business Media B.V. 2008

Authors and Affiliations

  1. Département Scientifique Interfacultaire, GRIMAAG, Université des Antilles-Guyane, Campus de Schoelcher Cedex, Martinique (F.W.I.), France
