
Journal of Global Optimization, Volume 46, Issue 3, pp 347–361

A primal dual modified subgradient algorithm with sharp Lagrangian

  • Regina S. Burachik
  • Alfredo N. Iusem
  • Jefferson G. Melo

Abstract

We apply a modified subgradient algorithm (MSG) to solve the dual of a nonlinear and nonconvex optimization problem. The dual scheme we consider uses the sharp augmented Lagrangian. A desirable feature of this method is primal convergence, meaning that every accumulation point of the primal sequence (which is generated automatically during the process) is a primal solution. This feature does not hold in general for the available variants of MSG. We propose two new variants of MSG that enjoy both primal and dual convergence whenever the dual optimal set is nonempty; both use a very simple stepsize choice. Moreover, we establish primal convergence even when the dual optimal set is empty. Finally, our second variant of MSG converges in a finite number of steps.
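For orientation, the duality scheme referred to above can be sketched in the standard form used in the sharp-Lagrangian literature; the equality-constrained primal problem and the symbols $f$, $g$, $X$ below are the usual generic setting, assumed here for illustration rather than reproduced from the paper. For the primal problem $\min\{f(x) : g(x) = 0,\ x \in X\}$, the sharp augmented Lagrangian and the associated dual function are

% sketch, assuming the usual equality-constrained setting
$$L(x, u, c) \;=\; f(x) \;+\; c\,\|g(x)\| \;-\; \langle u, g(x)\rangle, \qquad q(u, c) \;=\; \min_{x \in X} L(x, u, c),$$

and the dual problem maximizes $q$ over the pairs $(u, c)$. A typical MSG step computes a minimizer $x_k$ of $L(\cdot, u_k, c_k)$ (these minimizers form the primal sequence mentioned in the abstract) and updates

$$u_{k+1} \;=\; u_k - s_k\, g(x_k), \qquad c_{k+1} \;=\; c_k + (s_k + \epsilon_k)\,\|g(x_k)\|,$$

with stepsizes $s_k, \epsilon_k > 0$; the variants studied in the paper differ in how these stepsizes are chosen.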

Keywords

Nonsmooth optimization · Nonconvex optimization · Duality scheme · Sharp Lagrangian · Modified subgradient algorithm

Mathematics Subject Classification (2000)

90C26 · 49M29 · 49M37



Copyright information

© Springer Science+Business Media, LLC. 2009

Authors and Affiliations

  • Regina S. Burachik (1)
  • Alfredo N. Iusem (2)
  • Jefferson G. Melo (2)

  1. School of Mathematics and Statistics, University of South Australia, Mawson Lakes, Australia
  2. IMPA, Instituto de Matemática Pura e Aplicada, Rio de Janeiro, Brazil
