A primal-dual modified subgradient algorithm with sharp Lagrangian
We apply a modified subgradient algorithm (MSG) to solve the dual of a nonlinear, nonconvex optimization problem. The dual scheme we consider is based on the sharp augmented Lagrangian. A desirable feature of this method is primal convergence: every accumulation point of a primal sequence, generated automatically during the process, is a primal solution. Available variants of MSG do not enjoy this feature in general. We propose two new variants of MSG that enjoy both primal and dual convergence whenever the dual optimal set is nonempty; both use a very simple stepsize choice. Moreover, we establish primal convergence even when the dual optimal set is empty. Finally, our second variant of MSG converges in a finite number of steps.
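The dual scheme described above can be illustrated on a toy equality-constrained problem. The sketch below is only a minimal illustration, not the paper's method: the objective f(x) = x², the grid-based global inner minimization, the 1/(k+1) stepsize, and the factor 2 in the penalty update are all assumptions made here for concreteness. It uses the sharp augmented Lagrangian L(x, u, c) = f(x) + c|h(x)| - u·h(x) and the generic MSG-style update, which increases the penalty c faster than it moves the multiplier u.

```python
import numpy as np

# Toy problem: minimize f(x) = x**2  subject to  h(x) = x - 1 = 0.
# Sharp augmented Lagrangian: L(x, u, c) = f(x) + c*|h(x)| - u*h(x).

def f(x):
    return x**2

def h(x):
    return x - 1.0

# Crude global minimization of L(., u, c) over a fine grid
# (stands in for the exact inner solve assumed by the theory).
grid = np.linspace(-3.0, 3.0, 6001)

def inner_min(u, c):
    vals = f(grid) + c * np.abs(h(grid)) - u * h(grid)
    i = np.argmin(vals)
    return grid[i]

u, c = 0.0, 0.0
for k in range(100):
    x = inner_min(u, c)          # primal iterate generated by the dual scheme
    hk = h(x)
    if abs(hk) < 1e-6:           # feasible => duality gap closed, stop
        break
    s = 1.0 / (k + 1)            # illustrative stepsize, not the paper's rule
    u = u - s * hk               # multiplier step along the subgradient
    c = c + 2 * s * abs(hk)      # penalty grows strictly faster (eps_k = s here)

print(x, u, c)
```

On this toy instance the iteration stops at the feasible minimizer x = 1 after the penalty c exceeds the threshold needed for exact penalization, which is the mechanism the sharp Lagrangian exploits to close the duality gap without convexity.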
Keywords: Nonsmooth optimization · Nonconvex optimization · Duality scheme · Sharp Lagrangian · Modified subgradient algorithm
Mathematics Subject Classification (2000): 90C26 · 49M29 · 49M37