Abstract
In this paper we study two inexact fast augmented Lagrangian algorithms for solving linearly constrained convex optimization problems. Our methods rely on a combination of the excessive-gap-like smoothing technique introduced in Nesterov (SIAM J Optim 16(1):235–249, 2005) and the general inexact oracle framework studied in Devolder et al. (Math Program 146:37–75, 2014). We develop and analyze two augmented Lagrangian-based algorithmic instances, one with a constant and one with an adaptive smoothness parameter, and derive for each algorithm a total computational complexity estimate in terms of the number of projections onto a simple primal feasible set. For the constant-parameter algorithm we obtain an overall computational complexity of order \(\mathcal {O}(\frac{1}{\epsilon ^{5/4}})\), while the adaptive one requires \(\mathcal {O}(\frac{1}{\epsilon })\) projections onto the primal feasible set to reach an \(\epsilon \)-optimal solution of the original problem.
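To illustrate the setting, the following sketch shows a classical inexact augmented Lagrangian loop for a linearly constrained convex problem, with projected-gradient inner iterations; the projections onto the simple primal feasible set are exactly the operations counted by the complexity estimates above. This is a hypothetical minimal example (not the paper's excessive-gap smoothed method): the problem data, the fixed penalty parameter `rho`, and the iteration counts are all illustrative assumptions.

```python
# Minimal sketch: min ||x - c||^2  s.t.  Ax = b,  x in [0, 1]^n,
# solved by an (inexact) augmented Lagrangian method whose inner
# subproblems use projected gradient steps on the simple set [0, 1]^n.
import numpy as np

def inexact_augmented_lagrangian(A, b, c, rho=10.0, outer=50, inner=200):
    m, n = A.shape
    x = np.clip(c, 0.0, 1.0)      # start from a projection onto the box
    y = np.zeros(m)               # Lagrange multipliers for Ax = b
    # Lipschitz constant of the gradient of the augmented Lagrangian in x
    L = 2.0 + rho * np.linalg.norm(A, 2) ** 2
    for _ in range(outer):
        for _ in range(inner):    # inexact minimization of L_rho(., y)
            grad = 2.0 * (x - c) + A.T @ (y + rho * (A @ x - b))
            x = np.clip(x - grad / L, 0.0, 1.0)   # projection onto [0,1]^n
        y = y + rho * (A @ x - b)                  # dual (multiplier) update
    return x, y

# Tiny instance: one equality constraint coupling two box variables.
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
c = np.array([0.9, 0.8])
x, y = inexact_augmented_lagrangian(A, b, c)
print(np.abs(A @ x - b).max())    # constraint residual shrinks toward 0
```

The total work is governed by the number of inner projected-gradient steps (each costing one projection), which is the quantity the \(\mathcal {O}(\frac{1}{\epsilon ^{5/4}})\) and \(\mathcal {O}(\frac{1}{\epsilon })\) bounds above control for the paper's smoothed variants.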
References
Aybat, N., Iyengar, G.: An augmented Lagrangian method for conic convex programming, working paper. arXiv:1302.6322 (2013)
Aybat, N., Iyengar, G.: A first-order augmented Lagrangian method for compressed sensing. SIAM J Optim 22, 429–459 (2012)
Bauschke, H., Combettes, P.: Convex analysis and monotone operator theory in Hilbert spaces. Springer, New York (2011)
Ben-Tal, A., Nemirovski, A.: Lectures on modern convex optimization: analysis, algorithms, and engineering applications, vol. 3, MPS/SIAM series on optimization, SIAM (2001)
Bertsekas, D.: Convex optimization theory. Athena Scientific (2009)
Briceno-Arias, L., Combettes, P.: A monotone + skew splitting model for composite monotone inclusions in duality. SIAM J Optim 21, 1230–1250 (2011)
Chambolle, A., Pock, T.: A first-order primal-dual algorithm for convex problems with applications to imaging. J Math Imaging Vis 40, 120–145 (2011)
Combettes, P.: Solving monotone inclusions via compositions of nonexpansive averaged operators. Optimization 53, 475–504 (2004)
Devolder, O., Glineur, F., Nesterov, Y.: First-order methods of smooth convex optimization with inexact oracle. Math Program 146, 37–75 (2014)
Eckstein, J., Bertsekas, D.: On the Douglas–Rachford splitting method and the proximal point algorithm for maximal monotone operators. Math Program 55, 293–318 (1992)
He, B., Tao, M., Yuan, X.: Alternating direction method with Gaussian back substitution for separable convex programming. SIAM J Optim 22(2), 313–340 (2012)
He, B., Yang, H., Zhang, C.: A modified augmented Lagrangian method for a class of monotone variational inequalities. Eur J Oper Res 159(1), 35–51 (2004)
He, B., Yuan, X.: On the \({\cal O}(1/n)\) convergence rate of the Douglas–Rachford alternating direction method. SIAM J Numer Anal 50, 700–709 (2012)
Lan, G., Monteiro, R.: Iteration-complexity of first-order augmented Lagrangian methods for convex programming. Math Program 155(1–2), 511–547 (2016). doi:10.1007/s10107-015-0861-x
Li, X., Yuan, X.: A proximal strictly contractive Peaceman–Rachford splitting method for convex programming with applications to imaging. SIAM J Imaging Sci 8, 1332–1365 (2015)
Necoara, I., Nedelcu, V.: Rate analysis of inexact dual first order methods: application to dual decomposition. IEEE Trans Autom Control 59(5), 1232–1243 (2014)
Necoara, I., Patrascu, A.: Iteration complexity analysis of dual first order methods for conic convex programming, technical report. Optim Methods Softw. arXiv:1409.1462 (2014)
Necoara, I., Patrascu, A., Glineur, F.: Complexity certifications of first order inexact Lagrangian and penalty methods for conic convex programming, Tech. Rep., Univ. Politehnica Bucharest, pp. 1–34 (2015)
Necoara, I., Suykens, J.: Application of a smoothing technique to decomposition in convex optimization. IEEE Trans Autom Control 53(11), 2674–2679 (2008)
Nedelcu, V., Necoara, I., Tran-Dinh, Q.: Computational complexity of inexact gradient augmented Lagrangian methods: application to constrained MPC. SIAM J Control Optim 52(5), 3109–3134 (2014)
Nedic, A., Ozdaglar, A.: Approximate primal solutions and rate analysis for dual subgradient methods. SIAM J Optim 19(4), 1757–1780 (2009)
Nemirovskii, A.: Prox-method with rate of convergence \({\cal O}(1/t)\) for variational inequalities with Lipschitz continuous monotone operators and smooth convex-concave saddle point problems. SIAM J Optim 15, 229–251 (2004)
Nesterov, Y.: New primal-dual subgradient methods for convex problems with functional constraints. http://lear.inrialpes.fr/workshop/osl2015/slides/osl2015_yurii.pdf (2015)
Nesterov, Y.: Introductory lectures on convex optimization: a basic course. Kluwer, Boston (2004)
Nesterov, Y.: Excessive gap technique in nonsmooth convex minimization. SIAM J Optim 16(1), 235–249 (2005)
Nesterov, Y.: Subgradient methods for huge-scale optimization problems. Math Program 146, 275–297 (2014)
Rockafellar, R.T.: Augmented Lagrangians and applications of the proximal point algorithm in convex programming. Math Oper Res 1, 97–116 (1976)
Tran-Dinh, Q., Cevher, V.: A primal-dual algorithmic framework for constrained convex minimization, technical report. arXiv:1406.5403 (2014)
Tran-Dinh, Q., Necoara, I., Diehl, M.: Fast inexact distributed optimization algorithms for separable convex optimization. Optimization 65(2), 325–356 (2016)
Tran-Dinh, Q., Savorgnan, C., Diehl, M.: Combining Lagrangian decomposition and excessive gap smoothing technique for solving large-scale separable convex optimization problems. Comput Optim Appl 55(1), 75–111 (2013)
Additional information
The research leading to these results has received funding from: UEFISCDI Romania, PN II-RU-TE, project MoCOBiDS, No. 176/01.10.2015; Sectorial Operational Programme Human Resources Development 2007–2013 of the Ministry of European Funds through the Financial Agreement POSDRU/159/1.5/S/134398; NAFOSTED, Vietnam.
Cite this article
Patrascu, A., Necoara, I. & Tran-Dinh, Q. Adaptive inexact fast augmented Lagrangian methods for constrained convex optimization. Optim Lett 11, 609–626 (2017). https://doi.org/10.1007/s11590-016-1024-6