Adaptive inexact fast augmented Lagrangian methods for constrained convex optimization

  • Original Paper
  • Published in: Optimization Letters

Abstract

In this paper we study two inexact fast augmented Lagrangian algorithms for solving linearly constrained convex optimization problems. Our methods rely on a combination of the excessive-gap-like smoothing technique introduced in Nesterov (SIAM J Optim 16(1):235–249, 2005) and the general inexact oracle framework studied in Devolder et al. (Math Program 146:37–75, 2014). We develop and analyze two augmented Lagrangian algorithmic instances, one with a constant and one with an adaptive smoothness parameter, and derive for each a total computational complexity estimate in terms of projections onto a simple primal feasible set. To obtain an \(\epsilon \)-optimal solution of the original problem, the constant-parameter algorithm requires \(\mathcal {O}(\frac{1}{\epsilon ^{5/4}})\) projections onto the primal feasible set in total, while the adaptive one requires only \(\mathcal {O}(\frac{1}{\epsilon })\).
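The abstract describes the algorithmic structure only in words. As a purely illustrative aid, the following minimal Python sketch shows the generic shape of an inexact augmented Lagrangian loop that uses projections onto a simple primal feasible set and an adaptively increased penalty (smoothness) parameter. It is not the authors' method: the names (inexact_al, f_grad, project), the projected-gradient inner solver, the step-size rule, and the rho *= 1.5 update are assumptions made for this example, and the sketch carries none of the complexity guarantees proved in the paper.

    import numpy as np

    def inexact_al(f_grad, lip_f, A, b, project, x0,
                   rho0=1.0, outer_iters=30, inner_iters=200):
        """Illustrative inexact augmented Lagrangian loop (not the paper's method).

        Solves min f(x) s.t. A x = b, x in X, where `project` is the Euclidean
        projection onto the simple set X and `lip_f` is a Lipschitz constant of
        the gradient of f. The inner subproblem min_x L_rho(x, y) is solved
        inexactly by a fixed number of projected gradient steps.
        """
        x, y, rho = x0.astype(float), np.zeros(A.shape[0]), rho0
        normA2 = np.linalg.norm(A, 2) ** 2       # squared spectral norm of A
        for _ in range(outer_iters):
            # The gradient of L_rho(x, y) = f(x) + <y, Ax - b> + (rho/2)||Ax - b||^2
            # is (lip_f + rho*||A||^2)-Lipschitz, so this step size is safe.
            step = 1.0 / (lip_f + rho * normA2)
            for _ in range(inner_iters):
                grad = f_grad(x) + A.T @ (y + rho * (A @ x - b))
                x = project(x - step * grad)     # projection onto the primal set
            y = y + rho * (A @ x - b)            # multiplier (dual) update
            rho *= 1.5                           # hypothetical adaptive increase
        return x, y

    # Example (also illustrative): Euclidean projection of c onto the simplex,
    # i.e. min 0.5*||x - c||^2  s.t.  sum(x) = 1, x >= 0.
    c = np.array([0.3, 0.9, -0.2])
    x, y = inexact_al(lambda x: x - c, 1.0,
                      np.ones((1, 3)), np.array([1.0]),
                      lambda z: np.maximum(z, 0.0), np.zeros(3))

A fast variant in the spirit of the abstract would replace the plain projected-gradient inner steps with Nesterov-type accelerated steps and tie the inner accuracy and the smoothness parameter to the outer iteration counter; the adaptive schedule above is only a stand-in for such a rule.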


References

  1. Aybat, N., Iyengar, G.: An augmented Lagrangian method for conic convex programming. Working paper, arXiv:1302.6322 (2013)

  2. Aybat, N., Iyengar, G.: A first-order augmented Lagrangian method for compressed sensing. SIAM J Optim 22, 429–459 (2012)

  3. Bauschke, H., Combettes, P.: Convex analysis and monotone operator theory in Hilbert spaces. Springer, New York (2011)

  4. Ben-Tal, A., Nemirovski, A.: Lectures on modern convex optimization: analysis, algorithms, and engineering applications. MPS/SIAM Series on Optimization, vol. 3. SIAM, Philadelphia (2001)

  5. Bertsekas, D.: Convex optimization theory. Athena Scientific (2009)

  6. Briceno-Arias, L., Combettes, P.: A monotone + skew splitting model for composite monotone inclusions in duality. SIAM J Optim 21, 1230–1250 (2011)

  7. Chambolle, A., Pock, T.: A first-order primal-dual algorithm for convex problems with applications to imaging. J Math Imaging Vis 40, 120–145 (2011)

  8. Combettes, P.: Solving monotone inclusions via compositions of nonexpansive averaged operators. Optimization 53, 475–504 (2004)

  9. Devolder, O., Glineur, F., Nesterov, Y.: First-order methods of smooth convex optimization with inexact oracle. Math Program 146, 37–75 (2014)

  10. Eckstein, J., Bertsekas, D.: On the Douglas–Rachford splitting method and the proximal point algorithm for maximal monotone operators. Math Program 55, 293–318 (1992)

  11. He, B., Tao, M., Yuan, X.: Alternating direction method with Gaussian back substitution for separable convex programming. SIAM J Optim 22(2), 313–340 (2012)

  12. He, B., Yang, H., Zhang, C.: A modified augmented Lagrangian method for a class of monotone variational inequalities. Eur J Oper Res 159(1), 35–51 (2004)

  13. He, B., Yuan, X.: On the \({\cal O}(1/n)\) convergence rate of the Douglas–Rachford alternating direction method. SIAM J Numer Anal 50, 700–709 (2012)

  14. Lan, G., Monteiro, R.: Iteration-complexity of first-order augmented Lagrangian methods for convex programming. Math Program 155(1–2), 511–547 (2016). doi:10.1007/s10107-015-0861-x

  15. Li, X., Yuan, X.: A proximal strictly contractive Peaceman–Rachford splitting method for convex programming with applications to imaging. SIAM J Imaging Sci 8, 1332–1365 (2015)

  16. Necoara, I., Nedelcu, V.: Rate analysis of inexact dual first order methods: application to dual decomposition. IEEE Trans Autom Control 59(5), 1232–1243 (2014)

  17. Necoara, I., Patrascu, A.: Iteration complexity analysis of dual first order methods for conic convex programming. Optim Methods Softw, arXiv:1409.1462 (2014)

  18. Necoara, I., Patrascu, A., Glineur, F.: Complexity certifications of first order inexact Lagrangian and penalty methods for conic convex programming. Technical report, Univ. Politehnica Bucharest, pp. 1–34 (2015)

  19. Necoara, I., Suykens, J.: Application of a smoothing technique to decomposition in convex optimization. IEEE Trans Autom Control 53(11), 2674–2679 (2008)

  20. Nedelcu, V., Necoara, I., Tran-Dinh, Q.: Computational complexity of inexact gradient augmented Lagrangian methods: application to constrained MPC. SIAM J Control Optim 52(5), 3109–3134 (2014)

  21. Nedic, A., Ozdaglar, A.: Approximate primal solutions and rate analysis for dual subgradient methods. SIAM J Optim 19(4), 1757–1780 (2009)

  22. Nemirovskii, A.: Prox-method with rate of convergence \({\cal O}(1/t)\) for variational inequalities with Lipschitz continuous monotone operators and smooth convex-concave saddle point problems. SIAM J Optim 15, 229–251 (2004)

  23. Nesterov, Y.: New primal-dual subgradient methods for convex problems with functional constraints. http://lear.inrialpes.fr/workshop/osl2015/slides/osl2015_yurii.pdf (2015)

  24. Nesterov, Y.: Introductory lectures on convex optimization: a basic course. Kluwer, Boston (2004)

  25. Nesterov, Y.: Excessive gap technique in nonsmooth convex minimization. SIAM J Optim 16(1), 235–249 (2005)

  26. Nesterov, Y.: Subgradient methods for huge-scale optimization problems. Math Program 146, 275–297 (2014)

  27. Rockafellar, R.T.: Augmented Lagrangians and applications of the proximal point algorithm in convex programming. Math Oper Res 1, 97–116 (1976)

  28. Tran-Dinh, Q., Cevher, V.: A primal-dual algorithmic framework for constrained convex minimization. Technical report, arXiv:1406.5403 (2014)

  29. Tran-Dinh, Q., Necoara, I., Diehl, M.: Fast inexact distributed optimization algorithms for separable convex optimization. Optimization 65(2), 325–356 (2016)

  30. Tran-Dinh, Q., Savorgnan, C., Diehl, M.: Combining Lagrangian decomposition and excessive gap smoothing technique for solving large-scale separable convex optimization problems. Comput Optim Appl 55(1), 75–111 (2013)

Author information

Corresponding author

Correspondence to Andrei Patrascu.

Additional information

The research leading to these results has received funding from: UEFISCDI Romania, PN II-RU-TE, project MoCOBiDS, No. 176/01.10.2015; Sectorial Operational Programme Human Resources Development 2007–2013 of the Ministry of European Funds through the Financial Agreement POSDRU/159/1.5/S/134398; NAFOSTED, Vietnam.

About this article

Cite this article

Patrascu, A., Necoara, I. & Tran-Dinh, Q. Adaptive inexact fast augmented Lagrangian methods for constrained convex optimization. Optim Lett 11, 609–626 (2017). https://doi.org/10.1007/s11590-016-1024-6
