Optimization Letters, Volume 11, Issue 3, pp. 609–626

Adaptive inexact fast augmented Lagrangian methods for constrained convex optimization

Original Paper

Abstract

In this paper we study two inexact fast augmented Lagrangian algorithms for solving linearly constrained convex optimization problems. Our methods rely on a combination of the excessive-gap-like smoothing technique introduced in Nesterov (SIAM J Optim 16(1):235–249, 2005) and the general inexact oracle framework studied in Devolder et al. (Math Program 146:37–75, 2014). We develop and analyze two augmented Lagrangian-based algorithmic instances, one with a constant and one with an adaptive smoothness parameter, and derive for each algorithm a total computational complexity estimate in terms of projections onto a simple primal feasible set. For the constant-parameter algorithm we obtain an overall computational complexity of order \(\mathcal {O}(\frac{1}{\epsilon ^{5/4}})\), while for the adaptive one we obtain a total of \(\mathcal {O}(\frac{1}{\epsilon })\) projections onto the primal feasible set in order to achieve an \(\epsilon \)-optimal solution of the original problem.
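To make the setting concrete, the sketch below illustrates a *generic* inexact augmented Lagrangian loop of the kind whose complexity is measured above in projections onto a simple primal feasible set. It is **not** the authors' algorithm: in place of the fast-gradient and excessive-gap machinery of the paper, the inner subproblem is solved inexactly by a fixed number of projected-gradient steps (one projection per step), and the penalty/smoothness parameter is updated adaptively. All names, step-size rules, and parameter values here are illustrative assumptions.

```python
import numpy as np

def project_box(x, lo=0.0, hi=1.0):
    """Projection onto the simple primal feasible set X = [lo, hi]^n."""
    return np.clip(x, lo, hi)

def inexact_augmented_lagrangian(grad_f, lips_f, A, b, x0,
                                 rho=1.0, rho_max=10.0,
                                 inner_iters=200, outer_iters=100, tol=1e-6):
    """Generic inexact augmented Lagrangian sketch for
        min f(x)  s.t.  A x = b,  x in [0, 1]^n.
    Cost is counted in projections onto the box, matching the paper's
    complexity measure; the fast-gradient/excessive-gap details of the
    actual methods are replaced here by plain projected gradient steps.
    """
    x, y = x0.copy(), np.zeros(A.shape[0])
    lips_A = np.linalg.norm(A, 2) ** 2      # ||A||^2 (largest singular value squared)
    for _ in range(outer_iters):
        step = 1.0 / (lips_f + rho * lips_A)  # safe step size for the inner problem
        for _ in range(inner_iters):          # inexact inner solve of min_x L_rho(x, y)
            g = grad_f(x) + A.T @ (y + rho * (A @ x - b))
            x = project_box(x - step * g)     # one projection per inner step
        r = A @ x - b                         # constraint residual at the inexact minimizer
        y = y + rho * r                       # multiplier (dual) update
        if np.linalg.norm(r) < tol:
            break
        rho = min(2.0 * rho, rho_max)         # adaptive smoothness/penalty parameter
    return x, y
```

For example, with \(f(x)=\tfrac12\Vert x-c\Vert^2\) (so `grad_f = lambda z: z - c` and `lips_f = 1`) and the single constraint \(\sum_i x_i = 1\), the loop drives the residual \(\Vert Ax-b\Vert\) to zero while every iterate stays in the box by construction.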

Keywords

Inexact augmented Lagrangian · Smoothing techniques · Primal-dual fast gradient · Excessive gap · Computational complexity

References

  1. Aybat, N., Iyengar, G.: An augmented Lagrangian method for conic convex programming. Working paper, arXiv:1302.6322 (2013)
  2. Aybat, N., Iyengar, G.: A first-order augmented Lagrangian method for compressed sensing. SIAM J Optim 22, 429–459 (2012)
  3. Bauschke, H., Combettes, P.: Convex Analysis and Monotone Operator Theory in Hilbert Spaces. Springer, New York (2011)
  4. Ben-Tal, A., Nemirovski, A.: Lectures on Modern Convex Optimization: Analysis, Algorithms, and Engineering Applications. MPS/SIAM Series on Optimization, vol. 3. SIAM, Philadelphia (2001)
  5. Bertsekas, D.: Convex Optimization Theory. Athena Scientific (2009)
  6. Briceno-Arias, L., Combettes, P.: A monotone + skew splitting model for composite monotone inclusions in duality. SIAM J Optim 21, 1230–1250 (2011)
  7. Chambolle, A., Pock, T.: A first-order primal-dual algorithm for convex problems with applications to imaging. J Math Imaging Vis 40, 120–145 (2011)
  8. Combettes, P.: Solving monotone inclusions via compositions of nonexpansive averaged operators. Optimization 53, 475–504 (2004)
  9. Devolder, O., Glineur, F., Nesterov, Y.: First-order methods of smooth convex optimization with inexact oracle. Math Program 146, 37–75 (2014)
  10. Eckstein, J., Bertsekas, D.: On the Douglas–Rachford splitting method and the proximal point algorithm for maximal monotone operators. Math Program 55, 293–318 (1992)
  11. He, B., Tao, M., Yuan, X.: Alternating direction method with Gaussian back substitution for separable convex programming. SIAM J Optim 22(2), 313–340 (2012)
  12. He, B., Yang, H., Zhang, C.: A modified augmented Lagrangian method for a class of monotone variational inequalities. Eur J Oper Res 159(1), 35–51 (2004)
  13. He, B., Yuan, X.: On the \({\cal O}(1/n)\) convergence rate of the Douglas–Rachford alternating direction method. SIAM J Numer Anal 50, 700–709 (2012)
  14. Lan, G., Monteiro, R.: Iteration-complexity of first-order augmented Lagrangian methods for convex programming. Math Program 155(1–2), 511–547 (2016). doi:10.1007/s10107-015-0861-x
  15. Li, X., Yuan, X.: A proximal strictly contractive Peaceman–Rachford splitting method for convex programming with applications to imaging. SIAM J Imaging Sci 8, 1332–1365 (2015)
  16. Necoara, I., Nedelcu, V.: Rate analysis of inexact dual first order methods: application to dual decomposition. IEEE Trans Autom Control 59(5), 1232–1243 (2014)
  17. Necoara, I., Patrascu, A.: Iteration complexity analysis of dual first order methods for conic convex programming. Technical report, Optim Methods Softw, arXiv:1409.1462 (2014)
  18. Necoara, I., Patrascu, A., Glineur, F.: Complexity certifications of first order inexact Lagrangian and penalty methods for conic convex programming. Tech. Rep., Univ. Politehnica Bucharest, pp. 1–34 (2015)
  19. Necoara, I., Suykens, J.: Application of a smoothing technique to decomposition in convex optimization. IEEE Trans Autom Control 53(11), 2674–2679 (2008)
  20. Nedelcu, V., Necoara, I., Tran-Dinh, Q.: Computational complexity of inexact gradient augmented Lagrangian methods: application to constrained MPC. SIAM J Control Optim 52(5), 3109–3134 (2014)
  21. Nedic, A., Ozdaglar, A.: Approximate primal solutions and rate analysis for dual subgradient methods. SIAM J Optim 19(4), 1757–1780 (2009)
  22. Nemirovski, A.: Prox-method with rate of convergence \({\cal O}(1/t)\) for variational inequalities with Lipschitz continuous monotone operators and smooth convex-concave saddle point problems. SIAM J Optim 15, 229–251 (2004)
  23. Nesterov, Y.: New primal-dual subgradient methods for convex problems with functional constraints. http://lear.inrialpes.fr/workshop/osl2015/slides/osl2015_yurii.pdf (2015)
  24. Nesterov, Y.: Introductory Lectures on Convex Optimization: A Basic Course. Kluwer, Boston (2004)
  25. Nesterov, Y.: Excessive gap technique in nonsmooth convex minimization. SIAM J Optim 16(1), 235–249 (2005)
  26. Nesterov, Y.: Subgradient methods for huge-scale optimization problems. Math Program 146, 275–297 (2014)
  27. Rockafellar, R.T.: Augmented Lagrangians and applications of the proximal point algorithm in convex programming. Math Oper Res 1, 97–116 (1976)
  28. Tran-Dinh, Q., Cevher, V.: A primal-dual algorithmic framework for constrained convex minimization. Technical report, arXiv:1406.5403 (2014)
  29. Tran-Dinh, Q., Necoara, I., Diehl, M.: Fast inexact distributed optimization algorithms for separable convex optimization. Optimization 65(2), 325–356 (2016)
  30. Tran-Dinh, Q., Savorgnan, C., Diehl, M.: Combining Lagrangian decomposition and excessive gap smoothing technique for solving large-scale separable convex optimization problems. Comput Optim Appl 55(1), 75–111 (2013)

Copyright information

© Springer-Verlag Berlin Heidelberg 2016

Authors and Affiliations

  1. Automatic Control and Systems Engineering Department, University Politehnica Bucharest, Bucharest, Romania
  2. Department of Statistics and Operations Research, University of North Carolina at Chapel Hill (UNC), Chapel Hill, USA
