
Mathematical Programming, Volume 162, Issue 1–2, pp 165–199

On the linear convergence of the alternating direction method of multipliers

  • Mingyi Hong
  • Zhi-Quan Luo
Full Length Paper

Abstract

We analyze the convergence rate of the alternating direction method of multipliers (ADMM) for minimizing the sum of two or more nonsmooth convex separable functions subject to linear constraints. Previous analyses of the ADMM typically assume that the objective function is the sum of only two convex functions defined on two separable blocks of variables, even though the algorithm works well in numerical experiments for three or more blocks. Moreover, there has been no rate-of-convergence analysis for the ADMM without strong convexity in the objective function. In this paper we establish the global R-linear convergence of the ADMM for minimizing the sum of any number of convex separable functions, assuming that a certain error bound condition holds and the dual stepsize is sufficiently small. Such an error bound condition is satisfied, for example, when the feasible set is a compact polyhedron and the objective function consists of a smooth, strictly convex function composed with a linear mapping plus a nonsmooth \(\ell _1\) regularizer. This result implies the linear convergence of the ADMM for contemporary applications such as LASSO without assuming strong convexity of the objective function.
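
To make the LASSO application mentioned above concrete, here is a minimal sketch of ADMM applied to the LASSO problem \(\min_x \tfrac{1}{2}\Vert Ax-b\Vert ^2+\lambda \Vert x\Vert _1\), written in the standard two-block, scaled form popularized in the ADMM literature (e.g., the Boyd et al. monograph). It is an illustration under stated assumptions rather than the exact multi-block scheme with a reduced dual stepsize analyzed in this paper; the function names, data, and parameter values below are hypothetical.

```python
# Illustrative sketch only: standard two-block scaled-form ADMM for LASSO,
# not the specific multi-block variant analyzed in this paper.
import numpy as np

def soft_threshold(v, kappa):
    # Elementwise soft-thresholding: proximal operator of kappa * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def admm_lasso(A, b, lam, rho=1.0, num_iters=200):
    # Minimize 0.5*||A x - b||^2 + lam*||x||_1 by splitting into (x, z)
    # with the linear constraint x - z = 0 and alternating over the two blocks.
    n = A.shape[1]
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable (multiplier divided by rho)
    # Cache a Cholesky factor of (A^T A + rho I); it is reused every iteration.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(num_iters):
        # x-update: solve the smooth quadratic subproblem.
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        # z-update: proximal step for the nonsmooth l1 block.
        z = soft_threshold(x + u, lam / rho)
        # Dual (multiplier) update.
        u = u + x - z
    return z

# Hypothetical usage on random data.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 30))
x_true = rng.standard_normal(30) * (rng.random(30) < 0.2)  # sparse ground truth
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = admm_lasso(A, b, lam=0.1)
```

The dual update above uses the conventional stepsize equal to the penalty parameter \(\rho\); the analysis in this paper instead takes the dual stepsize sufficiently small, a refinement this sketch does not attempt to reproduce.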

Keywords

Linear convergence · Alternating directions of multipliers · Error bound · Dual ascent

Mathematics Subject Classification

49 · 90

Acknowledgments

The authors are grateful to Xiangfeng Wang and Dr. Min Tao of Nanjing University for their constructive comments.


Copyright information

© Springer-Verlag Berlin Heidelberg and Mathematical Optimization Society 2016

Authors and Affiliations

  1. Department of Electrical and Computer Engineering, University of Minnesota, Minneapolis, USA
  2. School of Science and Engineering, The Chinese University of Hong Kong, Shenzhen, China
  3. Department of Industrial and Manufacturing Systems Engineering, Iowa State University, Ames, USA
