The direct extension of ADMM for multi-block convex minimization problems is not necessarily convergent

  • Full Length Paper
  • Series A
  • Published in: Mathematical Programming

Abstract

The alternating direction method of multipliers (ADMM) is now widely used in many fields, and its convergence was proved when two blocks of variables are alternately updated. It is strongly desirable and practically valuable to extend ADMM directly to the case of a multi-block convex minimization problem whose objective function is the sum of more than two separable convex functions. However, the convergence of this extension has remained open for a long time: neither an affirmative convergence proof nor an example showing its divergence is known in the literature. In this paper we give a negative answer to this long-standing open question: the direct extension of ADMM is not necessarily convergent. We present a sufficient condition to ensure the convergence of the direct extension of ADMM, and give an example to show its divergence.
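
For concreteness, the direct extension discussed above replaces the two-block ADMM sweep by a Gauss-Seidel pass over three blocks followed by one multiplier update. The snippet below is a minimal numerical sketch of that recursion on a toy strongly convex quadratic instance with a linear coupling constraint; the matrices A1, A2, A3, the weights c_i, the penalty parameter beta and all variable names are illustrative assumptions, not taken from the paper, and in particular this is not the paper's divergence example.

    import numpy as np

    # Toy instance (illustrative only, not the paper's counterexample):
    #   minimize  (c1/2)||x1||^2 + (c2/2)||x2||^2 + (c3/2)||x3||^2
    #   subject to A1 x1 + A2 x2 + A3 x3 = b
    rng = np.random.default_rng(0)
    A = [rng.standard_normal((4, 2)) for _ in range(3)]  # A1, A2, A3
    b = rng.standard_normal(4)
    c = [1.0, 1.0, 1.0]        # strong-convexity weights of the three blocks
    beta = 1.0                 # penalty parameter of the augmented Lagrangian
    x = [np.zeros(2) for _ in range(3)]   # primal blocks x1, x2, x3
    lam = np.zeros(4)                     # Lagrange multiplier

    def block_update(i):
        # Closed-form minimizer of the augmented Lagrangian in the i-th block,
        # with the other blocks and the multiplier held fixed.
        r = sum(A[j] @ x[j] for j in range(3) if j != i) - b  # residual of the others
        lhs = c[i] * np.eye(2) + beta * A[i].T @ A[i]
        rhs = A[i].T @ (lam - beta * r)
        return np.linalg.solve(lhs, rhs)

    for k in range(200):
        for i in range(3):               # Gauss-Seidel sweep: the "direct extension"
            x[i] = block_update(i)
        # dual update with the same penalty parameter
        lam = lam - beta * (sum(A[i] @ x[i] for i in range(3)) - b)

    print("constraint violation:",
          np.linalg.norm(sum(A[i] @ x[i] for i in range(3)) - b))

On such a strongly convex toy instance the iterates typically settle down; the paper's message is precisely that this behaviour cannot be taken for granted for general convex objectives, so the sketch above should be read as a description of the iteration, not as evidence of convergence.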

Notes

  1. A more general model with \(m\) blocks of functions and variables was considered in [20]. Here, for notational convenience, we focus on the model (1.1) with \(m=3\); the analysis extends straightforwardly to the general case with a generic \(m\).

References

  1. Bertsekas, D.P.: Constrained Optimization and Lagrange Multiplier Methods. Academic Press, London (1982)

  2. Blum, E., Oettli, W.: Mathematische Optimierung. Grundlagen und Verfahren. Ökonometrie und Unternehmensforschung. Springer, Berlin-Heidelberg-New York (1975)

  3. Boyd, S., Parikh, N., Chu, E., Peleato, B., Eckstein, J.: Distributed optimization and statistical learning via the alternating direction method of multipliers. Found. Trends Mach. Learn. 3, 1–122 (2010)

  4. Chan, T.F., Glowinski, R.: Finite Element Approximation and Iterative Solution of a Class of Mildly Non-linear Elliptic Equations, Technical Report. Stanford University, Stanford, CA (1978)

  5. Chandrasekaran, V., Parrilo, P.A., Willsky, A.S.: Latent variable graphical model selection via convex optimization. Ann. Stat. 40, 1935–1967 (2012)

  6. Eckstein, J., Yao, W.: Augmented Lagrangian and alternating direction methods for convex optimization: A tutorial and some illustrative computational results, manuscript (2012)

  7. Eckstein, J., Bertsekas, D.P.: On the Douglas–Rachford splitting method and the proximal point algorithm for maximal monotone operators. Math. Program. 55, 293–318 (1992)

  8. Fortin, M., Glowinski, R.: On decomposition-coordination methods using an augmented Lagrangian. In: Fortin, M., Glowinski, R. (eds.) Augmented Lagrangian Methods: Applications to the Solution of Boundary Problems. North-Holland, Amsterdam (1983)

  9. Gabay, D., Mercier, B.: A dual algorithm for the solution of nonlinear variational problems via finite element approximations. Comput. Math. Appl. 2, 17–40 (1976)

  10. Glowinski, R.: Numerical Methods for Nonlinear Variational Problems. Springer, Berlin (1984)

  11. Glowinski, R.: On alternating direction methods of multipliers: a historical perspective. In: Springer Proceedings of a Conference Dedicated to J. Periaux (to appear)

  12. Glowinski, R., Marrocco, A.: Approximation par éléments finis d’ordre un et résolution par pénalisation-dualité d’une classe de problèmes non linéaires. R.A.I.R.O. R2, 41–76 (1975)

  13. Gol’shtein, E.G., Tret’yakov, N.V.: Modified Lagrangians in convex programming and their generalizations. Math. Program. Studies 10, 86–97 (1979)

  14. Han, D.R., Yuan, X.M.: A note on the alternating direction method of multipliers. J. Optim. Theory Appl. 155, 227–238 (2012)

  15. He, B.S., Tao, M., Yuan, X.M.: Alternating direction method with Gaussian back substitution for separable convex programming. SIAM J. Optim. 22, 313–340 (2012)

  16. He, B. S., Tao, M., Yuan, X. M.: A splitting method for separable convex programming. IMA J. Numer. Anal. (to appear)

  17. He, B.S., Tao, M., Yuan, X.M.: Convergence rate and iteration complexity on the alternating direction method of multipliers with a substitution procedure for separable convex programming. Math. Oper. Res. (under revision)

  18. He, B.S., Yuan, X.M.: On the \(O(1/n)\) convergence rate of the Douglas–Rachford alternating direction method. SIAM J. Numer. Anal. 50, 700–709 (2012)

  19. Hestenes, M.R.: Multiplier and gradient methods. J. Optim. Theory Appl. 4, 303–320 (1969)

  20. Hong, M., Luo, Z. Q.: On the linear convergence of the alternating direction method of multipliers, manuscript (August 2012)

  21. McLachlan, G.J.: Discriminant Analysis and Statistical Pattern Recognition, vol. 544. Wiley-Interscience, New York (2004)

  22. Mohan, K., London, P., Fazel, M., Witten, D., Lee, S.: Node-based learning of multiple Gaussian graphical models. arXiv:1303.5145 (2013)

  23. Martinet, B.: Régularisation d’inéquations variationnelles par approximations successives. Revue Française d’Informatique et de Recherche Opérationnelle 4, 154–159 (1970)

  24. Peng, Y.G., Ganesh, A., Wright, J., Xu, W.L., Ma, Y.: Robust alignment by sparse and low-rank decomposition for linearly correlated images. IEEE Trans. Pattern Anal. Mach. Intell. 34, 2233–2246 (2012)

  25. Powell, M.J.D.: A method for nonlinear constraints in minimization problems. In: Fletcher, R. (ed.) Optimization, pp. 283–298. Academic Press, New York (1969)

  26. Rockafellar, R.T.: Augmented Lagrangians and applications of the proximal point algorithm in convex programming. Math. Oper. Res. 1, 97–116 (1976)

  27. Tao, M., Yuan, X.M.: Recovering low-rank and sparse components of matrices from incomplete and noisy observations. SIAM J. Optim. 21, 57–81 (2011)

  28. Wen, Z., Goldfarb, D., Yin, W.: Alternating direction augmented Lagrangian methods for semidefinite programming. Math. Program. Comput. 2, 203–230 (2010)

Author information

Corresponding author

Correspondence to Caihua Chen.

Additional information

C. Chen: This author was supported in part by the Natural Science Foundation of Jiangsu Province under Grant No. BK20130550 and by NSFC Grants 11401300 and 11371192.

B. He: This author was supported by NSFC Grants 91130007 and 11471156.

Y. Ye: This author was supported by AFOSR Grant FA9550-12-1-0396.

X. Yuan: This author was supported partially by the General Research Fund from Hong Kong Research Grants Council: 203613.

About this article

Cite this article

Chen, C., He, B., Ye, Y. et al. The direct extension of ADMM for multi-block convex minimization problems is not necessarily convergent. Math. Program. 155, 57–79 (2016). https://doi.org/10.1007/s10107-014-0826-5
