Generalized symmetric ADMM for separable convex optimization

  • Jianchao Bai
  • Jicheng Li
  • Fengmin Xu
  • Hongchao Zhang

Abstract

The alternating direction method of multipliers (ADMM) has proved effective for solving separable convex optimization problems subject to linear constraints. In this paper, we propose a generalized symmetric ADMM (GS-ADMM), which updates the Lagrange multiplier twice with suitable stepsizes, to solve multi-block separable convex programming. GS-ADMM partitions the variables into two groups, one consisting of p block variables and the other of q block variables, where \(p \ge 1\) and \(q \ge 1\) are integers. The two groups are updated in a Gauss–Seidel scheme, while the variables within each group are updated in a Jacobi scheme, which makes the method attractive in a big data setting. By adding proper proximal terms to the subproblems, we specify a domain for the stepsizes that guarantees GS-ADMM is globally convergent with a worst-case \({\mathcal {O}}(1/t)\) ergodic convergence rate. It turns out that this convergence domain of the stepsizes is significantly larger than other convergence domains in the literature, so GS-ADMM is more flexible in choosing and using larger stepsizes for the dual variable. In addition, two special cases of GS-ADMM, which allow using zero penalty terms, are also discussed and analyzed. Preliminary numerical experiments on a sparse matrix minimization problem arising in statistical learning show that the proposed method is effective and promising compared with several state-of-the-art methods.
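To make the group structure concrete, the following schematic iteration sketches one GS-ADMM step for the two-group model \(\min \sum_{i=1}^{p} f_i(x_i) + \sum_{j=1}^{q} g_j(y_j)\) subject to \(\sum_{i=1}^{p} A_i x_i + \sum_{j=1}^{q} B_j y_j = c\). The penalty parameter \(\beta\), dual stepsizes \(\alpha\) and \(s\), and proximal matrices \(P_i, Q_j\) below are generic symbols chosen here for illustration; they need not match the paper's exact notation or its admissible stepsize domain.

\begin{align*}
x_i^{k+1} &\in \arg\min_{x_i}\; {\mathcal {L}}_\beta\bigl(x_1^k,\dots,x_i,\dots,x_p^k,\, y^k,\, \lambda^k\bigr) + \tfrac{1}{2}\bigl\|x_i - x_i^k\bigr\|_{P_i}^2, \quad i = 1,\dots,p \ \text{(in parallel)},\\
\lambda^{k+\frac12} &= \lambda^k - \alpha\beta\Bigl(\textstyle\sum_{i=1}^{p} A_i x_i^{k+1} + \sum_{j=1}^{q} B_j y_j^{k} - c\Bigr),\\
y_j^{k+1} &\in \arg\min_{y_j}\; {\mathcal {L}}_\beta\bigl(x^{k+1},\, y_1^k,\dots,y_j,\dots,y_q^k,\, \lambda^{k+\frac12}\bigr) + \tfrac{1}{2}\bigl\|y_j - y_j^k\bigr\|_{Q_j}^2, \quad j = 1,\dots,q \ \text{(in parallel)},\\
\lambda^{k+1} &= \lambda^{k+\frac12} - s\beta\Bigl(\textstyle\sum_{i=1}^{p} A_i x_i^{k+1} + \sum_{j=1}^{q} B_j y_j^{k+1} - c\Bigr),
\end{align*}

where \({\mathcal {L}}_\beta\) denotes the augmented Lagrangian of the two-group problem and \(\|v\|_M^2 = v^\top M v\). The two dual updates with stepsizes \(\alpha\) and \(s\) give the scheme its "symmetric" character, and the admissible region for \((\alpha, s)\) is the convergence domain analyzed in the paper.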

Keywords

Separable convex programming · Multiple blocks · Parameter convergence domain · Alternating direction method of multipliers · Global convergence · Complexity · Statistical learning

Mathematics Subject Classification

65C60 · 65E05 · 68W40 · 90C06

Notes

Acknowledgements

The authors would like to thank the anonymous referees for their very constructive comments. Jianchao Bai also wishes to thank Prof. Defeng Sun of the National University of Singapore for valuable discussions on ADMM, and Prof. Pingfan Dai of Xi'an Jiaotong University for discussions on an early version of the paper.


Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2017

Authors and Affiliations

  • Jianchao Bai (1)
  • Jicheng Li (1)
  • Fengmin Xu (2)
  • Hongchao Zhang (3)
  1. School of Mathematics and Statistics, Xi'an Jiaotong University, Xi'an, People's Republic of China
  2. School of Economics and Finance, Xi'an Jiaotong University, Xi'an, People's Republic of China
  3. Department of Mathematics, Louisiana State University, Baton Rouge, USA
