
On the Optimal Proximal Parameter of an ADMM-like Splitting Method for Separable Convex Programming

Conference paper

In: Mathematical Methods in Image Processing and Inverse Problems (IPIP 2018). Part of the book series: Springer Proceedings in Mathematics & Statistics, vol. 360.

Abstract

In [11], we proposed an ADMM-like splitting method for solving convex minimization problems with linear constraints and multi-block separable objective functions. To ensure convergence theoretically, its proximal parameter is required to be sufficiently large, even though a smaller value of this parameter is preferred for numerical acceleration. Empirically, this method has been applied to various applications with relaxed restrictions on the parameter, yet no rigorous theory is available to guarantee its convergence. In this paper, we identify the optimal (smallest) proximal parameter for this method and clarify some ambiguity in selecting this parameter for implementation. For succinctness, we focus on the case where the objective function is the sum of three functions, and we show that the optimal proximal parameter is 0.5. This optimal parameter renders the proximal regularization of the subproblems positive-indefinite, so the convergence analysis differs significantly from those of existing methods of the same kind in the literature, all of which require positive definiteness (or positive semi-definiteness plus additional assumptions) of the regularization. We establish the convergence of the improved method with the optimal proximal parameter and estimate its convergence rate in terms of iteration complexity.
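For orientation, the three-block separable model studied here (following the setting of [11]; the notation below is a standard sketch of that problem class, not quoted from this paper) can be written as

\[
\min_{x_1,\, x_2,\, x_3}\;\; \theta_1(x_1) + \theta_2(x_2) + \theta_3(x_3)
\quad \text{s.t.} \quad A_1 x_1 + A_2 x_2 + A_3 x_3 = b,
\]

where each \(\theta_i\) is a closed proper convex function, the \(A_i\) are given matrices, and \(b\) is a given vector. The proximal parameter discussed in the abstract weights the proximal regularization term added to the decomposed subproblems; the paper's result is that the value 0.5 is the smallest choice for which convergence can still be guaranteed.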

Bingsheng He was supported by NSFC Grant 11471156. Xiaoming Yuan was supported by the General Research Fund from the Hong Kong Research Grants Council: 12300317.


References

  1. C.H. Chen, B.S. He, Y.Y. Ye, X.M. Yuan, The direct extension of ADMM for multi-block convex minimization problems is not necessarily convergent. Math. Program. 155, 57–79 (2016)

  2. E. Esser, M. Möller, S. Osher, G. Sapiro, J. Xin, A convex model for non-negative matrix factorization and dimensionality reduction on physical space. IEEE Trans. Image Process. 21(7), 3239–3252 (2012)

  3. F. Facchinei, J.S. Pang, Finite-Dimensional Variational Inequalities and Complementarity Problems, vol. I (Springer Series in Operations Research, Springer, New York, 2003)

  4. R. Glowinski, Numerical Methods for Nonlinear Variational Problems (Springer, New York, 1984)

  5. R. Glowinski, A. Marrocco, Sur l'approximation par éléments finis d'ordre un et la résolution, par pénalisation-dualité, d'une classe de problèmes de Dirichlet non linéaires. Revue Fr. Autom. Inform. Rech. Opér., Anal. Numér. 2, 41–76 (1975)

  6. B.S. He, PPA-like contraction methods for convex optimization: a framework using variational inequality approach. J. Oper. Res. Soc. China 3, 391–420 (2015)

  7. B.S. He, H. Liu, Z.R. Wang, X.M. Yuan, A strictly contractive Peaceman–Rachford splitting method for convex programming. SIAM J. Optim. 24, 1011–1040 (2014)

  8. B.S. He, L.S. Hou, X.M. Yuan, On full Jacobian decomposition of the augmented Lagrangian method for separable convex programming. SIAM J. Optim. 25(4), 2274–2312 (2015)

  9. B.S. He, F. Ma, X.M. Yuan, Indefinite proximal augmented Lagrangian method and its application to full Jacobian splitting for multi-block separable convex minimization problems. IMA J. Numer. Anal. 75, 361–388 (2020)

  10. B.S. He, F. Ma, X.M. Yuan, Optimally linearizing the alternating direction method of multipliers for convex programming. Comput. Optim. Appl. 75(2), 361–388 (2020)

  11. B.S. He, M. Tao, X.M. Yuan, A splitting method for separable convex programming. IMA J. Numer. Anal. 35, 394–426 (2014)

  12. B.S. He, H. Yang, Some convergence properties of a method of multipliers for linearly constrained monotone variational inequalities. Oper. Res. Lett. 23, 151–161 (1998)

  13. B.S. He, X.M. Yuan, On the \(O(1/t)\) convergence rate of the alternating direction method. SIAM J. Numer. Anal. 50, 700–709 (2012)

  14. B.S. He, X.M. Yuan, On non-ergodic convergence rate of Douglas–Rachford alternating directions method of multipliers. Numer. Math. 130, 567–577 (2015)

  15. M.R. Hestenes, Multiplier and gradient methods. J. Optim. Theory Appl. 4, 303–320 (1969)

  16. K.C. Kiwiel, C.H. Rosa, A. Ruszczyński, Proximal decomposition via alternating linearization. SIAM J. Optim. 9, 668–689 (1999)

  17. B. Martinet, Régularisation d'inéquations variationnelles par approximations successives. Rev. Française d'Inform. Recherche Opér. 4, 154–159 (1970)

  18. M.J.D. Powell, A method for nonlinear constraints in minimization problems, in Optimization, ed. by R. Fletcher (Academic Press, New York, 1969), pp. 283–298

  19. R.T. Rockafellar, Monotone operators and the proximal point algorithm. SIAM J. Control Optim. 14, 877–898 (1976)

  20. M. Tao, X.M. Yuan, Recovering low-rank and sparse components of matrices from incomplete and noisy observations. SIAM J. Optim. 21, 57–81 (2011)

  21. R. Tibshirani, M. Saunders, S. Rosset, J. Zhu, K. Knight, Sparsity and smoothness via the fused lasso. J. R. Stat. Soc. Ser. B 67, 91–108 (2005)

  22. X. Zhou, C. Yang, W. Yu, Moving object detection by detecting contiguous outliers in the low-rank representation. IEEE Trans. Pattern Anal. Mach. Intell. 35, 597–610 (2013)

  23. Z. Zhou, X. Li, J. Wright, E.J. Candès, Y. Ma, Stable principal component pursuit, in Proceedings of the International Symposium on Information Theory, Austin, Texas, USA (2010)


Author information

Correspondence to Xiaoming Yuan.


Copyright information

© 2021 Springer Nature Singapore Pte Ltd.


Cite this paper

He, B., Yuan, X. (2021). On the Optimal Proximal Parameter of an ADMM-like Splitting Method for Separable Convex Programming. In: Tai, XC., Wei, S., Liu, H. (eds) Mathematical Methods in Image Processing and Inverse Problems. IPIP 2018. Springer Proceedings in Mathematics & Statistics, vol 360. Springer, Singapore. https://doi.org/10.1007/978-981-16-2701-9_8
