Abstract
We proposed an ADMM-like splitting method in [11] for solving convex minimization problems with linear constraints and multi-block separable objective functions. Its proximal parameter must be sufficiently large to guarantee convergence theoretically, even though a smaller value is preferred for numerical efficiency. Empirically, this method has been applied to various applications with relaxed restrictions on this parameter, yet no rigorous theory was available to guarantee convergence. In this paper, we identify the optimal (smallest) proximal parameter for this method and clarify some ambiguity in selecting it for implementation. For succinctness, we focus on the case where the objective function is the sum of three functions and show that the optimal proximal parameter is 0.5. This optimal proximal parameter renders the regularization of the subproblems indefinite, and thus the convergence analysis differs significantly from those of existing methods of the same kind in the literature, which all require positive definiteness (or positive semi-definiteness plus additional assumptions) of the regularization. We establish the convergence of the improved method with the optimal proximal parameter and estimate its convergence rate in terms of iteration complexity.
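To make the structure concrete, the following toy Python sketch applies the general pattern the abstract describes to a three-variable quadratic program: the first block is updated sequentially, the remaining two blocks in parallel with a proximal term \(\frac{\tau\beta}{2}\|x_i - x_i^k\|^2\) using the optimal parameter \(\tau = 0.5\), followed by a multiplier update. This is an assumption-laden illustration, not the paper's verbatim iteration; the quadratic data, the penalty \(\beta = 1\), and the closed-form subproblem solutions are all hypothetical choices made for the toy example.

```python
# Toy problem (illustrative assumptions, not the exact scheme of [11]):
#   minimize 0.5*(x1-c1)^2 + 0.5*(x2-c2)^2 + 0.5*(x3-c3)^2
#   subject to x1 + x2 + x3 = b.
# The unique solution is x_i = c_i + lam with lam = (b - sum(c))/3.

c = (1.0, 2.0, 3.0)   # hypothetical objective data
b = 2.0               # hypothetical constraint right-hand side
beta = 1.0            # penalty parameter (assumed value)
tau = 0.5             # the optimal proximal parameter identified in the paper

x1 = x2 = x3 = 0.0
lam = 0.0
for _ in range(2000):
    # First block, Gauss-Seidel style:
    #   argmin 0.5*(x1-c1)^2 - lam*x1 + beta/2*(x1 + x2 + x3 - b)^2
    s1 = x2 + x3 - b
    x1 = (c[0] + lam - beta * s1) / (1.0 + beta)
    # Remaining blocks in parallel, each with the proximal term
    #   tau*beta/2*(xi - xi_k)^2 added to its subproblem:
    s2 = x1 + x3 - b
    s3 = x1 + x2 - b
    x2_new = (c[1] + lam - beta * s2 + tau * beta * x2) / (1.0 + beta + tau * beta)
    x3_new = (c[2] + lam - beta * s3 + tau * beta * x3) / (1.0 + beta + tau * beta)
    x2, x3 = x2_new, x3_new
    # Multiplier update:
    lam -= beta * (x1 + x2 + x3 - b)

print(round(x1, 4), round(x2, 4), round(x3, 4))  # approaches -1/3, 2/3, 5/3
```

Even with the indefinite regularization induced by \(\tau = 0.5\), the iterates of this toy instance settle at the KKT point \((-1/3,\, 2/3,\, 5/3)\) with \(\lambda = -4/3\), consistent with the paper's claim that convergence holds at this smallest parameter value.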
Bingsheng He was supported by the NSFC Grant 11471156. Xiaoming Yuan was supported by the General Research Fund from the Hong Kong Research Grants Council: 12300317.
References
C.H. Chen, B.S. He, Y.Y. Ye, X.M. Yuan, The direct extension of ADMM for multi-block convex minimization problems is not necessarily convergent. Math. Program. 155, 57–79 (2016)
E. Esser, M. Möller, S. Osher, G. Sapiro, J. Xin, A convex model for non-negative matrix factorization and dimensionality reduction on physical space. IEEE Trans. Image Process. 21(7), 3239–3252 (2012)
F. Facchinei, J.S. Pang, Finite-Dimensional Variational Inequalities and Complementarity Problems, Vol. I (Springer Series in Operations Research, Springer, 2003)
R. Glowinski, Numerical Methods for Nonlinear Variational Problems (Springer, New York, Berlin, Heidelberg, Tokyo, 1984)
R. Glowinski, A. Marrocco, Sur l'approximation par éléments finis d'ordre un et la résolution par pénalisation-dualité d'une classe de problèmes de Dirichlet non linéaires. Revue Fr. Autom. Inform. Rech. Opér., Anal. Numér. 2, 41–76 (1975)
B.S. He, PPA-like contraction methods for convex optimization: a framework using variational inequality approach. J. Oper. Res. Soc. China 3, 391–420 (2015)
B.S. He, H. Liu, Z.R. Wang, X.M. Yuan, A strictly contractive Peaceman-Rachford splitting method for convex programming. SIAM J. Optim. 24, 1011–1040 (2014)
B.S. He, L.S. Hou, X.M. Yuan, On full Jacobian decomposition of the augmented Lagrangian method for separable convex programming. SIAM J. Optim. 25(4), 2274–2312 (2015)
B.S. He, F. Ma, X.M. Yuan, Indefinite proximal augmented Lagrangian method and its application to full Jacobian splitting for multi-block separable convex minimization problems. IMA J. Numer. Anal. 75, 361–388 (2020)
B.S. He, F. Ma, X.M. Yuan, Optimally linearizing the alternating direction method of multipliers for convex programming. Comput. Optim. Appl. 75(2), 361–388 (2020)
B.S. He, M. Tao, X.M. Yuan, A splitting method for separable convex programming. IMA J. Numer. Anal. 35, 394–426 (2014)
B.S. He, H. Yang, Some convergence properties of a method of multipliers for linearly constrained monotone variational inequalities. Oper. Res. Lett. 23, 151–161 (1998)
B.S. He, X.M. Yuan, On the \(O(1/t)\) convergence rate of the alternating direction method. SIAM J. Numer. Anal. 50, 700–709 (2012)
B.S. He, X.M. Yuan, On non-ergodic convergence rate of Douglas-Rachford alternating directions method of multipliers. Numer. Math. 130, 567–577 (2015)
M.R. Hestenes, Multiplier and gradient methods. J. Optim. Theory Appl. 4, 303–320 (1969)
K.C. Kiwiel, C.H. Rosa, A. Ruszczyński, Proximal decomposition via alternating linearization. SIAM J. Optim. 9, 668–689 (1999)
B. Martinet, Régularisation d'inéquations variationnelles par approximations successives. Rev. Française d'Inform. Recherche Opér. 4, 154–159 (1970)
M.J.D. Powell, A method for nonlinear constraints in minimization problems, in Optimization, ed. by R. Fletcher (Academic Press, New York, NY, 1969), pp. 283–298
R.T. Rockafellar, Monotone operators and the proximal point algorithm. SIAM J. Control Optim. 14, 877–898 (1976)
M. Tao, X.M. Yuan, Recovering low-rank and sparse components of matrices from incomplete and noisy observations. SIAM J. Optim. 21, 57–81 (2011)
R. Tibshirani, M. Saunders, S. Rosset, J. Zhu, K. Knight, Sparsity and smoothness via the fused lasso. J. R. Stat. Soc. 67, 91–108 (2005)
X. Zhou, C. Yang, W. Yu, Moving object detection by detecting contiguous outliers in the low-rank representation. IEEE Trans. Pattern Anal. Mach. Intell. 35, 597–610 (2013)
Z. Zhou, X. Li, J. Wright, E.J. Candes, Y. Ma, Stable principal component pursuit, in Proceedings of the International Symposium on Information Theory, Austin, TX, USA (2010)
Copyright information
© 2021 Springer Nature Singapore Pte Ltd.
Cite this paper
He, B., Yuan, X. (2021). On the Optimal Proximal Parameter of an ADMM-like Splitting Method for Separable Convex Programming. In: Tai, XC., Wei, S., Liu, H. (eds) Mathematical Methods in Image Processing and Inverse Problems. IPIP 2018. Springer Proceedings in Mathematics & Statistics, vol 360. Springer, Singapore. https://doi.org/10.1007/978-981-16-2701-9_8
Publisher Name: Springer, Singapore
Print ISBN: 978-981-16-2700-2
Online ISBN: 978-981-16-2701-9
eBook Packages: Mathematics and Statistics (R0)