Abstract
We design, analyze, and test a golden ratio primal–dual algorithm (GRPDA) for solving structured convex optimization problems in which the objective function is the sum of two closed proper convex functions, one of which involves a composition with a linear transform. GRPDA preserves all the favorable features of the classical primal–dual algorithm (PDA): the primal and dual variables are updated in a Gauss–Seidel manner, and the per-iteration cost is dominated by the evaluation of the proximal mappings of the two component functions and two matrix–vector multiplications. In contrast to the classical PDA, which takes an extrapolation step, the novelty of GRPDA is that it is constructed from a convex combination of essentially the whole iteration trajectory. We show that GRPDA converges for a broader range of parameters than the classical PDA, provided that the reciprocal of the convex combination parameter is bounded above by the golden ratio, which explains the name of the algorithm. An \(\mathcal{O}(1/N)\) ergodic convergence rate, measured by the primal–dual gap function, is also established, where N denotes the number of iterations. When either the primal or the dual problem is strongly convex, an accelerated GRPDA is constructed that improves the ergodic rate from \(\mathcal{O}(1/N)\) to \(\mathcal{O}(1/N^2)\). Moreover, for regularized least-squares and linear equality constrained problems, we show that the reciprocal of the convex combination parameter can be extended from the golden ratio to 2 while a relaxation step is taken at the same time. Our preliminary numerical results on LASSO, nonnegative least-squares, and minimax matrix game problems, with comparisons to several related state-of-the-art algorithms, demonstrate the efficiency of the proposed methods.
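To make the algorithmic description above concrete, here is a minimal Python sketch of a golden ratio primal–dual iteration of this kind, applied to the LASSO problem \(\min_x \lambda\|x\|_1 + \tfrac{1}{2}\|Ax-b\|^2\) from the numerical experiments. The update order, the combination rule through the auxiliary point \(z\), and the step-size choice \(\tau\sigma\|A\|^2 = \psi\) are our reconstruction from the abstract, not the authors' exact scheme, and the function names are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal mapping of t*||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def grpda_lasso(A, b, lam, n_iter=1000, psi=1.618):
    """Illustrative golden ratio primal-dual iteration for LASSO,
        min_x  lam*||x||_1 + 0.5*||A x - b||^2,
    with f(x) = lam*||x||_1 and g(w) = 0.5*||w - b||^2 composed with A.
    The combination rule and the step sizes are assumptions reconstructed
    from the abstract, not the paper's exact parameter conditions."""
    m, n = A.shape
    norm_A = np.linalg.norm(A, 2)          # spectral norm ||A||
    tau = sigma = np.sqrt(psi) / norm_A    # assumed bound: tau*sigma*||A||^2 <= psi
    x = np.zeros(n)
    z = np.zeros(n)                        # anchor accumulating the trajectory
    y = np.zeros(m)                        # dual variable
    for _ in range(n_iter):
        # convex combination of (essentially) the whole iteration trajectory,
        # in place of the classical PDA extrapolation step
        z = ((psi - 1.0) * x + z) / psi
        # primal step: prox of tau*f evaluated at a forward point
        x = soft_threshold(z - tau * (A.T @ y), tau * lam)
        # dual step: prox of sigma*g*, where g*(y) = 0.5*||y||^2 + <b, y>
        y = (y + sigma * (A @ x) - sigma * b) / (1.0 + sigma)
    return x
```

For comparison, the classical PDA would instead extrapolate, \(\bar{x}_{k+1} = 2x_{k+1} - x_k\), under a step-size condition of the form \(\tau\sigma\|A\|^2 \le 1\); on our reading, the convex combination through \(z\) is what permits the broader range \(\tau\sigma\|A\|^2 < \psi\), with \(\psi\) up to the golden ratio, claimed above.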
Notes
An operator P is \(\alpha \)-averaged for some \(\alpha \in (0,1)\) if there exists a nonexpansive operator Q such that \(P = (1-\alpha )I +\alpha Q\).
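For example, the proximal mapping of a closed proper convex function is \(\tfrac{1}{2}\)-averaged: \(\mathrm{prox}_f = \tfrac{1}{2}I + \tfrac{1}{2}Q\) with \(Q = 2\,\mathrm{prox}_f - I\) nonexpansive.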
Acknowledgements
Junfeng Yang was supported by NSFC Grants 11922111 and 11771208. The research of Xiaokai Chang was supported by the Innovation Ability Improvement Project of Gansu Province (Grant No. 2020A022) and the Hongliu Foundation of First-class Disciplines of Lanzhou University of Technology, China.
Cite this article
Chang, X., Yang, J. A Golden Ratio Primal–Dual Algorithm for Structured Convex Optimization. J Sci Comput 87, 47 (2021). https://doi.org/10.1007/s10915-021-01452-9
Keywords
- Structured convex optimization
- Saddle point problem
- Primal–dual algorithm
- Golden ratio
- Acceleration
- Convergence rate
- Fixed point iteration