Abstract
This chapter delves into three advanced algorithms for convex minimization. The projected gradient algorithm is useful in minimizing a strictly convex quadratic over a closed convex set. Although the algorithm extends to more general convex functions, the best theoretical results are available in this limited setting. We rely on the MM principle to motivate and extend the algorithm. The connections to Dykstra’s algorithm and the contraction mapping principle add to the charm of the subject. On the minus side of the ledger, the projected gradient method can be very slow to converge. This defect is partially offset by ease of coding in many problems.
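The projected gradient iteration described above can be sketched in a few lines. This is a minimal illustration, not the chapter's own code: it minimizes a strictly convex quadratic f(x) = ½xᵀAx − bᵀx over the nonnegative orthant (a simple closed convex set whose projection is coordinatewise clipping), using the standard step size 1/L with L the largest eigenvalue of A. The function name and the particular A, b are hypothetical choices for the example.

```python
import numpy as np

def projected_gradient(A, b, x0, steps=500):
    """Minimize 0.5*x'Ax - b'x over the nonnegative orthant
    by iterating x <- P_C(x - grad/L).  Illustrative sketch only."""
    L = np.linalg.eigvalsh(A)[-1]       # largest eigenvalue: Lipschitz constant of the gradient
    x = np.clip(x0, 0.0, None)          # start inside the constraint set C
    for _ in range(steps):
        grad = A @ x - b                # gradient of the quadratic
        x = np.clip(x - grad / L, 0.0, None)  # gradient step followed by projection onto C
    return x

# Small worked instance (values chosen for illustration):
A = np.array([[2.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -1.0])
x_star = projected_gradient(A, b, np.zeros(2))
```

For this instance the unconstrained minimizer has a negative second coordinate, so the constrained solution pins x₂ = 0 and solves the remaining one-dimensional problem, giving x* = (0.5, 0); the fixed step 1/L makes the map a contraction, echoing the chapter's connection to the contraction mapping principle.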
Copyright information
© 2013 Springer Science+Business Media New York
Cite this chapter
Lange, K. (2013). Convex Minimization Algorithms. In: Optimization. Springer Texts in Statistics, vol 95. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-5838-8_16
Print ISBN: 978-1-4614-5837-1
Online ISBN: 978-1-4614-5838-8
eBook Packages: Mathematics and Statistics