Abstract
The lasso of Tibshirani (1996) is a least-squares problem regularized by the ℓ1 norm. Owing to the sparsity-promoting property of the ℓ1 norm, the lasso has received much attention in recent years. In this paper some basic properties of the lasso and of two of its variants are explored. Moreover, iterative algorithms for solving the lasso are presented, including the proximal method and its variants, such as the relaxed proximal algorithm, as well as a dual method.
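As a concrete illustration of the proximal (forward-backward) iteration mentioned above, the following is a minimal sketch of the classical iterative soft-thresholding scheme for the lasso; it is a standard instance of the proximal method, not the specific algorithm of the paper, and the function names and parameter choices here are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximity operator of t * ||.||_1: componentwise shrinkage toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iter=2000):
    """Proximal forward-backward iteration for the lasso:
        minimize 0.5 * ||A x - b||^2 + lam * ||x||_1.
    Each step is a gradient step on the smooth term followed by the
    proximity operator of the ell_1 term (soft thresholding)."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of x -> A^T (A x - b)
    step = 1.0 / L                          # step size ensuring convergence
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)            # forward (gradient) step direction
        x = soft_threshold(x - step * grad, step * lam)  # backward (proximal) step
    return x
```

On a noiseless sparse problem the iterates approach a sparse minimizer; the soft-thresholding step is what produces exact zeros in the estimate.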
References
Candes, E. J., Romberg, J. and Tao, T., Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information, IEEE Trans. Inform. Theory, 52(2), 2006, 489–509.
Candes, E. J., Romberg, J. and Tao, T., Stable signal recovery from incomplete and inaccurate measurements, Comm. Pure Applied Math., 59(2), 2006, 1207–1223.
Candes, E. J. and Tao, T., Near-optimal signal recovery from random projections: Universal encoding strategies? IEEE Trans. Inform. Theory, 52(12), 2006, 5406–5425.
Candes, E. J. and Wakin, M. B., An introduction to compressive sampling, IEEE Signal Processing Magazine, 2008, 21–30.
Cipra, B. A., ℓ1-magic, SIAM News, 39(9), 2006.
Combettes, P. L. and Wajs, R., Signal recovery by proximal forward-backward splitting, Multiscale Model. Simul., 4(4), 2005, 1168–1200.
Donoho, D., Compressed sensing, IEEE Trans. Inform. Theory, 52(4), 2006, 1289–1306.
Friedman, J., Hastie, T. and Tibshirani, R., A note on the group lasso and a sparse group lasso, arXiv:1001.0736v1.
Goebel, K. and Kirk, W. A., Topics in Metric Fixed Point Theory, Cambridge Studies in Advanced Mathematics, Vol. 28, Cambridge University Press, 1990.
Hebiri, M. and van de Geer, S., The smooth-lasso and other ℓ1+ℓ2-penalized methods, Electron. J. Statist., 5, 2011, 1184–1226.
Marino, G. and Xu, H. K., Convergence of generalized proximal point algorithms, Comm. Pure Appl. Anal., 3(3), 2004, 791–808.
Micchelli, C. A., Shen, L. and Xu, Y., Proximity algorithms for image models: Denoising, Inverse Problems, 27, 2011, 045009, 30pp.
Moreau, J. J., Propriétés des applications “prox”, C. R. Acad. Sci. Paris Sér. A Math., 256, 1963, 1069–1071.
Moreau, J. J., Proximité et dualité dans un espace hilbertien, Bull. Soc. Math. France, 93, 1965, 272–299.
Raasch, T., On the L-curve criterion in ℓ1 regularization of linear discrete ill-posed problems, International Conference on Inverse Problems and Related Topics, Nanjing, 2012.
Tibshirani, R., Regression shrinkage and selection via the lasso, J. Royal Statist. Soc. Ser. B, 58, 1996, 267–288.
Tibshirani, R., Saunders, M., Rosset, S., et al., Sparsity and smoothness via the fused lasso, J. Royal Statist. Soc., Ser. B, 67, 2005, 91–108.
Xu, H. K., Averaged mappings and the gradient-projection algorithm, J. Optim. Theory Appl., 150, 2011, 360–378.
Yuan, M. and Lin, Y., Model selection and estimation in regression with grouped variables, J. Royal Statist. Soc., Ser. B, 68, 2006, 49–67.
Zou, H. and Hastie, T., Regularization and variable selection via the elastic net, J. Royal Statist. Soc., Ser. B, 67, 2005, 301–320.
Additional information
This work was supported by NSC 102-2115-M-110-001-MY3.
About this article
Cite this article
Xu, HK. Properties and iterative methods for the lasso and its variants. Chin. Ann. Math. Ser. B 35, 501–518 (2014). https://doi.org/10.1007/s11401-014-0829-9
Keywords
- Lasso
- Elastic net
- Smooth-lasso
- ℓ1 regularization
- Sparsity
- Proximal method
- Dual method
- Projection
- Thresholding