A Dual Semismooth Newton Based Augmented Lagrangian Method for Large-Scale Linearly Constrained Sparse Group Square-Root Lasso Problems

Journal of Scientific Computing 96, Article 45 (2023)

Abstract

Square-root Lasso problems are known to be robust regression problems. Moreover, square-root regression problems with structured sparsity play an important role in statistics and machine learning. In this paper, we focus on the numerical solution of large-scale linearly constrained sparse group square-root Lasso problems. To overcome the difficulty posed by the two nonsmooth terms in the objective function, we propose a dual semismooth Newton (SSN) based augmented Lagrangian method (ALM): we apply the ALM to the dual problem, with each subproblem solved by the SSN method. Since the positive definiteness of the generalized Jacobian is crucial for the SSN method, we show that it is equivalent to the constraint nondegeneracy condition of the corresponding primal problem. In the numerical implementation, we fully exploit the second-order sparsity so that the Newton direction can be computed efficiently. Numerical experiments demonstrate the efficiency of the proposed algorithm.
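For concreteness, the model class can be written schematically as follows. This is a sketch of the standard formulation from the square-root and sparse group Lasso literature; the constraint data \(C, d\), the group partition \(G_1,\dots,G_g\), and the weights \(w_l\) are our illustrative notation, and the paper's exact formulation may differ:

\[
\min_{x \in \mathbb{R}^n} \; \|Ax - b\|_2 + \lambda_1 \|x\|_1 + \lambda_2 \sum_{l=1}^{g} w_l \|x_{G_l}\|_2 \quad \text{subject to} \quad Cx = d,
\]

where \(A \in \mathbb{R}^{m \times n}\) is the design matrix, \(b \in \mathbb{R}^m\) the response, and \(\lambda_1, \lambda_2 \ge 0\) the regularization parameters. The two nonsmooth terms are the square-root loss \(\|Ax - b\|_2\) and the sparse group penalty. A basic computational ingredient for prox-based methods on such models, including SSN subproblem solves, is the proximal mapping of the sparse group penalty. The routine below is a minimal stand-alone sketch, assuming uniform group weights and using the known decomposition of this prox (elementwise soft-thresholding followed by blockwise shrinkage); it is illustrative, not the authors' implementation:

import numpy as np

def prox_sparse_group(v, lam1, lam2, groups):
    """Proximal map of lam1*||x||_1 + lam2*sum_l ||x_{G_l}||_2 at v.

    Uses the prox-decomposition property of the sparse group penalty:
    apply the elementwise l1 prox (soft-thresholding) first, then the
    blockwise l2 prox (group shrinkage) on each group. `groups` is a
    list of index arrays partitioning range(len(v)).
    """
    # Elementwise soft-thresholding for the l1 term.
    u = np.sign(v) * np.maximum(np.abs(v) - lam1, 0.0)
    x = np.zeros_like(u)
    for g in groups:
        norm_g = np.linalg.norm(u[g])
        if norm_g > lam2:
            # Blockwise shrinkage; groups with norm <= lam2 stay zero.
            x[g] = (1.0 - lam2 / norm_g) * u[g]
    return x

# Tiny usage example: a 5-vector split into two groups.
v = np.array([3.0, -0.5, 1.0, -2.0, 0.2])
groups = [np.array([0, 1]), np.array([2, 3, 4])]
print(prox_sparse_group(v, lam1=0.5, lam2=1.0, groups=groups))

The piecewise nature of this mapping is, roughly speaking, the source of the second-order sparsity mentioned above: coordinates and groups zeroed by the prox contribute trivial blocks to the generalized Jacobian, so the Newton systems in the SSN method reduce to much smaller ones.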


Data Availability

The datasets analysed during the current study are available at the following link: https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/regression.html

Notes

  1. The assumption \(\mathcal {A}\bar{x}-b=\bar{y}\ne 0\) is reasonable, since it is equivalent to requiring that no overfitting occurs: if \(\mathcal {A}\bar{x}=b\), the fitted model would interpolate the data exactly.
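A standard fact underlying this footnote (a property of the Euclidean norm in general, not specific to this paper): the square-root loss is differentiable exactly where the residual is nonzero,

\[
\nabla \|y\|_2 = \frac{y}{\|y\|_2} \quad (y \ne 0), \qquad \partial \|\cdot\|_2(0) = \{ v : \|v\|_2 \le 1 \},
\]

so the assumption \(\bar{y} \ne 0\) places the solution in the region where the loss term is smooth.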


Acknowledgements

We would like to thank the Editor-in-Chief Professor Chi-Wang Shu and the anonymous referees for their helpful suggestions, which greatly improved the quality of the manuscript.

Funding

Chengjing Wang's work was supported in part by the National Natural Science Foundation of China (No. U21A20169) and the Zhejiang Provincial Natural Science Foundation of China (Grant No. LTGY23H240002). Peipei Tang's work was supported in part by the Scientific Research Foundation of Zhejiang University City College (No. X-202112).

Author information


Corresponding author

Correspondence to Peipei Tang.

Ethics declarations

Conflict of interest

The authors have not disclosed any competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Wang, C., Tang, P. A Dual Semismooth Newton Based Augmented Lagrangian Method for Large-Scale Linearly Constrained Sparse Group Square-Root Lasso Problems. J Sci Comput 96, 45 (2023). https://doi.org/10.1007/s10915-023-02271-w

