A Lagrange–Newton algorithm for sparse nonlinear programming

  • Full Length Paper
  • Series A
  • Published in: Mathematical Programming

Abstract

The sparse nonlinear programming (SNP) problem has wide applications in signal and image processing, machine learning, and finance. However, the computational challenge posed by SNP has not yet been well resolved, owing to the nonconvex and discontinuous \(\ell _0\)-norm involved. In this paper, we address this challenge by developing a fast Newton-type algorithm. As a theoretical cornerstone, we establish a first-order optimality condition for SNP based on the concept of strong \(\beta \)-Lagrangian stationarity via the Lagrangian function, and reformulate it as a system of nonlinear equations called the Lagrangian equations. We discuss the nonsingularity of the corresponding Jacobian and, on that basis, propose the Lagrange–Newton algorithm (LNA). Under mild conditions, we establish the local quadratic convergence of LNA and estimate its iteration complexity. To further demonstrate the efficiency and superiority of the proposed algorithm, we apply LNA to two specific problems arising from compressed sensing and sparse high-order portfolio selection, in which significant benefits accrue from the restricted Newton step.
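The full text is not reproduced here, but the abstract's central idea, taking a Newton step restricted to a working support, can be illustrated on the simplest SNP instance: sparsity-constrained least squares. The sketch below is a generic hard-thresholding-plus-restricted-Newton scheme in the spirit of HTP (see footnote 1), not the authors' LNA; the function name and parameters are illustrative inventions.

```python
import numpy as np

def restricted_newton_ls(A, b, s, iters=20, tol=1e-10):
    """Illustrative sketch for min ||Ax - b||^2 s.t. ||x||_0 <= s.

    NOT the paper's LNA: a generic scheme that alternates support
    selection (hard thresholding of a gradient step) with a Newton
    step restricted to that support. For a quadratic objective the
    restricted Newton step is exact, i.e. a least-squares solve on
    the selected columns.
    """
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        # Support of the s largest entries of a gradient step.
        T = np.argsort(np.abs(x - grad))[-s:]
        # Restricted Newton step: solve the problem on support T only.
        x_new = np.zeros(n)
        x_new[T] = np.linalg.lstsq(A[:, T], b, rcond=None)[0]
        if np.linalg.norm(x_new - x) < tol:
            x = x_new
            break
        x = x_new
    return x
```

For general nonlinear objectives and constraints, the abstract indicates that LNA instead applies Newton's method to the Lagrangian equations derived from the strong \(\beta \)-Lagrangian stationarity condition; the restriction to a small support is what keeps each Newton system cheap.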


Notes

  1. HTP is available at: https://github.com/foucart/HTP.

  2. NIHT, GP and OMP are available at https://www.southampton.ac.uk/engineering/about/staff/tb1m08.page#software. We use the version sparsity_0_5, in which NIHT, GP and OMP are called hard_l0_Mterm, greed_gp and greed_omp, respectively.

  3. CoSaMP and SP are available at: http://media.aau.dk/null_space_pursuits/2011/07/a-few-corrections-to-cosamp-and-sp-matlab.html.

  4. http://cran.r-project.org/web/packages/portfolioBacktest/vignettes.

  5. http://www.mathworks.com/matlabcentral/fileexchange/47839-co_moments-m.

References

  1. Beck, A., Eldar, Y.C.: Sparsity constrained nonlinear optimization: optimality conditions and algorithms. SIAM J. Optim. 23(3), 1480–1509 (2013)

  2. Beck, A., Hallak, N.: On the minimization over sparse symmetric sets: projections, optimality conditions, and algorithms. Math. Oper. Res. 41(1), 196–223 (2015)

  3. Beck, A., Vaisbourd, Y.: The sparse principal component analysis problem: optimality conditions and algorithms. J. Optim. Theory Appl. 170(1), 119–143 (2016)

  4. Blumensath, T., Davies, M.E.: Gradient pursuits. IEEE Trans. Signal Process. 56(6), 2370–2382 (2008)

  5. Blumensath, T., Davies, M.E.: Iterative hard thresholding for compressed sensing. Appl. Comput. Harmon. Anal. 27(3), 265–274 (2009)

  6. Blumensath, T., Davies, M.E.: Normalized iterative hard thresholding: guaranteed stability and performance. IEEE J. Sel. Top. Signal Process. 4(2), 298–309 (2010)

  7. Boudt, K., Lu, W., Peeters, B.: Higher order comoments of multifactor models and asset allocation. Financ. Res. Lett. 13, 225–233 (2015)

  8. Červinka, M., Kanzow, C., Schwartz, A.: Constraint qualifications and optimality conditions for optimization problems with cardinality constraints. Math. Program. 160(1), 353–377 (2016)

  9. Chen, J., Gu, Q.: Fast Newton hard thresholding pursuit for sparsity constrained nonconvex optimization. In: Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 757–766 (2017)

  10. Chen, X., Ge, D., Wang, Z., Ye, Y.: Complexity of unconstrained \(l_2-l_p\) minimization. Math. Program. 143(1–2), 371–383 (2014)

  11. Dai, W., Milenkovic, O.: Subspace pursuit for compressive sensing signal reconstruction. IEEE Trans. Inf. Theory 55(5), 2230–2249 (2009)

  12. Donoho, D.L.: Compressed sensing. IEEE Trans. Inf. Theory 52(4), 1289–1306 (2006)

  13. Elad, M.: Sparse and Redundant Representations. Springer, New York (2010)

  14. Elad, M., Figueiredo, M.A., Ma, Y.: On the role of sparse and redundant representations in image processing. Proc. IEEE 98(6), 972–982 (2010)

  15. Fan, J., Li, R.: Variable selection via nonconcave penalized likelihood and its Oracle properties. J. Am. Stat. Assoc. 96(456), 1348–1360 (2001)

  16. Foucart, S.: Hard thresholding pursuit: an algorithm for compressive sensing. SIAM J. Numer. Anal. 49(6), 2543–2563 (2011)

  17. Gao, J., Li, D.: Optimal cardinality constrained portfolio selection. Oper. Res. 61(3), 745–761 (2013)

  18. Gotoh, J.Y., Takeda, A., Tono, K.: DC formulations and algorithms for sparse optimization problems. Math. Program. 169(1), 141–176 (2018)

  19. Han, S.P.: Superlinearly convergent variable metric algorithms for general nonlinear programming problems. Math. Program. 11, 263–282 (1976)

  20. Koh, K., Kim, S.J., Boyd, S.: An interior-point method for large-scale \(\ell _1\)-regularized logistic regression. J. Mach. Learn. Res. 8, 1519–1555 (2007)

  21. Kyrillidis, A., Becker, S., Cevher, V., Koch, C.: Sparse projections onto the simplex. In: Proceedings of the 30th International Conference on Machine Learning, Atlanta, Georgia, USA, vol. 28, pp. 235–243 (2013)

  22. Lu, Z.: Optimization over sparse symmetric sets via a nonmonotone projected gradient method. arXiv:1509.08581 (2015)

  23. Lu, Z., Zhang, Y.: Sparse approximation via penalty decomposition methods. SIAM J. Optim. 23(4), 2448–2478 (2013)

  24. Luo, Z., Sun, D., Toh, K.C., Xiu, N.: Solving the OSCAR and SLOPE models using a semismooth Newton-based augmented Lagrangian method. J. Mach. Learn. Res. 20(106), 1–25 (2019)

  25. Luo, Z.Q., Pang, J.S., Ralph, D.: Piecewise sequential quadratic programming for mathematical programs with nonlinear complementarity constraints. In: Migdalas, A., Pardalos, P.M., Värbrand, P. (eds.) Multilevel Optimization: Algorithms and Applications. Nonconvex Optimization and Its Applications, vol. 20. Springer, Boston, MA (1998)

  26. Misra, J.: Interactive exploration of microarray gene expression patterns in a reduced dimensional space. Genome Res. 12(7), 1112–1120 (2002)

  27. Natarajan, B.K.: Sparse approximate solutions to linear systems. SIAM J. Comput. 24(2), 227–234 (1995)

  28. Needell, D., Tropp, J.A.: CoSaMP: iterative signal recovery from incomplete and inaccurate samples. Appl. Comput. Harmon. Anal. 26(3), 301–321 (2009)

  29. Negahban, S.N., Ravikumar, P., Wainwright, M.J., Yu, B.: A unified framework for high-dimensional analysis of M-estimators with decomposable regularizers. Stat. Sci. 27(4), 538–557 (2012)

  30. Pan, L., Luo, Z., Xiu, N.: Restricted Robinson constraint qualification and optimality for cardinality-constrained cone programming. J. Optim. Theory Appl. 175(1), 104–118 (2017)

  31. Pan, L., Xiu, N., Fan, J.: Optimality conditions for sparse nonlinear programming. Sci. China Math. 60(5), 759–776 (2017)

  32. Pan, L., Xiu, N., Zhou, S.: On solutions of sparsity constrained optimization. J. Oper. Res. Soc. China 3(4), 421–439 (2015)

  33. Pan, L., Zhou, S., Xiu, N., Qi, H.D.: Convergent iterative hard thresholding for sparsity and nonnegativity constrained optimization. Pac. J. Optim. 13(2), 325–353 (2017)

  34. Pati, Y.C., Rezaiifar, R., Krishnaprasad, P.S.: Orthogonal matching pursuit: recursive function approximation with applications to wavelet decomposition. In: Proceedings of 27th Asilomar Conference on Signals, Systems and Computers, pp. 40–44. IEEE (1993)

  35. Powell, M.J.D.: Restart procedures for the conjugate gradient method. Math. Program. 12, 241–254 (1977)

  36. Tropp, J.A., Gilbert, A.C.: Signal recovery from random measurements via orthogonal matching pursuit. IEEE Trans. Inf. Theory 53(12), 4655–4666 (2007)

  37. Wang, J., Deng, Z., Zheng, T., So, A.M.C.: Sparse high-order portfolios via proximal DCA and SCA. In: ICASSP 2021–2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 5425–5429 (2021)

  38. Wilson, R.B.: A simplicial algorithm for concave programming. Ph.D. thesis, Graduate School of Business Administration, Harvard University (1963)

  39. Xu, F., Lu, Z., Xu, Z.: An efficient optimization approach for a cardinality-constrained index tracking problem. Optim. Methods Softw. 31, 258–271 (2016)

  40. Yin, P., Lou, Y., He, Q., Xin, J.: Minimization of \(\ell _{1-2}\) for compressed sensing. SIAM J. Sci. Comput. 37(1), 536–563 (2015)

  41. Yuan, X., Li, P., Zhang, T.: Gradient hard thresholding pursuit. J. Mach. Learn. Res. 18(166), 1–43 (2018)

  42. Yuan, X., Liu, Q.: Newton greedy pursuit: a quadratic approximation method for sparsity-constrained optimization. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, pp. 4122–4129 (2014)

  43. Yuan, X., Liu, Q.: Newton-type greedy selection methods for \(\ell _0\)-constrained minimization. IEEE Trans. Pattern Anal. Mach. Intell. 39(12), 2437–2450 (2017)

  44. Zhou, S., Xiu, N., Qi, H.: Global and quadratic convergence of Newton hard-thresholding pursuit. J. Mach. Learn. Res. 22, 1–45 (2021)

  45. Zhou, S., Xiu, N., Wang, Y., Kong, L., Qi, H.D.: A null-space-based weighted \(\ell _1\) minimization approach to compressed sensing. Inf. Inference J. IMA 5(1), 76–102 (2016)

  46. Zhou, T., Tao, D., Wu, X.: Manifold elastic net: a unified framework for sparse dimension reduction. Data Min. Knowl. Disc. 22(3), 340–371 (2011)

  47. Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. J. Comput. Graph. Stat. 15(2), 265–286 (2006)

Acknowledgements

We would like to thank the associate editor and the two referees for their valuable comments, which have helped to shorten the paper and greatly improve its presentation. We also thank Dr. Shenglong Zhou of Imperial College London for his great support with the numerical experiments.

Corresponding author

Correspondence to Ziyan Luo.


This research was partially supported by the National Natural Science Foundation of China (11771038, 11971052, 12011530155) and the Beijing Natural Science Foundation (Z190002).

About this article

Cite this article

Zhao, C., Xiu, N., Qi, H. et al. A Lagrange–Newton algorithm for sparse nonlinear programming. Math. Program. 195, 903–928 (2022). https://doi.org/10.1007/s10107-021-01719-x

