Structured Sparsity Promoting Functions

  • Lixin Shen
  • Bruce W. Suter
  • Erin E. Tripp

Abstract

Motivated by the minimax concave penalty-based variable selection in high-dimensional linear regression, we introduce a simple scheme to construct structured sparsity promoting functions from convex sparsity promoting functions and their Moreau envelopes. Properties of these functions are developed by leveraging their structure. In particular, we provide sparsity guarantees for the general family of functions. We further study the behavior of the proximity operators of several special functions, including indicator functions of closed and convex sets, piecewise quadratic functions, and linear combinations of the two. To demonstrate these properties, several concrete examples are presented and existing instances are featured as special cases.
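The abstract describes the scheme only at a high level, so the following illustration is a sketch based on the well-known relation between the minimax concave penalty (MCP) and the Moreau envelope, not a reproduction of the paper's definitions. Writing env_gamma f for the Moreau envelope of a convex sparsity promoting function f with parameter gamma > 0, the difference f - env_gamma f is again sparsity promoting; taking f(t) = lam*|t| on the real line recovers the MCP, whose proximity operator is the firm-thresholding operator mentioned among the keywords. A minimal NumPy sketch under these assumptions (the names lam, gamma, and step are illustrative, not taken from the paper):

    import numpy as np

    def moreau_envelope_abs(x, lam, gamma):
        """Moreau envelope of f(t) = lam*|t| with parameter gamma,
        evaluated elementwise via its closed form (soft-thresholding)."""
        x = np.asarray(x, dtype=float)
        return np.where(np.abs(x) <= gamma * lam,
                        x**2 / (2.0 * gamma),
                        lam * np.abs(x) - gamma * lam**2 / 2.0)

    def mcp_penalty(x, lam, gamma):
        """Penalty built as f - env_gamma f with f = lam*|.|;
        this difference equals the minimax concave penalty."""
        x = np.asarray(x, dtype=float)
        return lam * np.abs(x) - moreau_envelope_abs(x, lam, gamma)

    def prox_mcp(x, lam, gamma, step=1.0):
        """Proximity operator of the MCP with step size `step`
        (firm thresholding); requires gamma > step so the prox
        subproblem remains convex."""
        x = np.asarray(x, dtype=float)
        out = np.zeros_like(x)
        # Moderate entries are shrunk, but less aggressively than soft-thresholding.
        mid = (np.abs(x) > step * lam) & (np.abs(x) <= gamma * lam)
        out[mid] = np.sign(x[mid]) * (np.abs(x[mid]) - step * lam) * gamma / (gamma - step)
        # Entries beyond the second threshold are left untouched (no bias).
        big = np.abs(x) > gamma * lam
        out[big] = x[big]
        return out

For instance, prox_mcp(np.array([0.5, 2.0, 5.0]), lam=1.0, gamma=3.0) returns approximately [0.0, 1.5, 5.0]: small entries are set to zero, moderate entries are shrunk, and large entries pass through unchanged, which is the thresholding behavior the abstract alludes to.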

Keywords

Moreau envelope · Proximity operator · Variable selection · Sparsity · Thresholding operator

Mathematics Subject Classification

90C26 · 90C30 · 65K99

Acknowledgements

Lixin Shen is partially supported by the US National Science Foundation under Grant DMS-1522332.

Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. Syracuse University, Syracuse, USA
  2. Air Force Research Laboratory, Rome, USA
