Statistics and Computing, Volume 29, Issue 3, pp 501–516

A consistent and numerically efficient variable selection method for sparse Poisson regression with applications to learning and signal recovery

  • Sabrina Guastavino
  • Federico Benvenuto
Article

Abstract

We propose an adaptive \(\ell _1\)-penalized estimator in the framework of Generalized Linear Models with identity link and Poisson data, taking advantage of a globally quadratic approximation of the Kullback–Leibler divergence. We prove that this approximation is asymptotically unbiased and that the proposed estimator has the variable selection consistency property in a deterministic matrix design framework. Moreover, we present a numerically efficient strategy for computing the proposed estimator, making it suitable for the analysis of massive count datasets. Two numerical experiments show that the method can be applied to both statistical learning and signal recovery problems.
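The estimator described above combines two standard ingredients: a quadratic surrogate of the Poisson data-fit term and adaptive \(\ell _1\) weights built from a pilot estimate. The following minimal sketch illustrates that structure only; it is not the authors' exact estimator. In particular, it uses a plain least-squares fit as a stand-in for the paper's quadratic approximation of the Kullback–Leibler divergence, a ridge pilot estimate (a hypothetical choice), and a basic ISTA loop with soft-thresholding.

```python
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def adaptive_lasso_ista(X, y, lam, gamma=1.0, n_iter=500):
    """Adaptive l1-penalized quadratic fit via ISTA (illustrative sketch).

    Solves  min_beta  0.5 * ||X beta - y||^2 + lam * sum_j w_j * |beta_j|,
    with adaptive weights w_j = 1 / |beta0_j|^gamma computed from a
    ridge-regularized pilot estimate beta0 (Zou-style adaptive lasso).
    """
    n, p = X.shape
    # Pilot estimate: ridge least squares (hypothetical choice of pilot).
    beta0 = np.linalg.solve(X.T @ X + 1e-3 * np.eye(p), X.T @ y)
    w = 1.0 / (np.abs(beta0) ** gamma + 1e-8)   # adaptive per-coefficient weights
    L = np.linalg.norm(X, 2) ** 2               # Lipschitz constant of the gradient
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)             # gradient of the quadratic fit
        beta = soft_threshold(beta - grad / L, lam * w / L)
    return beta
```

Large pilot coefficients get small penalties and small pilot coefficients get large ones, which is what drives the selection-consistency property the paper establishes for its KL-based surrogate.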

Keywords

Adaptive regularization · Lasso · Model selection · Sparse Poisson regression · Statistical learning · Image processing

Mathematics Subject Classification

62G08 · 62G20 · 62J07

Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. Department of Mathematics, Università degli Studi di Genova, Genova, Italy
