
Abstract

We propose the Bayesian adaptive Lasso (BaLasso) for variable selection and coefficient estimation in linear regression. The BaLasso adapts to the signal level by applying a different amount of shrinkage to each coefficient. Furthermore, motivated by the hierarchical Bayesian interpretation of the Lasso, we provide a model selection machinery for the BaLasso by assessing the posterior conditional mode estimates. Our formulation also permits prediction via a model averaging strategy. We discuss other variants of this new approach and provide a unified framework for variable selection using flexible penalties. The attractiveness of the method is demonstrated empirically through extensive simulation studies and data analysis.
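The hierarchical formulation referenced in the abstract (the Laplace prior written as a scale mixture of normals, with a separate shrinkage parameter for each coefficient) lends itself to Gibbs sampling. The sketch below is an illustrative implementation in the style of Park and Casella's Bayesian Lasso sampler, extended with coefficient-specific shrinkage parameters lambda_j; the Gamma(r, delta) hyperprior, the hyperparameter values, and the update order are assumptions made for illustration, not the paper's exact algorithm.

```python
import numpy as np

def balasso_gibbs(X, y, n_iter=2000, burn=500, r=1.0, delta=0.1, seed=0):
    """Gibbs sampler for a Bayesian adaptive Lasso (illustrative sketch).

    Assumed hierarchy:
      beta_j | tau_j^2, sigma^2 ~ N(0, sigma^2 * tau_j^2)
      tau_j^2 | lambda_j^2      ~ Exp(lambda_j^2 / 2)
      lambda_j^2                ~ Gamma(r, rate=delta)
    so each beta_j carries its own Laplace-type shrinkage level.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    XtX, Xty = X.T @ X, X.T @ y
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # start at least squares
    tau2, lam2, sigma2 = np.ones(p), np.ones(p), 1.0
    draws = []
    for it in range(n_iter):
        # beta | rest ~ N(A^{-1} X'y, sigma^2 A^{-1}),  A = X'X + diag(1/tau2)
        A = XtX + np.diag(1.0 / tau2)
        L = np.linalg.cholesky(A)
        mean = np.linalg.solve(A, Xty)
        beta = mean + np.sqrt(sigma2) * np.linalg.solve(L.T, rng.standard_normal(p))
        # 1/tau_j^2 | rest ~ InverseGaussian(sqrt(lam_j^2 sigma^2 / beta_j^2), lam_j^2)
        mu = np.sqrt(lam2 * sigma2) / np.maximum(np.abs(beta), 1e-10)
        tau2 = 1.0 / rng.wald(mu, lam2)
        # sigma^2 | rest ~ InvGamma((n-1+p)/2, (RSS + beta' D^{-1} beta)/2)
        resid = y - X @ beta
        rate = 0.5 * (resid @ resid + np.sum(beta**2 / tau2))
        sigma2 = rate / rng.gamma((n - 1 + p) / 2.0)
        # lambda_j^2 | rest ~ Gamma(r + 1, rate = delta + tau_j^2 / 2):
        # each coefficient updates its own shrinkage parameter.
        lam2 = rng.gamma(r + 1.0, 1.0 / (delta + 0.5 * tau2))
        if it >= burn:
            draws.append(beta.copy())
    return np.asarray(draws)

# Toy example: sparse truth, moderate noise.
data_rng = np.random.default_rng(1)
X = data_rng.standard_normal((100, 8))
beta_true = np.array([3.0, 1.5, 0.0, 0.0, 2.0, 0.0, 0.0, 0.0])
y = X @ beta_true + data_rng.standard_normal(100)
post = balasso_gibbs(X, y)
print(np.round(post.mean(axis=0), 2))
```

In this sketch the posterior mean of the retained draws shrinks the true zeros strongly toward zero while leaving the large coefficients nearly unbiased, which is the adaptivity property the abstract describes; posterior conditional modes and model-averaged predictions would be computed from the same draws.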



Acknowledgments

The authors would like to thank the referees for their insightful comments, which helped improve the manuscript. The final part of this work was done while M.-N. Tran was visiting the Vietnam Institute for Advanced Study in Mathematics; he thanks the institute for supporting the visit.

Corresponding author

Correspondence to Minh-Ngoc Tran.

Cite this article

Leng, C., Tran, MN. & Nott, D. Bayesian adaptive Lasso. Ann Inst Stat Math 66, 221–244 (2014). https://doi.org/10.1007/s10463-013-0429-6
