Analysis of Bayesian LASSO Using High Dimensional Data

  • Xuan Huang
  • Yinsong Ye
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 1117)


Sparse models play an important role in machine learning: they reduce dimensionality and effectively mitigate over-fitting in modeling. Bayesian methods form a prior distribution by fusing different sources of information to obtain high-quality statistical inference. This paper reviews the representative sparse model LASSO and its Bayesian counterpart, the Bayesian LASSO, and discusses the relationship between the two. Through numerical experiments, the two models are compared on variable selection, and the parameter estimates of the Bayesian LASSO under different priors are further analyzed. The results show that both models perform well at variable selection. When the sample size is small, and especially when it is much smaller than the number of features, the advantage of the Bayesian LASSO is more pronounced. The Bayesian LASSO can also estimate the model parameters and provide a Bayesian credible interval for each regression coefficient at a given confidence level, which makes it more flexible and convenient.
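The Bayesian LASSO estimation and credible intervals described above can be sketched with the Gibbs sampler of Park and Casella [7]. The following is a minimal illustrative implementation, not the authors' code; the function name, the fixed penalty λ, and all simulation settings are assumptions for the sketch. It draws the coefficients, the latent scale parameters 1/τⱼ², and the noise variance σ² from their full conditionals.

```python
import numpy as np

def bayesian_lasso_gibbs(X, y, lam=1.0, n_iter=2000, burn=500, seed=0):
    """Minimal Gibbs sampler for the Bayesian LASSO (Park & Casella, 2008).

    Returns posterior draws of the regression coefficients beta
    (shape: (n_iter - burn, p)). `lam` is treated as fixed, not sampled.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = np.zeros(p)
    sigma2 = 1.0
    inv_tau2 = np.ones(p)              # latent scales 1 / tau_j^2
    XtX, Xty = X.T @ X, X.T @ y
    draws = []
    for it in range(n_iter):
        # beta | rest ~ N(A^{-1} X'y, sigma2 * A^{-1}),  A = X'X + diag(1/tau^2)
        A_inv = np.linalg.inv(XtX + np.diag(inv_tau2))
        beta = rng.multivariate_normal(A_inv @ Xty, sigma2 * A_inv)
        # 1/tau_j^2 | rest ~ InverseGaussian(sqrt(lam^2 sigma2 / beta_j^2), lam^2)
        mu = np.sqrt(lam**2 * sigma2 / np.maximum(beta**2, 1e-12))
        inv_tau2 = rng.wald(mu, lam**2)
        # sigma2 | rest ~ InvGamma((n-1)/2 + p/2, ||y-Xb||^2/2 + b'D^{-1}b/2),
        # drawn here by inverting a Gamma variate
        resid = y - X @ beta
        shape = (n - 1) / 2 + p / 2
        scale = resid @ resid / 2 + (beta**2 * inv_tau2).sum() / 2
        sigma2 = 1.0 / rng.gamma(shape, 1.0 / scale)
        if it >= burn:
            draws.append(beta.copy())
    return np.array(draws)
```

Posterior means serve as point estimates, and the credible intervals mentioned in the abstract follow directly from the draws, e.g. `np.percentile(draws, [2.5, 97.5], axis=0)` for 95% intervals; coefficients whose intervals cover zero can be treated as excluded from the model.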


Keywords: Sparsity · Feature selection · LASSO · Bayesian LASSO · Parameter estimation


References

  1. Lorbert, A., Ramadge, P.J.: The pairwise elastic net support vector machine for automatic fMRI feature selection. In: International Conference on Acoustics, Speech, and Signal Processing, pp. 1036–1040 (2013)
  2. Chandran, M.: Analysis of Bayesian group-lasso in regression models. University of Florida, Florida, USA (2011)
  3. Tung, D.T., Tran, M.N., Cuong, T.M.: Bayesian adaptive lasso with variational Bayes for variable selection in high-dimensional generalized linear mixed models. Commun. Stat. Simul. Comput. 48(2), 530–543 (2019)
  4. Bondell, H.D., Reich, B.J.: Simultaneous regression shrinkage, variable selection, and supervised clustering of predictors with OSCAR. Biometrics 64(1), 115–123 (2008)
  5. Liu, J., Cui, L., Liu, Z., et al.: Survey on the regularized sparse models. Chin. J. Comput. 38(7), 1307–1325 (2015)
  6. Zeng, L., Xie, J.: Group variable selection via SCAD-L2. Statistics 48(1), 49–66 (2014)
  7. Park, T., Casella, G.: The Bayesian lasso. J. Am. Stat. Assoc. 103(482), 681–686 (2008)
  8. Alhamzawi, R., Taha Mohammad Ali, A.: A new Gibbs sampler for Bayesian lasso. Commun. Stat. Simul. Comput. 1–17 (2018)
  9. Shang, H., Feng, M., Zhang, B., et al.: Variable selection and outlier detection based on Bayesian lasso method. Appl. Res. Comput. 32(12), 3586–3589 (2015)
  10. Tibshirani, R.: Regression shrinkage and selection via the lasso. J. Roy. Stat. Soc. Ser. B 58(1), 267–288 (1996)
  11. Lu, W., Yu, Z., Gu, Z., et al.: Variable selection using the Lasso-Cox model with Bayesian regularization. In: Conference on Industrial Electronics and Applications, Wuhan, China, pp. 924–927 (2018)
  12. Xu, X., Ghosh, M.: Bayesian variable selection and estimation for group lasso. Bayesian Anal. 10(4), 909–936 (2015)
  13. Botev, Z., Chen, Y., L'Ecuyer, P., et al.: Exact posterior simulation from the linear LASSO regression. In: Winter Simulation Conference, Gothenburg, Sweden, pp. 1706–1717 (2018)

Copyright information

© Springer Nature Singapore Pte Ltd. 2020

Authors and Affiliations

  1. Chengdu College of University of Electronic Science and Technology of China, Chengdu, China
  2. Chongqing University of Posts and Telecommunications, Chongqing, China
