
A Tutorial on Libra: R Package for the Linearized Bregman Algorithm in High-Dimensional Statistics

Handbook of Big Data Analytics

Part of the book series: Springer Handbooks of Computational Statistics (SHCS)

Abstract

The R package Libra stands for the LInearized BRegman Algorithm in high-dimensional statistics. The Linearized Bregman Algorithm is a simple iterative procedure that generates sparse regularization paths for model estimation. The algorithm was first proposed in applied mathematics for image restoration and is particularly suitable for parallel implementation on large-scale problems. Its limit is a sparsity-restricted gradient descent flow, called the Inverse Scale Space, which evolves along a parsimonious path of sparse models from the null model to overfitting ones. In sparse linear regression, these dynamics with early-stopping regularization can provably reach the unbiased oracle estimator under nearly the same conditions as the LASSO, whereas the LASSO estimator is biased. Despite its successful applications, proving the consistency of such dynamical algorithms remains largely open, apart from recent progress on linear regression. In this tutorial, algorithmic implementations in the package are discussed for several widely used sparse models in statistics, including linear regression, logistic regression, and several graphical models (Gaussian, Ising, and Potts). Beyond simulation examples, various applications are demonstrated on real-world datasets, such as the diabetes data, publications of COPSS award winners, and social networks from two Chinese classic novels, Journey to the West and Dream of the Red Chamber.
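To make the iterative procedure described above concrete, here is a minimal, self-contained R sketch of the Linearized Bregman iteration for sparse linear regression. It is not the interface of the Libra package itself; the function names (soft_threshold, lb_linear_path), the damping parameter kappa, the step size alpha, and the default step-size rule are illustrative assumptions.

## Minimal sketch of the Linearized Bregman (LB) iteration for sparse
## linear regression (illustrative only; not the Libra package API).
soft_threshold <- function(z, lambda = 1) sign(z) * pmax(abs(z) - lambda, 0)

lb_linear_path <- function(X, y, kappa = 10, alpha = NULL, n_iter = 500) {
  n <- nrow(X); p <- ncol(X)
  ## Default step size sets alpha * kappa to the reciprocal of the gradient's
  ## Lipschitz constant ||X||_2^2 / n (an illustrative choice, not a package default).
  if (is.null(alpha)) alpha <- n / (kappa * norm(X, "2")^2)
  z    <- rep(0, p)               # unthresholded ("dual") variable
  beta <- rep(0, p)               # sparse coefficient estimate
  path <- matrix(0, n_iter, p)    # row k stores beta after k iterations
  for (k in seq_len(n_iter)) {
    grad <- as.vector(crossprod(X, X %*% beta - y)) / n  # gradient of the squared loss
    z    <- z - alpha * grad                             # gradient step on z
    beta <- kappa * soft_threshold(z, 1)                 # soft-thresholding gives sparsity
    path[k, ] <- beta
  }
  path
}

## Toy usage: recover a sparse signal and plot the regularization path.
set.seed(1)
n <- 100; p <- 20
X <- matrix(rnorm(n * p), n, p)
beta_true <- c(3, -2, rep(0, p - 2))
y <- as.vector(X %*% beta_true + rnorm(n, sd = 0.5))
path <- lb_linear_path(X, y)
matplot(path, type = "l", lty = 1, xlab = "iteration", ylab = "coefficient")

Stopping the iteration early along such a path plays the role of the regularization discussed in the abstract: coefficients enter the model one at a time, moving from the null model toward increasingly dense fits.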

Acknowledgements

The authors would like to thank Chendi Huang, Stanley J. Osher, Ming Yan, and Wotao Yin for helpful discussions. The research of Jiechao Xiong and Yuan Yao was supported in part by the National Basic Research Program of China (grants 2015CB85600 and 2012CB825501), the National Natural Science Foundation of China (grants 61370004 and 11421110001, A3 project), and grants from Baidu and Microsoft Research Asia. The research of Feng Ruan was partially supported by the E.K. Potter Stanford Graduate Fellowship.

Author information

Corresponding author

Correspondence to Yuan Yao.

Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this chapter

Cite this chapter

Xiong, J., Ruan, F., Yao, Y. (2018). A Tutorial on Libra: R Package for the Linearized Bregman Algorithm in High-Dimensional Statistics. In: Härdle, W., Lu, HS., Shen, X. (eds) Handbook of Big Data Analytics. Springer Handbooks of Computational Statistics. Springer, Cham. https://doi.org/10.1007/978-3-319-18284-1_17
