The Bias of the Lasso and Worst Possible Sub-directions

Part of the Lecture Notes in Mathematics book series (LNMECOLE, volume 2159)

Abstract

Bounds for the bias of the Lasso are derived. These bounds are based on so-called worst possible sub-directions, or surrogate versions thereof. Both random and fixed designs are considered. In the fixed-design case, the bounds for groups of variables may differ from those for single variables, due to a different choice of surrogate inverse. An oracle inequality for subsets of the variables is presented under the assumption that the ℓ₁-operator norm of the worst possible sub-direction is small; it is shown that this assumption corresponds to the irrepresentable condition. It is further examined under what circumstances variables with small coefficients are de-selected by the Lasso. To explain the terminology "worst possible sub-direction", a section on the semi-parametric lower bound is included.
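The two phenomena named in the abstract, the shrinkage bias of the Lasso and the de-selection of variables with small coefficients, can be illustrated in the simplest possible setting. This is a minimal sketch, not taken from the chapter: with an orthonormal design (X^T X / n = I) the Lasso has the closed-form soft-thresholding solution, so large coefficients are shrunk by exactly the tuning parameter λ (the bias) and coefficients below λ in magnitude are set to zero (de-selected). All variable names and the chosen dimensions are illustrative.

```python
import numpy as np

# Minimal sketch (illustrative, not the chapter's construction):
# for an orthonormal design with X^T X / n = I, the Lasso with penalty
# lam has the closed-form solution
#     beta_hat_j = sign(z_j) * max(|z_j| - lam, 0),   z = X^T y / n.
# Nonzero coefficients are shrunk toward zero by lam (the Lasso's bias),
# while coefficients with |z_j| <= lam are set exactly to zero (de-selected).

rng = np.random.default_rng(0)
n, p, lam = 100, 5, 0.5
beta = np.array([3.0, -2.0, 0.1, 0.0, 0.0])  # two strong signals, one small

# Build an orthonormal design: columns of X satisfy X^T X / n = I.
Q, _ = np.linalg.qr(rng.standard_normal((n, p)))
X = np.sqrt(n) * Q
y = X @ beta + 0.1 * rng.standard_normal(n)

z = X.T @ y / n                                  # per-coordinate least squares
beta_hat = np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)  # soft-thresholding

print("truth:", beta)
print("lasso:", np.round(beta_hat, 2))
# The strong coefficients come out shrunk by roughly lam = 0.5 in absolute
# value, while the small coefficient 0.1 is de-selected to exactly 0.
```

The shrinkage by λ on the surviving coordinates is precisely the bias that the chapter's bounds quantify in general (non-orthonormal) designs, where the worst possible sub-directions take over the role played here by the coordinate axes.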

Keywords

  • Small Eigenvalue
  • Projection Theory
  • Column Space
  • Nuclear Norm
  • Surrogate Projection





Copyright information

© 2016 Springer International Publishing Switzerland

About this chapter

Cite this chapter

van de Geer, S. (2016). The Bias of the Lasso and Worst Possible Sub-directions. In: Estimation and Testing Under Sparsity. Lecture Notes in Mathematics, vol. 2159. Springer, Cham. https://doi.org/10.1007/978-3-319-32774-7_4
