Abstract
Results obtained for least-squares loss are now extended to general convex loss. The regularization penalty is again a norm, and the concepts of weak decomposability and "allowedness" are extended as well: the norm used in the penalty is required to have the triangle property. The generalized notion of compatibility is effective sparsity. The oracle results require almost-differentiability of the loss. It is moreover assumed that the population version of the problem, the theoretical risk function, has strict convexity properties; this is called the (two-point) margin condition.
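The abstract's setup can be sketched in formulas. The display below is a hedged reconstruction from the standard norm-penalized M-estimation framework, not quoted from the chapter; the symbols \(R_n\), \(R\), \(\Omega\), \(G\), and \(\tau\) are assumed notation.

```latex
% Hedged sketch of the norm-penalized estimator and the two-point margin
% condition; notation is assumed, not taken verbatim from the chapter.
% Estimator: empirical risk R_n for a convex loss, norm penalty \Omega
% with the triangle property, tuning parameter \lambda > 0:
\[
  \hat\beta \in \arg\min_{\beta \in \mathcal{B}}
    \bigl\{ R_n(\beta) + \lambda\, \Omega(\beta) \bigr\}.
\]
% Two-point margin condition: the theoretical risk R is required to be
% strictly convex in the sense that, for suitable \beta, \beta',
\[
  R(\beta') - R(\beta)
    \;\ge\; \dot R(\beta)^{\top}(\beta' - \beta)
      \;+\; G\bigl(\tau(\beta' - \beta)\bigr),
\]
% where G is an increasing, strictly convex margin function and \tau is a
% semi-norm measuring the distance between the two points.
```

The margin function \(G\) quantifies the curvature of the risk; in the least-squares case it is (up to constants) quadratic, which is why the general result specializes to the earlier least-squares bounds.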
Notes
1. An example where this is not the case is where \(\mathcal{B}\) is a lower-dimensional subspace of \(\bar{\mathcal{B}}\). This is comparable to the situation where one approximates a function (an ∞-dimensional object) by a p-dimensional linear function (with p large).
Copyright information
© 2016 Springer International Publishing Switzerland
About this chapter
Cite this chapter
van de Geer, S. (2016). General Loss with Norm-Penalty. In: Estimation and Testing Under Sparsity. Lecture Notes in Mathematics(), vol 2159. Springer, Cham. https://doi.org/10.1007/978-3-319-32774-7_7
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-32773-0
Online ISBN: 978-3-319-32774-7
eBook Packages: Mathematics and Statistics (R0)