Simple Proof of the Risk Bound for Denoising by Exponential Weights for Asymmetric Noise Distributions

  • Statistics
  • Published in: Journal of Contemporary Mathematical Analysis (Armenian Academy of Sciences)

Abstract

In this note, we consider the problem of aggregation of estimators in order to denoise a signal. The main contribution is a short proof of the fact that the exponentially weighted aggregate satisfies a sharp oracle inequality. While this result was already known for a wide class of symmetric noise distributions, the extension to asymmetric distributions presented in this note is new.
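
As a schematic reminder (this is the standard construction from the exponential-weights literature; the note's exact assumptions, prior, and choice of temperature are not reproduced here, so the notation below is only illustrative): given an observation \(Y = f + \xi \in \mathbb{R}^n\), candidate estimators \(f_1, \dots, f_M\), prior weights \(\pi_1, \dots, \pi_M\) summing to one, and a temperature \(\beta > 0\), the exponentially weighted aggregate is
\[
\hat f_{\mathrm{EWA}} = \sum_{j=1}^{M} \hat\pi_j f_j, \qquad
\hat\pi_j = \frac{\pi_j \exp\bigl(-\|Y - f_j\|_2^2/\beta\bigr)}{\sum_{k=1}^{M} \pi_k \exp\bigl(-\|Y - f_k\|_2^2/\beta\bigr)},
\]
and a sharp oracle inequality is a risk bound with leading constant one, typically of the form
\[
\mathbb{E}\bigl\|\hat f_{\mathrm{EWA}} - f\bigr\|_2^2 \;\le\; \min_{1 \le j \le M} \Bigl\{ \|f_j - f\|_2^2 + \beta \log(1/\pi_j) \Bigr\}.
\]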

Notes

  1. This means that the density of \(\xi_{i}\) is equal to \((2\mu_{i})^{-1}\exp(-|x|/\mu_{i})\).
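
A standard fact, stated here only for convenience (not part of the original note): this is the centered Laplace density with scale parameter \(\mu_i\), so that
\[
\mathbb{E}[\xi_i] = 0, \qquad \operatorname{Var}(\xi_i) = 2\mu_i^{2}.
\]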

Funding

The work of the author was supported by the grant Investissements d’Avenir (ANR-11-IDEX-0003/Labex Ecodec/ANR-11-LABX-0047), the FAST Advance grant and the center Hi! PARIS.

Author information

Corresponding author

Correspondence to A. S. Dalalyan.

Ethics declarations

The author declares that he has no conflicts of interest.

Additional information

Publisher’s Note.

Allerton Press remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Dalalyan, A.S. Simple Proof of the Risk Bound for Denoising by Exponential Weights for Asymmetric Noise Distributions. J. Contemp. Mathemat. Anal. 58, 391–399 (2023). https://doi.org/10.3103/S106836232306002X
