Abstract
We obtain sharp minimax results for estimation of an n-dimensional normal mean under quadratic loss. The estimators are chosen by penalized least squares with a penalty that grows like ck log(n/k), for k equal to the number of nonzero elements in the estimating vector. For a wide range of sparse parameter spaces, we show that the penalized estimator achieves the exact minimax rate with the correct multiplication constant if and only if c equals 2. Our results unify the theory obtained by many other authors for penalized estimation of normal means. In particular we establish that a conjecture by Abramovich et al. (Ann Stat 34:584–653, 2006) is true.
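The selection rule described in the abstract can be sketched numerically: for each sparsity level k, the best k-term fit keeps the k largest |y_i|, and one minimizes residual sum of squares plus a penalty growing like ck log(n/k). The sketch below (a minimal illustration, not the authors' implementation; the function name, the sigma handling, and the monotone penalty variant sum_{i<=k} log(n/i), which behaves like k log(n/k), are assumptions) shows the idea for c = 2.

```python
import numpy as np

def penalized_ls_estimate(y, c=2.0, sigma=1.0):
    """Sketch of a penalized least squares estimator for a normal mean.

    The best k-term fit keeps the k largest |y_i|, so the residual sum
    of squares is the sum of the n-k smallest y_i^2.  The penalty used
    here, c*sigma^2 * sum_{i<=k} log(n/i), is a standard monotone
    variant that grows like c*k*log(n/k); it is illustrative, not
    taken verbatim from the paper.
    """
    y = np.asarray(y, dtype=float)
    n = y.size
    sq = np.sort(y ** 2)                       # squared values, ascending
    # rss[k] = sum of the n-k smallest y_i^2, for k = 0..n
    rss = np.concatenate(([sq.sum()], sq.sum() - np.cumsum(sq[::-1])))
    pen = np.concatenate(
        ([0.0], c * sigma ** 2 * np.cumsum(np.log(n / np.arange(1, n + 1))))
    )
    k_hat = int(np.argmin(rss + pen))          # penalized model size
    est = np.zeros(n)
    if k_hat > 0:
        keep = np.argsort(np.abs(y))[-k_hat:]  # indices of the k_hat largest |y_i|
        est[keep] = y[keep]
    return est, k_hat
```

On a vector with two large coordinates amid small noise, the rule keeps exactly those two coordinates and zeros out the rest.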
References
Abramovich F., Benjamini Y., Donoho D.L., Johnstone I.M.: Adapting to unknown sparsity by controlling the false discovery rate. Ann. Stat. 34, 584–653 (2006)
Abramovich F., Grinshtein V., Pensky M.: On optimality of Bayesian testimation in the normal means problem. Ann. Stat. 35, 2261–2286 (2007)
Akaike H.: Information theory and an extension of the maximum likelihood principle. In: Petrov, B.N., Czáki, F. (eds) Second International Symposium on Information Theory, pp. 267–281. Akadémiai Kiadó, Budapest (1973)
Akaike H.: A new look at the statistical model identification. IEEE Trans. Autom. Control 19(6), 716–723 (1974)
Barron A.R., Birgé L., Massart P.: Risk bounds for model selection via penalization. Probab. Theory Relat. Fields 113, 301–413 (1999)
Benjamini Y., Hochberg Y.: Controlling the false discovery rate: A practical and powerful approach to multiple testing. J. R. Stat. Soc. B. 57, 289–300 (1995)
Benjamini Y., Gavrilov Y.: A simple forward selection procedure based on false discovery rate control. Ann. Appl. Stat. 3(1), 179–198 (2009)
Birgé L., Massart P.: Gaussian model selection. J. Eur. Math. Soc. 3, 203–268 (2001)
Birgé L., Massart P.: Minimal penalties for Gaussian model selection. Probab. Theory Relat. Fields 138, 33–73 (2007)
Brown L.D., Low M.G.: Asymptotic equivalence of nonparametric regression and white noise. Ann. Stat. 24, 2384–2398 (1996)
Brown L.D., Cai T.T., Zhou H.H.: Robust nonparametric estimation via wavelet median regression. Ann. Stat. 36, 2055–2084 (2008)
Brown L.D., Cai T., Zhou H.H.: Nonparametric regression in exponential families. Ann. Stat. 38, 2005–2046 (2010)
Cai T.T., Zhou H.H.: Asymptotic equivalence and adaptive estimation for robust nonparametric regression. Ann. Stat. 37, 3204–3235 (2009)
Csörgő S., Mason D.: Central limit theorems for sums of extreme values. Math. Proc. Camb. Philos. Soc. 98, 547– (1985)
Donoho D.L., Johnstone I.M., Hoch J.C., Stern A.S.: Maximum entropy and the nearly black object. J. R. Stat. Soc. B. 54(1), 41–81 (1992)
Donoho D.L., Johnstone I.M.: Ideal spatial adaptation by wavelet shrinkage. Biometrika 81(3), 425–455 (1994)
Donoho D.L., Johnstone I.M.: Minimax risk over ℓp-balls for ℓq-error. Probab. Theory Relat. Fields 99, 277–303 (1994)
Donoho D.L., Johnstone I.M.: Adapting to unknown smoothness via wavelet shrinkage. J. Am. Stat. Assoc. 90, 1200–1224 (1995)
Donoho D.L., Johnstone I.M.: Minimax estimation via wavelet shrinkage. Ann. Stat. 26, 879–921 (1998)
David H.A., Nagaraja H.N.: Order Statistics, 3rd edn. Wiley & Sons, New York (2003)
Efron B.: Robbins, empirical Bayes and microarrays. Ann. Stat. 31, 366–378 (2003)
Foster D.P., George E.I.: The risk inflation criterion for multiple regression. Ann. Stat. 22, 1947–1975 (1994)
Foster D.P., Stine R.A.: Local asymptotic coding and the minimum description length. IEEE Trans. Inf. Theory 45, 1289–1293 (1999)
George E.I., Foster D.P.: Calibration and empirical Bayes variable selection. Biometrika 87, 731–747 (2000)
Golubev G.K., Nussbaum M., Zhou H.H.: Asymptotic equivalence of spectral density estimation and Gaussian white noise. Ann. Stat. 38, 181–214 (2010)
Johnstone I.M.: Minimax Bayes asymptotic minimax and sparse wavelet priors. In: Gupta, S., Berger, J. (eds) Statistical Decision Theory and Related Topics V, pp. 303–326. Springer, Berlin (1994)
Johnstone, I.M.: Gaussian Estimation: Sequence and Multiresolution Models. Unpublished manuscript. http://www-stat.stanford.edu/~imj/ (2011)
Nussbaum M.: Asymptotic equivalence of density estimation and Gaussian white noise. Ann. Stat. 24, 2399–2430 (1996)
Tibshirani R., Knight K.: The covariance inflation criterion for adaptive model selection. J. R. Stat. Soc. B. 61, 529–546 (1999)
Yang Y., Barron A.R.: An asymptotic property of model selection criteria. IEEE Trans. Inf. Theory 44, 117–133 (1998)
Yang Y.: Model selection for nonparametric regression. Stat. Sinica 9, 475–499 (1999)
Additional information
The research of Z. Wu was supported in part by NIH Grant GM590507. The research of H. Zhou was supported in part by NSF Grants DMS-0645676 and DMS-0854975.
Cite this article
Wu, Z., Zhou, H.H. Model selection and sharp asymptotic minimaxity. Probab. Theory Relat. Fields 156, 165–191 (2013). https://doi.org/10.1007/s00440-012-0424-5
Keywords
- FDR
- Minimax estimation
- Model selection
- Multiple comparisons
- Sharp asymptotic minimaxity
- Smoothing parameter selection
- Thresholding
- Wavelet denoising
- Wavelets