Asymptotic minimax risk for sup-norm loss: Solution via optimal recovery
  • Published: June 1994


  • David L. Donoho1 

Probability Theory and Related Fields, volume 99, pages 145–170 (1994)


Summary

We study the problem of estimating an unknown function on the unit interval (or its k-th derivative), with supremum norm loss, when the function is observed in Gaussian white noise and the unknown function is known only to obey Lipschitz-β smoothness, β>k≧0. We discuss an optimization problem associated with the theory of optimal recovery. Although optimal recovery is concerned with deterministic noise chosen by a clever opponent, the solution of this problem furnishes the kernel of the minimax linear estimate for Gaussian white noise. Moreover, this minimax linear estimator is asymptotically minimax among all estimates. We also sketch applications to higher dimensions and to indirect measurement (e.g. deconvolution) problems.
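The setting above can be illustrated numerically. The sketch below is not the paper's construction: it discretizes the white noise model, estimates a Lipschitz function with a generic linear kernel smoother (a triangular kernel with a hypothetical bandwidth of order (σ² log n / n)^{1/3}, standing in for the optimal-recovery kernel the paper derives), and reports the resulting sup-norm loss.

```python
import numpy as np

# Illustrative sketch only: a linear kernel estimate of a Lipschitz-1
# function observed in discretized Gaussian white noise, evaluated under
# sup-norm loss. The triangular kernel and the bandwidth rule are
# hypothetical stand-ins, not the optimal-recovery solution of the paper.

rng = np.random.default_rng(0)

n = 1000                     # number of equispaced design points
sigma = 0.1                  # noise level
t = np.arange(n) / n         # design points on the unit interval
f = np.abs(t - 0.5)          # a Lipschitz(1) target function
y = f + sigma * rng.standard_normal(n)   # noisy observations

# Bandwidth of order (sigma^2 * log n / n)^(1/3): sup-norm loss over
# Lipschitz-1 classes carries an extra log factor versus pointwise loss.
h = (sigma**2 * np.log(n) / n) ** (1.0 / 3.0)

def kernel_estimate(t0):
    """Linear estimate f_hat(t0) = sum_i w_i(t0) y_i, triangular kernel."""
    w = np.maximum(1.0 - np.abs(t - t0) / h, 0.0)
    return np.dot(w, y) / w.sum()

f_hat = np.array([kernel_estimate(t0) for t0 in t])
sup_loss = np.max(np.abs(f_hat - f))
print(f"sup-norm loss: {sup_loss:.4f}")
```

With these (arbitrary) choices the sup-norm error is small but dominated by the maximal stochastic fluctuation over the interval, which is the phenomenon the minimax theory quantifies exactly.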



Author information

Authors and Affiliations

  1. Statistics Department, Stanford University, Sequoia Hall, Stanford, CA 94305, USA

    David L. Donoho


Additional information

Dedicated to R.Z. Khas'minskii for his 60th birthday


About this article

Cite this article

Donoho, D.L. Asymptotic minimax risk for sup-norm loss: Solution via optimal recovery. Probab. Th. Rel. Fields 99, 145–170 (1994). https://doi.org/10.1007/BF01199020


  • Received: 29 December 1992

  • Revised: 04 February 1994

  • Issue Date: June 1994

  • DOI: https://doi.org/10.1007/BF01199020


Mathematics Subject Classifications

  • 62G07
  • 62C20
  • 60G70
  • 41A25