
Convolution and Minimax Theorems

  • Aad W. van der Vaart
  • Jon A. Wellner
Part of the Springer Series in Statistics book series (SSS)

Abstract

Let H be a linear subspace of a Hilbert space with inner product (·, ·) and norm ‖·‖. For each n ∈ N and h ∈ H, let P_{n,h} be a probability measure on a measurable space (X_n, A_n). Consider the problem of estimating a “parameter” k_n(h) given an “observation” X_n with law P_{n,h}. The convolution theorem and the minimax theorem give a lower bound on how well k_n(h) can be estimated asymptotically as n → ∞. Suppose the sequence of statistical experiments (X_n, A_n, P_{n,h} : h ∈ H) is “asymptotically normal” and the sequence of parameters is “regular”. Then the limit distribution of every “regular” estimator sequence is the convolution of a certain Gaussian distribution and a noise factor. Furthermore, the maximum risk of any estimator sequence is bounded below by the “risk” of this Gaussian distribution. These concepts are defined as follows.
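Since the chapter body is not reproduced on this page, the following LaTeX sketch records one standard formulation of these concepts from Hájek–Le Cam theory, specialized to a real-valued parameter for simplicity. The symbols Δ_{n,h}, k̇, k̃, M, and Z below are introduced here for illustration; the chapter's own definitions may differ in detail.

% A sketch of the standard Hajek--Le Cam formulations; the chapter's
% precise definitions and statements may differ in detail.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

\textbf{Asymptotic normality.} The experiments
$(\mathcal{X}_n, \mathcal{A}_n, P_{n,h} : h \in H)$ are asymptotically
normal if there exist random variables $\Delta_{n,h}$ such that, for every
$h \in H$,
\[
  \log \frac{dP_{n,h}}{dP_{n,0}} = \Delta_{n,h} - \tfrac12 \|h\|^2,
  \qquad
  \Delta_{n,h} \rightsquigarrow N\bigl(0, \|h\|^2\bigr)
  \ \text{under } P_{n,0},
\]
with $h \mapsto \Delta_{n,h}$ (asymptotically) linear in $h$.

\textbf{Regular parameters.} Real-valued parameters $k_n(h)$ are regular
relative to rates $r_n \to \infty$ if, for every $h \in H$,
\[
  r_n \bigl( k_n(h) - k_n(0) \bigr) \to \dot{k}(h),
\]
for a continuous linear functional $\dot{k}$ on $H$; by Riesz
representation, $\dot{k}(h) = (\tilde{k}, h)$ for an element $\tilde{k}$
of the closure of $H$.

\textbf{Regular estimators.} A sequence $T_n$ is a regular estimator
sequence if, for every $h \in H$,
\[
  r_n \bigl( T_n - k_n(h) \bigr) \rightsquigarrow L
  \ \text{under } P_{n,h},
\]
with a limit law $L$ that does not depend on $h$.

\textbf{Convolution and minimax theorems.} Under the conditions above,
$L = N\bigl(0, \|\tilde{k}\|^2\bigr) * M$ for some noise distribution
$M$, and for every bowl-shaped loss $\ell$ (simplifying the supremum,
which is usually taken over finite subsets of $H$),
\[
  \liminf_{n \to \infty} \sup_{h \in H}
    \operatorname{E}_h \ell \bigl( r_n ( T_n - k_n(h) ) \bigr)
  \;\ge\; \operatorname{E} \ell(Z),
  \qquad Z \sim N\bigl(0, \|\tilde{k}\|^2\bigr).
\]

\end{document}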

Keywords

Loss Function · Coordinate Projection · Separable Banach Space · Estimator Sequence · Brownian Bridge


Copyright information

© Springer Science+Business Media New York 1996

Authors and Affiliations

  • Aad W. van der Vaart (1)
  • Jon A. Wellner (2)
  1. Department of Mathematics and Computer Science, Free University, Amsterdam, The Netherlands
  2. Statistics, University of Washington, Seattle, USA
