# Empirical and Poisson processes on classes of sets or functions too large for central limit theorems

• R. M. Dudley

## Summary

Let P be the uniform probability law on the unit cube I^d in d dimensions, and P_n the corresponding empirical measure. For various classes 𝒞 of sets A ⊂ I^d, upper and lower bounds are found for the probable size of sup{|(P_n − P)(A)| : A ∈ 𝒞}. If 𝒞 is the collection of lower layers in I^2, or of convex sets in I^3, an asymptotic lower bound is ((log n)/n)^{1/2}(log log n)^{−δ−1/2} for any δ > 0. Thus the law of the iterated logarithm fails for these classes.
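The supremum above can be made concrete by simulation. The sketch below is illustrative only: it restricts attention to anchored rectangles [0, x] × [0, y], a small subfamily of the lower layers in I^2, and evaluates the supremum only at sample-point corners, so it gives a crude lower estimate of the full lower-layer discrepancy that the paper bounds.

```python
import numpy as np

def rectangle_discrepancy(pts):
    """Estimate sup{|(P_n - P)(A)|} over anchored rectangles
    A = [0, x] x [0, y], a subfamily of lower layers in I^2,
    evaluating only at the sample points' coordinates."""
    best = 0.0
    for (x, y) in pts:
        area = x * y  # P(A) for the uniform law P on I^2
        # P_n(A): fraction of sample points falling in [0, x] x [0, y]
        inside = np.mean((pts[:, 0] <= x) & (pts[:, 1] <= y))
        best = max(best, abs(inside - area))
    return best

rng = np.random.default_rng(0)
for n in [100, 1000, 10000]:
    pts = rng.random((n, 2))
    print(n, rectangle_discrepancy(pts))
```

Over this small parametric subfamily the discrepancy decays at the classical n^{−1/2} Kolmogorov–Smirnov-type rate; the paper's point is that over the full (much larger) class of lower layers the supremum is provably larger, by the iterated-logarithm factor quoted above.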

If α > 0, β is the greatest integer < α, and 0 < K < ∞, let 𝒞 be the class of all sets {x_d ≤ f(x_1, ..., x_{d−1})} where f has all its partial derivatives of orders ≤ β bounded by K and those of order β satisfy a uniform Hölder condition |D^p f(x) − D^p f(y)| ≤ K|x − y|^{α−β}. For 0 < α < d − 1 one gets a universal lower bound δn^{−α/(d−1+α)} for a constant δ = δ(d, α) > 0. When α = d − 1 the same lower bound is obtained as for the lower layers in I^2 or convex sets in I^3. For 0 < α ≤ d − 1 there is also an upper bound equal to a power of log n times the lower bound, so the powers of n are sharp.
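As a quick arithmetic check on these rates (reading the lower bound as δ·n^{−α/(d−1+α)}, per the abstract), the snippet below tabulates the decay exponent for a few illustrative (d, α) pairs and confirms that the boundary case α = d − 1 always gives exponent 1/2, matching the n^{−1/2}-type rate quoted for lower layers in I^2 and convex sets in I^3:

```python
# Decay exponent alpha/(d-1+alpha) in the lower bound n^{-alpha/(d-1+alpha)}
# from the abstract; illustrative arithmetic only.
def rate_exponent(d, alpha):
    return alpha / (d - 1 + alpha)

for d, alpha in [(2, 0.5), (3, 1.0), (3, 2.0), (4, 3.0)]:
    print(f"d={d}, alpha={alpha}: n^(-{rate_exponent(d, alpha):.3f})")
```

Smaller α (rougher boundaries) pushes the exponent toward 0, i.e. the discrepancy decays ever more slowly as the class grows; these are exactly the classes "too large" for the central limit theorem of the title.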

## References

1. Bakhvalov [Bahvalov], N.S.: On approximate calculation of multiple integrals (in Russian). Vestnik Mosk. Univ. Ser. Mat. Mekh. Astron. Fiz. Khim. 1959, no. 4, 3–18 (1959)
2. Bennett, G.: Probability inequalities for sums of independent random variables. J. Amer. Statist. Assoc. 57, 33–45 (1962)
3.
4. Dudley, R.M.: Sample functions of the Gaussian process. Ann. Probab. 1, 66–103 (1973)
5. Dudley, R.M.: Metric entropy of some classes of sets with differentiable boundaries. J. Approximation Theory 10, 227–236 (1974); Correction, ibid. 26, 192–193 (1979)
6. Dudley, R.M.: Central limit theorems for empirical measures. Ann. Probab. 6, 899–929 (1978); Correction, ibid. 7, 909–911 (1979)
7. Dudley, R.M.: Lower layers in ℝ² and convex sets in ℝ³ are not GB classes. Lecture Notes in Math. no. 709, 97–102. Berlin-Heidelberg-New York: Springer 1979
8. Evstigneev, I.V.: "Markov times" for random fields. Theor. Probability Appls. 22, 563–569 (1977) = Teor. Verojatnost. i Primenen. 22, 575–581 (1977)
9. Feller, W.: An Introduction to Probability Theory and its Applications, vol. I, 3d ed.; vol. II, 2d ed. New York: Wiley 1968 and 1971
10. Kac, M.: On deviations between theoretical and empirical distributions. Proc. Nat. Acad. Sci. USA 35, 252–257 (1949)
11. Kaufman, R., Philipp, Walter: A uniform law of the iterated logarithm for classes of functions. Ann. Probab. 6, 930–952 (1978)
12. Kaufman, Robert: Smooth functions and Gaussian processes. In: Approximation Theory III, ed. E.W. Cheney, pp. 561–564. New York: Academic Press 1980
13. Kolmogorov, A.N., Tikhomirov, V.M.: ε-entropy and ε-capacity of sets in functional spaces. Amer. Math. Soc. Transl. (Ser. 2) 17, 277–364 (1961) = Uspekhi Mat. Nauk 14, vyp. 2(86), 3–86 (1959)
14. Kuelbs, J., Dudley, R.M.: Log log laws for empirical measures. Ann. Probab. 8, 405–418 (1980)
15. Pyke, R.: The weak convergence of the empirical process with random sample size. Proc. Cambridge Philos. Soc. 64, 155–160 (1968)
16. Rao, R. Ranga: Relations between weak and uniform convergence of measures with applications. Ann. Math. Statist. 33, 659–680 (1962)
17. Révész, P.: On strong approximation of the multidimensional empirical process. Ann. Probab. 4, 729–743 (1976)
18. Révész, P.: Three theorems of multivariate empirical process. In: Empirical Distributions and Processes, ed. P. Gaenssler and P. Révész. Lecture Notes in Math. 566, 106–126. Berlin-Heidelberg-New York: Springer 1976
19. Roberts, A.W., Varberg, D.E.: Convex Functions. New York: Academic Press 1973
20. Schmidt, W.: Irregularities of distribution IX. Acta Arith. 27, 385–396 (1975)
21. Steele, J. Michael: Empirical discrepancies and subadditive processes. Ann. Probab. 6, 118–127 (1978)
22. Stute, W.: Convergence rates for the isotrope discrepancy. Ann. Probab. 5, 707–723 (1977)
23. Sudakov, V.N.: Gaussian and Cauchy measures and ε-entropy. Soviet Math. Doklady 10, 310–313 (1969)
24. Sun, Tze-Gong, Pyke, R.: Weak convergence of empirical processes. Technical Report, Dept. Statist., Univ. Washington, Seattle (1982)
25. Wright, F.T.: The empirical discrepancy over lower layers and a related law of large numbers. Ann. Probab. 9, 323–329 (1981)