
Weak convergence under contiguous alternatives of the empirical process when parameters are estimated: The D_k approach

  • G. Neuhaus
Conference paper
Part of the Lecture Notes in Mathematics book series (LNM, volume 566)

Keywords

Weak Convergence · Empirical Process · Reproducing Kernel Hilbert Space · Fredholm Determinant · Asymptotic Power


References

  1. Behnen, K. (1971). Asymptotic optimality and ARE of certain rank-order tests under contiguity. Ann. Math. Statist. 42, 225–229.
  2. Behnen, K. (1972). A characterization of certain rank-order tests with bounds for the asymptotic relative efficiency. Ann. Math. Statist. 43, 1839–1851.
  3. Behnen, K. and Neuhaus, G. (1975). A central limit theorem under contiguous alternatives. Ann. Statist. 3, 1349–1353.
  4. Billingsley, P. (1968). Convergence of Probability Measures. J. Wiley, New York.
  5. Cramér, H. (1945). Mathematical Methods of Statistics. Almqvist and Wiksells, Uppsala; Princeton, 1946.
  6. Darling, D. A. (1955). The Cramér-Smirnov test in the parametric case. Ann. Math. Statist. 26, 1–20.
  7. Doob, J. L. (1949). Heuristic approach to the Kolmogorov-Smirnov theorems. Ann. Math. Statist. 20, 393–403.
  8. Durbin, J. (1973). Weak convergence of the sample distribution function when parameters are estimated. Ann. Statist. 1, 279–290.
  9. Hájek, J. and Šidák, Z. (1967). Theory of Rank Tests. Academic Press, New York.
  10. Kac, M. (1951). On some connections between probability theory and differential and integral equations. Proc. Second Berkeley Symp. Math. Statist. Probab., Univ. of Calif. Press, 189–215.
  11. Kac, M., Kiefer, J. and Wolfowitz, J. (1955). On tests of normality and other tests of goodness of fit based on distance methods. Ann. Math. Statist. 26, 189–211.
  12. Le Cam, L. (1960). Locally asymptotically normal families of distributions. Univ. of Calif. Publ. in Stat. 3, 37–98.
  13. Neuhaus, G. (1971). On weak convergence of stochastic processes with multi-dimensional time parameter. Ann. Math. Statist. 42, 1285–1295.
  14. Neuhaus, G. (1973). Asymptotic properties of the Cramér-von Mises statistic when parameters are estimated. Proc. Prague Symp. on Asymptotic Statistics, Sept. 3–6, 1973 (J. Hájek, ed.), Universita Karlova, Praha, 2, 257–297.
  15. Neuhaus, G. (1973). Zur Verteilungskonvergenz einiger Varianten der Cramér-von Mises-Statistik [On the convergence in distribution of some variants of the Cramér-von Mises statistic]. Math. Operationsforschung u. Statist. 4, 473–484.
  16. Neuhaus, G. (1976). Asymptotic power properties of the Cramér-von Mises test under contiguous alternatives. J. Multivariate Anal. 6, 95–110.
  17. Park, W. J. (1970). A multi-parameter Gaussian process. Ann. Math. Statist. 41, 1582–1595.
  18. Parzen, E. (1959). Statistical inference on time series by Hilbert space methods, I. Technical Report No. 23, Department of Statistics, Stanford Univ.
  19. Sukhatme, S. (1972). Fredholm determinant of a positive kernel of a special type and its applications. Ann. Math. Statist. 43, 1914–1926.

Copyright information

© Springer-Verlag 1976

Authors and Affiliations

  • G. Neuhaus
  1. Math. Institute, University of Giessen, Germany
