## Summary

Denote by $\wp_k$ a class of families $\mathbf{P}=\{P_\theta\}$ of distributions on the line $R^1$ depending on a general scalar parameter $\theta\in\Theta$, $\Theta$ being an interval of $R^1$, and such that the moments $\mu_1(\theta)=\int x\,dP_\theta,\dots,\mu_{2k}(\theta)=\int x^{2k}\,dP_\theta$ are finite, the derivatives $\mu_1'''(\theta),\dots,\mu_k'''(\theta),\ \mu_{k+1}''(\theta),\dots,\mu_{2k}''(\theta)$ exist and are continuous, with $\mu_1'(\theta)\neq 0$, and

$$\mu_{j+1}(\theta)=\mu_1(\theta)\,\mu_j(\theta)+\bigl[\mu_2(\theta)-\mu_1(\theta)^2\bigr]\,\mu_j'(\theta)/\mu_1'(\theta),\qquad j=2,\dots,k.$$

Let $\alpha_1=\bar{x}=(x_1+\dots+x_n)/n$, $\alpha_2=(x_1^2+\dots+x_n^2)/n$, ..., $\alpha_k=(x_1^k+\dots+x_n^k)/n$ denote the sample moments constructed from a sample $x_1,\dots,x_n$ drawn from a population with distribution $P_\theta$. We prove that the method-of-moments estimator of the parameter $\theta$ determined from the equation $\alpha_1=\mu_1(\theta)$, which depends on the observations $x_1,\dots,x_n$ only via the sample mean $\bar{x}$, is asymptotically admissible (and optimal) in the class $\mathscr{I}_k$ of estimators determined by estimating equations of the form $\lambda_0(\theta)+\lambda_1(\theta)\alpha_1+\dots+\lambda_k(\theta)\alpha_k=0$ if and only if $\mathbf{P}\in\wp_k$.
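
As an illustrative check (added here for concreteness, not from the source), the moment recurrence defining $\wp_k$ can be verified for the Poisson family, whose probability function has the exponential form $h(x)\exp\{C_0(\theta)+C_1(\theta)x\}$ with $h(x)=1/x!$, $C_0(\theta)=-\theta$, $C_1(\theta)=\log\theta$, and which therefore should satisfy the recurrence. With

$$\mu_1=\theta,\qquad \mu_2=\theta^2+\theta,\qquad \mu_2-\mu_1^2=\theta,\qquad \mu_1'=1,$$

the case $j=2$ of the recurrence gives

$$\mu_1\mu_2+\bigl(\mu_2-\mu_1^2\bigr)\,\mu_2'/\mu_1'=\theta(\theta^2+\theta)+\theta(2\theta+1)=\theta^3+3\theta^2+\theta,$$

which is indeed the third raw moment $\mu_3$ of the Poisson distribution.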

Asymptotic admissibility (respectively, optimality) means that the variance of the limiting (normal) distribution, as n → ∞, of the estimator normalized in the standard way is smaller than the same characteristic of any other estimator in the class under consideration for at least one θ (respectively, for every θ).
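A small numerical sketch of what this optimality means, under assumed choices not taken from the paper: for the exponential distribution with mean $\theta$ (a family of the form $h(x)\exp\{C_0(\theta)+C_1(\theta)x\}$), the mean-based moment estimator $\hat\theta=\alpha_1=\bar x$ can be compared empirically with a second-moment estimator $\tilde\theta=\sqrt{\alpha_2/2}$ obtained from $\mu_2(\theta)=2\theta^2$.

```python
import numpy as np

def moment_estimator_variances(theta=2.0, n=2000, reps=2000, seed=0):
    """Compare empirical variances of two method-of-moments estimators
    of the mean parameter theta of an exponential distribution:

        mu_1(theta) = theta      ->  theta_hat   = alpha_1 (sample mean)
        mu_2(theta) = 2*theta^2  ->  theta_tilde = sqrt(alpha_2 / 2)
    """
    rng = np.random.default_rng(seed)
    x = rng.exponential(scale=theta, size=(reps, n))
    theta_hat = x.mean(axis=1)                       # estimator from alpha_1
    theta_tilde = np.sqrt((x ** 2).mean(axis=1) / 2)  # estimator from alpha_2
    return theta_hat.var(), theta_tilde.var()

v_mean, v_second = moment_estimator_variances()
# Asymptotically Var(theta_hat) = theta^2/n, while the delta method gives
# Var(theta_tilde) ~ 1.25 * theta^2/n, so the mean-based estimator wins.
print(v_mean < v_second)
```

The simulation reflects the delta-method calculation: $\operatorname{Var}(\bar x)\approx\theta^2/n$ while $\operatorname{Var}(\tilde\theta)\approx 1.25\,\theta^2/n$, consistent with the abstract's claim that for such exponential-type families the estimator based on the sample mean alone is asymptotically optimal.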

There thus arise scales of classes $\wp_1\supset\wp_2\supset\dots$ of parametric families and of classes $\mathscr{I}_1\subset\mathscr{I}_2\subset\dots$ of estimators, related so that the asymptotic admissibility of the method-of-moments estimator in the class $\mathscr{I}_k$ is equivalent to the membership of the family $\mathbf{P}$ in the class $\wp_k$.

The intersection $\bigcap_{k=1}^{\infty}\wp_k$ consists only of the families of distributions with densities of the form $h(x)\exp\{C_0(\theta)+C_1(\theta)x\}$ for which the moment problem is determinate, that is, for which there is no other family with the same moments $\mu_1(\theta),\mu_2(\theta),\dots$

Such scales in the problem of estimating the location parameter were predicted by Linnik about 20 years ago and were constructed by the author in [1] (see also [2, 3]) in an exact, rather than asymptotic, formulation.


## Literature cited

1. A. M. Kagan, “On the estimation theory of location parameter,” Sankhya, __A28__, No. 4, 335–352 (1966).
2. A. M. Kagan, Yu. V. Linnik, and C. R. Rao, Characterization Problems of Mathematical Statistics [in Russian], Nauka, Moscow (1972).
3. S. Zaks, Theory of Statistical Inference, New York (1971).
4. C. R. Rao, Linear Statistical Inference and Its Applications, 2nd ed., New York (1973).
5. V. P. Godambe, “An optimum property of regular maximum likelihood estimation,” Ann. Math. Statist., __31__, No. 4, 1208–1212 (1960).
6. A. M. Kagan, “Fisher information contained in a finite-dimensional linear space, and a well-defined version of the method of moments,” Probl. Peredachi Inf., __XII__, No. 2, 20–42 (1976).
7. I. A. Melamed, “Characterization problems arising in asymptotic estimation of the location and scale parameters,” Soobshch. Akad. Nauk GruzSSR, __76__, No. 2, 293–296 (1974).
8. E. L. Lehmann, Testing Statistical Hypotheses, Wiley, New York (1959).

## Additional information

Translated from Problemy Ustoichivosti Stokhasticheskikh Modelei, pp. 41–47, 1981.

## About this article

### Cite this article

Kagan, A.M. A graded scale of parametric families of distributions, and parameter estimates based on the sample mean.
*J Math Sci* **34**, 1482–1487 (1986). https://doi.org/10.1007/BF01089785

### Keywords

- Parameter Estimate
- Location Parameter
- Parametric Family
- Estimator Equation