Contribution to the Theory of Pitman Estimators

Journal of Mathematical Sciences

New inequalities are proved for the variance of the Pitman estimators (minimum-variance equivariant estimators) of θ constructed from samples of fixed size from populations F(x − θ). The inequalities are closely related to the classical Stam inequality for the Fisher information, its analog in small samples, and a powerful variance drop inequality. The only condition required is finite variance of F; even absolute continuity of F is not assumed. As corollaries of the main inequalities for small samples, one obtains alternative proofs of known properties of the Fisher information, as well as interesting new observations, such as the fact that the variance of the Pitman estimator based on a sample of size n, scaled by n, monotonically decreases in n. Extensions of the results to polynomial versions of the Pitman estimators and to a multivariate location parameter are given. Also, the search for a characterization of the equality conditions in one of the inequalities leads to a Cauchy-type functional equation for independent random variables, and an interesting new behavior of its solutions is described. Bibliography: 21 titles.
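The monotonicity statement lends itself to a quick numerical check. The sketch below is not from the paper: it assumes a standard Laplace population, approximates the Pitman estimator (the posterior mean of the location under a flat prior) on a grid, and estimates n·Var for two sample sizes by Monte Carlo; the function names are illustrative.

```python
import numpy as np

def pitman_estimate(sample, log_f, grid):
    # Pitman (minimum-variance equivariant) location estimate:
    # t_hat = (∫ t ∏_i f(x_i - t) dt) / (∫ ∏_i f(x_i - t) dt),
    # i.e., the posterior mean of t under a flat prior,
    # approximated on a fixed grid of t values.
    ll = log_f(sample[None, :] - grid[:, None]).sum(axis=1)
    w = np.exp(ll - ll.max())            # numerically stabilized likelihood weights
    return np.sum(grid * w) / np.sum(w)

def scaled_variance(n, reps=10000, seed=0):
    # Monte Carlo estimate of n * Var(Pitman estimator) for a
    # standard Laplace population (true location 0, Var(F) = 2).
    rng = np.random.default_rng(seed)
    grid = np.linspace(-12.0, 12.0, 1201)
    log_f = lambda x: -np.abs(x)         # Laplace log-density, up to an additive constant
    ests = [pitman_estimate(rng.laplace(size=n), log_f, grid)
            for _ in range(reps)]
    return n * np.var(ests)
```

For the Laplace case, n·Var equals 2 at n = 2 (there the posterior is symmetric about the midpoint, so the estimator reduces to the sample mean) and decreases toward 1/I(F) = 1 as n grows, in line with the monotonicity result.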


References

  1. S. Artstein, K. M. Ball, F. Barthe, and A. Naor, “Solution of Shannon’s problem on the monotonicity of entropy,” J. Amer. Math. Soc. (Electronic), 17, 975–982 (2004).

  2. A. DasGupta, “Letter to the editors,” IMS Bulletin, 37, 16 (2008).

  3. E. A. Carlen, “Superadditivity of Fisher’s information and logarithmic Sobolev inequalities,” J. Funct. Anal., 101, 194–211 (1991).

  4. B. Efron and C. Stein, “The jackknife estimate of variance,” Ann. Statist., 9, 586–596 (1981).

  5. W. Hoeffding, “A class of statistics with asymptotically normal distribution,” Ann. Math. Statist., 19, 293–325 (1948).

  6. J. Hoffmann-Jørgensen, A. M. Kagan, L. D. Pitt, and L. A. Shepp, “Strong decomposition of random variables,” J. Theor. Probab., 20, 211–220 (2007).

  7. I. A. Ibragimov and R. Z. Hasminskii, Statistical Estimation: Asymptotic Theory, Applications of Mathematics, Vol. 16, Springer, New York (1981).

  8. A. Kagan and Z. Landsman, “Statistical meaning of Carlen’s superadditivity of the Fisher’s information,” Statist. Probab. Lett., 32, 175–179 (1997).

  9. A. M. Kagan and Ya. Malinovsky, “Monotonicity in the sample size of the length of classical confidence intervals,” Statist. Probab. Lett. (2013) (accepted).

  10. A. Kagan, “An inequality for the Pitman estimators related to the Stam inequality,” Sankhyā, Ser. A, 64, 281–292 (2002).

  11. A. M. Kagan, “On the estimation theory of location parameter,” Sankhyā, Ser. A, 28, 335–352 (1966).

  12. A. M. Kagan, “Fisher information contained in a finite-dimensional linear space, and a properly formulated version of the method of moments,” Probl. Peredachi Inform., 12, 20–42 (1976).

  13. A. M. Kagan, L. B. Klebanov, and S. M. Fintušal, “Asymptotic behavior of polynomial Pitman estimators,” Zap. Nauchn. Semin. LOMI, 43, 30–39 (1974).

  14. E. Lukacs, Characteristic Functions, 2nd ed., Hafner Publishing Co., New York (1970).

  15. M. Madiman, A. R. Barron, A. M. Kagan, and T. Yu, “Fundamental limits for distributed estimation: the case of a location parameter,” Preprint (2009).

  16. M. Madiman, A. R. Barron, A. M. Kagan, and T. Yu, “A model for pricing data bundles based on minimax risks for estimation of a location parameter,” in: Proc. IEEE Inform. Theory, Workshop Volos, Greece (June 2009).

  17. M. Madiman and A.R. Barron, “Generalized entropy power inequalities and monotonicity properties of information,” IEEE Trans. Inform. Theory, 53, 2317–2329 (2007).

  18. J. Shao, Mathematical Statistics, 2nd ed., Springer, New York (2003).

  19. N.-Z. Shi, “Letter to the editors,” IMS Bulletin, 36, 4 (2008).

  20. A. J. Stam, “Some inequalities satisfied by the quantities of information of Fisher and Shannon,” Inform. Control, 2, 101–112 (1959).

  21. R. Zamir, “A proof of the Fisher information inequality via a data processing argument,” IEEE Trans. Inform. Theory, 44, 1246–1250 (1998).

Author information

Correspondence to A. M. Kagan.

Additional information

Published in Zapiski Nauchnykh Seminarov POMI, Vol. 408, 2012, pp. 245–267.


About this article

Cite this article

Kagan, A.M., Yu, T., Barron, A. et al. Contribution to the Theory of Pitman Estimators. J Math Sci 199, 202–214 (2014). https://doi.org/10.1007/s10958-014-1847-6
