
Generalized estimators, slope, efficiency, and Fisher information bounds

  • Research Paper
  • Published in: Information Geometry

Abstract

Point estimators may not exist, need not be unique, and their distributions are not parameter invariant. Generalized estimators provide distributions that are parameter invariant, unique, and exist when point estimates do not. Comparing point estimators using variance is less useful when estimators are biased. A squared slope \(\Lambda \) is defined that can be used to compare both point and generalized estimators and is unaffected by bias. Fisher information I and variance are fundamentally different quantities: the latter is defined at a distribution that need not belong to a family, while the former cannot be defined without a family of distributions, M. Fisher information and \(\Lambda \) are similar quantities as both are defined on the tangent bundle \(T\!M\) and I provides an upper bound, \(\Lambda \le I\), that holds for all sample sizes—asymptotics are not required. Comparing estimators using \(\Lambda \) rather than variance supports Fisher’s claim that I provides a bound even in small samples. \(\Lambda \)-efficiency is defined that extends the efficiency of unbiased estimators based on variance. While defined by the slope, \(\Lambda \)-efficiency is simply \(\rho ^{2}\), the square of the correlation between estimator and score function.
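The abstract's closing identity invites a quick numerical illustration. Below is a minimal simulation sketch (not from the paper; every name in it is illustrative) assuming the \(N(\theta ,1)\) model, whose score function is \(u(\theta )=\sum _i (x_i-\theta )\). The \(\Lambda \)-efficiency of an estimator is computed as \(\rho ^{2}\), its squared correlation with the score: the sample mean should come out near 1, attaining the bound \(\Lambda \le I\), while the sample median should come out near its classical efficiency \(2/\pi \approx 0.64\) at the normal.

import numpy as np

# Sketch: Lambda-efficiency as rho^2, the squared correlation between an
# estimator and the score function, for X_1, ..., X_n iid N(theta, 1).
rng = np.random.default_rng(1)
theta, n, reps = 0.0, 25, 100_000

x = rng.normal(theta, 1.0, size=(reps, n))
score = (x - theta).sum(axis=1)       # score of N(theta, 1): u = sum(x_i - theta)
mean_est = x.mean(axis=1)             # sample mean
median_est = np.median(x, axis=1)     # sample median

def lambda_efficiency(estimator, score):
    # rho^2: squared correlation of the estimator with the score
    return np.corrcoef(estimator, score)[0, 1] ** 2

print(lambda_efficiency(mean_est, score))    # ~ 1.0: the mean attains Lambda <= I
print(lambda_efficiency(median_est, score))  # ~ 2/pi = 0.64 (already close at n = 25)

For these two unbiased estimators the comparison matches the usual variance-based one; the point of \(\Lambda \)-efficiency is that the same comparison remains meaningful when estimators are biased.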


Data availability

There are no data associated with this work.

Notes

  1. Here, 'large' means large in absolute value.


Author information

Corresponding author

Correspondence to Paul Vos.

Ethics declarations

Conflict of interest

The author declares that there is no conflict of interest.

Additional information

Communicated by Shinto Eguchi.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Vos, P. Generalized estimators, slope, efficiency, and Fisher information bounds. Info. Geo. 7 (Suppl 1), 151–170 (2024). https://doi.org/10.1007/s41884-022-00085-7

