
Some Information Theoretic Ideas Useful in Statistical Inference

  • Takis Papaioannou
  • Kosmas Ferentinos
  • Charalampos Tsairidis

Abstract

In this paper we discuss four information theoretic ideas and present their implications for statistical inference: (1) Fisher information and divergence generating functions, (2) information optimum unbiased estimators, (3) information content of various statistics, and (4) characterizations based on Fisher information.
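For orientation, the quantities underlying idea (1) can be sketched in their standard textbook forms (the notation below is generic and need not match the paper's own). The Fisher information about a parameter $\theta$ carried by an observation $X$ with density $f(x;\theta)$ is

$$I_X(\theta) = E_\theta\!\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{2}\right],$$

while the information generating function of a discrete distribution $p=(p_1,p_2,\dots)$ and its relative (divergence) counterpart for a pair of distributions $(p,q)$ are

$$T(u)=\sum_i p_i^{\,u}, \qquad R(u)=\sum_i p_i^{\,u}\,q_i^{\,1-u},$$

whose derivatives at $u=1$ recover the Shannon entropy and the Kullback–Leibler divergence, respectively: $T'(1)=-H(p)$ and $R'(1)=D(p\,\|\,q)$.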

Keywords

Information generating function · Information optimum estimation · Information content · Acid test properties · Quantal random censoring · Koziol–Green model · Truncated data · Characterizations of Fisher information

AMS 2000 Subject Classification

Primary 62B10; Secondary 94A17

Copyright information

© Springer Science+Business Media, LLC 2007

Authors and Affiliations

  • Takis Papaioannou (1), corresponding author
  • Kosmas Ferentinos (2)
  • Charalampos Tsairidis (3)

  1. Department of Statistics and Insurance Science, University of Piraeus, Piraeus, Greece
  2. Department of Mathematics, University of Ioannina, Ioannina, Greece
  3. Department of Social Administration, Democritus University of Thrace, Komotini, Greece
