
Shannon information in record data

Metrika

Abstract

In this article we present results on the Shannon information (SI) contained in upper (lower) record values and the associated record times in a sequence of i.i.d. continuous random variables. We then establish an interesting relationship between the SI content of a random sample of fixed size and the SI in the data consisting of sequential maxima. We also consider the information contained in record data obtained from an inverse sampling plan (ISP).
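The record data studied in the abstract consist of the upper record values (observations exceeding all previous ones) together with their record times. As an illustrative sketch only — not the paper's method — the following Python function extracts both from an observed sequence; for continuous random variables ties occur with probability zero, so strict inequality suffices.

```python
import random

def upper_records(xs):
    """Return (record_values, record_times) for the sequence xs.

    An observation is an upper record if it strictly exceeds all
    previous observations; by convention the first observation is
    a record, so record_times[0] is always 1 (1-based indexing).
    """
    values, times = [], []
    current_max = float("-inf")
    for i, x in enumerate(xs, start=1):
        if x > current_max:
            current_max = x
            values.append(x)
            times.append(i)
    return values, times

# Demo on a simulated i.i.d. continuous sample.
random.seed(0)
sample = [random.random() for _ in range(20)]
vals, times = upper_records(sample)
print(vals, times)
```

Lower records are obtained symmetrically by reversing the inequality (tracking a running minimum instead of a maximum).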




Corresponding author

Correspondence to Mohsen Madadi.


Cite this article

Madadi, M., Tata, M. Shannon information in record data. Metrika 74, 11–31 (2011). https://doi.org/10.1007/s00184-009-0287-7
