Quantile-Based Shannon Entropy for Record Statistics

Abstract

Quantile-based entropy measures possess some unique properties not shared by their distribution-function counterparts. The present communication studies the quantile-based Shannon entropy for record statistics. In this regard, a generalized model is considered for which the cumulative distribution function or probability density function is not available in a tractable form, and various examples are provided for illustration. Further, we consider dynamic versions of the proposed entropy measure for record statistics and give a characterization result for them. Finally, we study the \(F^{\alpha }\)-family of distributions for the proposed entropy measure.

References

  1. Abbasnejad, M., Arghami, N.R.: Renyi entropy properties of order statistics. Commun. Stat. Theory Methods 40, 40–52 (2011)

  2. Abramowitz, M., Stegun, I.A.: Handbook of Mathematical Functions, with Formulas, Graphs, and Mathematical Tables. Dover, New York (1970)

  3. Ahmadi, J., Balakrishnan, N.: Preservation of some reliability properties by certain record statistics. Stat. J. Theoret. Appl. Stat. 39(4), 347–354 (2005)

  4. Ahsanullah, M.: Record Values-Theory and Applications. University Press of America Inc., New York (2004)

  5. Arnold, B.C., Balakrishnan, N., Nagaraja, H.N.: A First Course in Order Statistics. Wiley, New York (1992)

  6. Baratpour, S., Ahmadi, J., Arghami, N.R.: Entropy properties of record statistics. Stat. Pap. 48, 197–213 (2007)

  7. Baratpour, S., Ahmadi, J., Arghami, N.R.: Some characterizations based on entropy of order statistics and record values. Commun. Stat. Theory Methods 36, 47–57 (2007)

  8. Chandler, K.N.: The distribution and frequency of record values. J. R. Stat. Soc. Ser. B 14, 220–228 (1952)

  9. Chaudhry, M.A.: On a family of logarithmic and exponential integrals occurring in probability and reliability theory. J. Austral. Math. Soc. Ser. B 35, 469–478 (1994)

  10. Cook, J.D.: Determining distribution parameters from quantiles. biostats.bepress.com (2010)

  11. David, H.A., Nagaraja, H.N.: Order Statistics. Wiley, Hoboken (2003)

  12. Di Crescenzo, A., Longobardi, M.: Entropy-based measure of uncertainty in past lifetime distributions. J. Appl. Probab. 39, 434–440 (2002)

  13. Ebrahimi, N.: How to measure uncertainty in the residual life time distribution. Sankhya A 58, 48–56 (1996)

  14. Gilchrist, W.: Statistical Modelling with Quantile Functions. Chapman and Hall/CRC, Boca Raton (2000)

  15. Gradshteyn, I., Ryzhik, I.: Tables of Integrals, Series, and Products. Academic Press, New York (1980)

  16. Gupta, R.C., Kirmani, S.N.U.A.: Characterization based on convex conditional mean function. J. Stat. Plan. Inference 138(4), 964–970 (2008)

  17. Hankin, R.K.S., Lee, A.: A new family of non-negative distributions. Austral. N. Z. J. Stat. 48, 67–78 (2006)

  18. Jeffrey, A.: Mathematical Formulas and Integrals. Academic Press, San Diego (1995)

  19. Kamps, U.: Reliability properties of record values from non-identically distributed random variables. Commun. Stat. Theory Methods 23(7), 2102–2112 (1994)

  20. Kamps, U.: A concept of generalized order statistics. J. Stat. Plan. Inference 48(1), 1–23 (1995)

  21. Kayal, S., Tripathy, M.R.: A quantile-based Tsallis-\(\alpha \) divergence. Physica A 492, 496–505 (2018)

  22. Kumar, V.: Generalized entropy measure in record values and its applications. Stat. Probab. Lett. 106, 46–51 (2015)

  23. Kumar, V.: Some results on Tsallis entropy measure and k-record values. Physica A Stat. Mech. Appl. 462, 667–673 (2016)

  24. Kumar, V., Rekha: Quantile approach of dynamic generalized entropy (divergence) measure. Statistica 78(2) (2018)

  25. Madadi, M., Tata, M.: Shannon information in record data. Metrika 74, 11–31 (2011)

  26. Madadi, M., Tata, M.: Shannon information in k-records. Commun. Stat. Theory Methods 43(15), 3286–3301 (2014)

  27. Nair, N.U., Sankaran, P.G., Balakrishnan, N.: Quantile-Based Reliability Analysis. Springer, New York (2013)

  28. Raqab, M.Z., Awad, A.M.: A note on characterization based on Shannon entropy of record statistics. Statistics 35, 411–413 (2001)

  29. Sankaran, P.G., Sunoj, S.M.: Quantile-based cumulative entropies. Commun. Stat. Theory Methods 46(2), 805–814 (2017)

  30. Shannon, C.E.: A mathematical theory of communication. Bell Syst. Tech. J. 27, 379–423, 623–656 (1948)

  31. Sunoj, S.M., Sankaran, P.G.: Quantile based entropy function. Stat. Probab. Lett. 82, 1049–1053 (2012)

  32. Sunoj, S.M., Sankaran, P.G., Nanda, A.K.: Quantile based entropy function in past lifetime. Stat. Probab. Lett. 83(1), 366–372 (2013)

  33. Tarsitano, A.: Estimation of the generalized lambda distribution parameters for grouped data. Commun. Stat. Theory Methods 34(8), 1689–1709 (2005)

  34. Van Staden, P.J., Loots, M.R.: L-moment estimation for the generalized lambda distribution. In: Third Annual ASEARC Conference, Newcastle, Australia (2009)

  35. Zahedi, H., Shakil, M.: Properties of entropies of record values in reliability and life testing context. Commun. Stat. Theory Methods 35(6), 997–1010 (2006)

  36. Zarezadeh, S., Asadi, M.: Results on residual Renyi entropy of order statistics and record values. Inf. Sci. 180(21), 4195–4206 (2010)

Acknowledgements

The first author wishes to acknowledge the Science and Engineering Research Board (SERB), Government of India, for the financial assistance (Ref. No. ECR/2017/001987) for carrying out this research work.

Author information

Corresponding author

Correspondence to Bhawna Dangi.

Appendices

Appendix: Review of Some Simple Mathematical Results

The following definitions and mathematical results will be useful in computing the entropy of record value distributions.

Definition D1 Gamma function. Let \(\alpha > 0\). The integral

$$\begin{aligned} \int _0^{\infty } t^{\alpha -1} e^{-t} {\mathrm{d}}t = \Gamma (\alpha ) \end{aligned}$$
(5.1)

is called the (complete) gamma function.

Definition D2 Incomplete gamma functions. The upper incomplete gamma function is defined as:

$$\begin{aligned} \Gamma (s,x) = \int _x^{\infty } t^{s-1} e^{-t} {\mathrm{d}}t, \end{aligned}$$
(5.2)

whereas the lower incomplete gamma function is defined as:

$$\begin{aligned} \gamma (s,x) = \int _0^x t^{s-1} e^{-t} {\mathrm{d}}t. \end{aligned}$$
(5.3)

Definition D3 Digamma function. The digamma function, denoted by \(\Psi (z)\) and also called the psi function, is defined as

$$\begin{aligned} \Psi (z) = \frac{{\mathrm{d}}}{{\mathrm{d}}z}[\log (\Gamma (z))] = \frac{\Gamma '(z)}{\Gamma (z)}. \end{aligned}$$
(5.4)
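
For numerical work, the functions in Definitions D1-D3 are available in standard scientific libraries. The following is a minimal sketch, not part of the original paper, using SciPy; note that scipy.special exposes the regularized incomplete gamma functions, so they are multiplied by \(\Gamma (s)\) below to recover (5.2) and (5.3). The sample points s and x are illustrative choices.

```python
# A minimal sketch (assuming SciPy) of the quantities in Definitions D1-D3.
from scipy import special

s, x = 2.5, 1.3  # illustrative sample points

complete = special.gamma(s)                 # Gamma(s), Eq. (5.1)
upper = special.gammaincc(s, x) * complete  # Gamma(s, x), Eq. (5.2): rescaled regularized form
lower = special.gammainc(s, x) * complete   # gamma(s, x), Eq. (5.3): rescaled regularized form
psi = special.digamma(s)                    # Psi(s), Eq. (5.4)

print(complete, upper, lower, psi)
```
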
  • Some properties related to incomplete gamma functions

    • \(\Gamma (s+1,x) = s\Gamma (s,x) + x^s e^{-x} \)

    • \(\gamma (s+1,x) = s\gamma (s,x) - x^s e^{-x} \)

    • \(\Gamma (s,x) + \gamma (s,x) = \Gamma (s)\)

    • \(\Gamma (s,0)=\lim _{x\rightarrow \infty } \gamma (s,x) = \Gamma (s)\)

  • Some properties related to digamma function

    • \(\Psi (z+1) = \Psi (z) + (1/z)\)

    • \(\Psi (1) = -\gamma , \text {~where~} \gamma = \lim _{j\rightarrow \infty }[\{ 1/1 + 1/2 + \cdots + 1/(j-1) \} - \ln (j-1)] \approx 0.57721566 \text {~is Euler's constant}.\)

    • \(\Psi (n) = -\gamma + \sum _{k=1}^{n-1}\frac{1}{k},~~~~~~~ \forall \text {~integers~} n\ge 2.\)

    • \(\int _0^{\infty } t^{n-1} e^{-t} \ln (t){\mathrm{d}}t = \Gamma (n)\Psi (n), ~~~\forall \text {~integers~} n\ge 1\).

For proofs and further related topics on these results, see, for example, Abramowitz and Stegun [2], Gradshteyn and Ryzhik [15], Chaudhry [9], and Jeffrey [18].
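
The identities listed above are also easy to check numerically. The sketch below is illustrative only (assuming SciPy and NumPy; the sample points s, x, n are arbitrary choices) and verifies each identity at those points.

```python
# A minimal numerical check of the incomplete gamma and digamma identities above.
import numpy as np
from scipy import special
from scipy.integrate import quad

def upper(s, x):
    return special.gammaincc(s, x) * special.gamma(s)  # Gamma(s, x)

def lower(s, x):
    return special.gammainc(s, x) * special.gamma(s)   # gamma(s, x)

s, x, n = 1.7, 0.9, 4  # illustrative sample points

# Recurrences and complementarity of the incomplete gamma functions
assert np.isclose(upper(s + 1, x), s * upper(s, x) + x**s * np.exp(-x))
assert np.isclose(lower(s + 1, x), s * lower(s, x) - x**s * np.exp(-x))
assert np.isclose(upper(s, x) + lower(s, x), special.gamma(s))

# Digamma recurrence and its value at positive integers
assert np.isclose(special.digamma(s + 1), special.digamma(s) + 1.0 / s)
assert np.isclose(special.digamma(n), -np.euler_gamma + sum(1.0 / k for k in range(1, n)))

# Log-moment integral: int_0^inf t^(n-1) e^(-t) ln(t) dt = Gamma(n) Psi(n)
integral, _ = quad(lambda t: t**(n - 1) * np.exp(-t) * np.log(t), 0, np.inf)
assert np.isclose(integral, special.gamma(n) * special.digamma(n))

print("all identities verified at the chosen sample points")
```
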

Conclusion

Quantile-based entropy measures possess some unique properties not shared by their distribution-function counterparts, and the quantile-based entropy of record statistics has several advantages. Record values can be viewed as order statistics from a sample whose size is determined by the values and the order of occurrence of the observations. They are closely connected with the occurrence times of a corresponding non-homogeneous Poisson process and with reliability theory. The computation of the proposed measure is quite simple in cases where the distribution function is not tractable while the quantile function has a simpler form.
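
As an illustration of this last point, the quantile form of Shannon entropy, \(H = \int _0^1 \log q(u)\,{\mathrm{d}}u\) with quantile density \(q(u) = {\mathrm{d}}Q(u)/{\mathrm{d}}u\) (see Sunoj and Sankaran [31]), can be evaluated directly from a quantile function. The sketch below is not the authors' code; it assumes SciPy, uses the exponential quantile function \(Q(u) = -\lambda ^{-1}\log (1-u)\) with an illustrative rate \(\lambda = 2\), and recovers the known entropy value \(1-\log \lambda \).

```python
# A minimal sketch: Shannon entropy computed directly from a quantile function
# via H = int_0^1 log q(u) du, where q(u) = Q'(u) is the quantile density.
import numpy as np
from scipy.integrate import quad

lam = 2.0  # exponential rate parameter (illustrative choice)

def q(u):
    # Quantile density of the exponential distribution: Q(u) = -log(1-u)/lam
    return 1.0 / (lam * (1.0 - u))

entropy, _ = quad(lambda u: np.log(q(u)), 0.0, 1.0)
print(entropy, 1.0 - np.log(lam))  # both are approximately 0.3069 for lam = 2
```
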

Cite this article

Kumar, V., Dangi, B. Quantile-Based Shannon Entropy for Record Statistics. Commun. Math. Stat. 11, 283–306 (2023). https://doi.org/10.1007/s40304-021-00248-5
