On Partial Monotonic Behaviour of Past Entropy and Convolution of Extropy

  • Conference paper

Part of the book series: Studies in Computational Intelligence (SCI, volume 863)

Abstract

Shannon [20] gave a measure of uncertainty that plays a pivotal role in the field of communication theory. Since the proposal of Shannon entropy, many other entropy measures have been introduced and found useful in different areas. Recently, Frank et al. [6] gave a complementary dual of Shannon entropy and named it 'extropy'. The uncertainty associated with a random experiment is expected to decrease when the interval known to contain the outcome becomes smaller. However, this result does not hold in general for entropies of absolutely continuous random variables. In the present paper we define the conditional cumulative past entropy and give the conditions necessary for its partial monotonic behaviour. Further, a result on the convolution of extropy is presented.
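The two notions in the abstract can be checked numerically. The sketch below (not part of the paper; it uses the discrete extropy of Frank et al. [6] and a hand-picked piecewise-uniform density as assumptions) verifies the duality identity H(p) + J(p) = Σᵢ Hb(pᵢ), where Hb is the binary entropy, and shows that differential entropy can *increase* when the outcome is confined to a smaller interval, which is the failure of monotonicity the paper addresses:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i (natural log)."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p))

def extropy(p):
    """Extropy J(p) = -sum (1 - p_i) log(1 - p_i), the complementary dual."""
    p = np.asarray(p, dtype=float)
    return -np.sum((1 - p) * np.log(1 - p))

p = np.array([0.5, 0.25, 0.25])
H, J = entropy(p), extropy(p)
# Duality: H(p) + J(p) equals the sum of binary entropies of the p_i.
dual = sum(-q * np.log(q) - (1 - q) * np.log(1 - q) for q in p)
print(np.isclose(H + J, dual))  # True

# Differential entropy need not shrink with the interval: take a density
# with mass 1/2 packed into [0, 0.01] (height 50) and mass 1/2 spread
# over [0.01, 1] (height 0.5/0.99).  Its differential entropy is very
# negative because of the spike.
h_full = -(0.5 * np.log(50.0) + 0.5 * np.log(0.5 / 0.99))
# Given X in [0.01, 1], the conditional density is uniform of height
# 1/0.99, so the conditional differential entropy is log(0.99).
h_cond = np.log(0.99)
print(h_cond > h_full)  # True: a smaller interval, yet larger entropy
```

The counterexample density is illustrative only; any density with enough mass concentrated on a short subinterval behaves the same way.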


References

  1. Ash, R.B.: Information Theory. Dover Publications Inc., New York (1990)

  2. Bagnoli, M., Bergstrom, T.: Log-concave probability and its applications. Econ. Theory 26(2), 445–469 (2005)

  3. Chen, J.: A partial order on uncertainty and information. J. Theor. Probab. 26(2), 349–359 (2013)

  4. Cover, T.M., Thomas, J.A.: Elements of Information Theory. Wiley, Hoboken (2012)

  5. Di Crescenzo, A., Longobardi, M.: On cumulative entropies. J. Stat. Plann. Infer. 139(12), 4072–4087 (2009)

  6. Frank, L., Sanfilippo, G., Agro, G.: Extropy: complementary dual of entropy. Stat. Sci. 30(1), 40–58 (2015)

  7. Gupta, N., Bajaj, R.K.: On partial monotonic behaviour of some entropy measures. Stat. Probab. Lett. 83(5), 1330–1338 (2013)

  8. Hooda, D.: A coding theorem on generalized r-norm entropy. Korean J. Comput. Appl. Math. 8(3), 657–664 (2001)

  9. Jose, J., Abdul Sathar, E.: Residual extropy of k-record values. Stat. Probab. Lett. 146, 1–6 (2019)

  10. Kapur, J.N.: Generalized entropy of order \(\alpha \) and type \(\beta \). In: The Math. Seminar, vol. 4, pp. 78–82 (1967)

  11. Kovačević, M., Stanojević, I., Šenk, V.: Some properties of Rényi entropy over countably infinite alphabets. Probl. Inf. Transm. 49, 99–110 (2013)

  12. Maszczyk, T., Duch, W.: Comparison of Shannon, Renyi and Tsallis entropy used in decision trees. In: International Conference on Artificial Intelligence and Soft Computing, vol. 5097, pp. 643–651 (2008)

  13. Qiu, G.: The extropy of order statistics and record values. Stat. Probab. Lett. 120, 52–60 (2017)

  14. Qiu, G., Jia, K.: The residual extropy of order statistics. Stat. Probab. Lett. 133, 15–22 (2018)

  15. Raqab, M.Z., Qiu, G.: On extropy properties of ranked set sampling. Statistics 53(1), 1–15 (2018)

  16. Rényi, A.: On measures of entropy and information. Tech. rep., Hungarian Academy of Sciences, Budapest, Hungary (1961)

  17. Sadek, S., Al-Hamadi, A., Michaelis, B., Sayed, U.: An efficient method for noisy cell image segmentation using generalized \(\alpha \)-entropy. In: Signal Processing, Image Processing and Pattern Recognition, pp. 33–40 (2009)

  18. Sati, M.M., Gupta, N.: On partial monotonic behaviour of Varma entropy and its application in coding theory. J. Indian Stat. Assoc. (2015)

  19. Shangari, D., Chen, J.: Partial monotonicity of entropy measures. Stat. Probab. Lett. 82(11), 1935–1940 (2012)

  20. Shannon, C.E.: A mathematical theory of communication. Bell Syst. Tech. J. 27, 379–423 (1948)

  21. Sunoj, S., Sankaran, P., Maya, S.: Characterizations of life distributions using conditional expectations of doubly (interval) truncated random variables. Commun. Stat.-Theory Methods 38(9), 1441–1452 (2009)

  22. Tsallis, C.: Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 52, 479–487 (1988)

  23. Tuli, R.: Mean codeword lengths and their correspondence with entropy measures. Int. J. Eng. Nat. Sci. 4, 175–180 (2010)

  24. Varma, R.: Generalizations of Rényi's entropy of order \(\alpha \). J. Math. Sci. 1, 34–48 (1966)

  25. Xia, W.: Partial monotonicity of entropy revisited. Stat. Probab. Lett. 145, 248–253 (2019)

  26. Yeung, R.W.: A First Course in Information Theory. Springer, Berlin (2012)


Author information

Corresponding author

Correspondence to Shilpa Bansal.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Bansal, S., Gupta, N. (2020). On Partial Monotonic Behaviour of Past Entropy and Convolution of Extropy. In: Castillo, O., Jana, D., Giri, D., Ahmed, A. (eds) Recent Advances in Intelligent Information Systems and Applied Mathematics. ICITAM 2019. Studies in Computational Intelligence, vol 863. Springer, Cham. https://doi.org/10.1007/978-3-030-34152-7_16
