Abstract
Shannon [20] introduced a measure of uncertainty that plays a central role in communication theory. Since the Shannon entropy was proposed, many other entropies have been introduced and found useful in different areas. Recently, Frank et al. [6] proposed a complementary dual of the Shannon entropy, called 'extropy'. The uncertainty associated with a random experiment is expected to decrease as the interval known to contain the outcome shrinks. However, this is not true in general for entropies of absolutely continuous random variables. In the present paper we define the conditional cumulative past entropy and give conditions necessary for its partial monotonic behaviour. Further, a result on the convolution of extropy is presented.
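The two quantities contrasted in the abstract can be made concrete with a small numeric sketch. The code below (function names and the mixture counterexample are illustrative choices, not from the paper) computes the discrete Shannon entropy and the extropy of Frank et al. [6], J(p) = -Σ (1 - p_i) ln(1 - p_i), and then evaluates, in closed form, the differential entropy of a two-piece uniform mixture to show that conditioning on a smaller interval can *increase* differential entropy, as the abstract claims.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i ln p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def extropy(p):
    """Extropy of Frank et al. [6]: J(p) = -sum_i (1 - p_i) ln(1 - p_i)."""
    return -sum((1 - pi) * math.log(1 - pi) for pi in p if pi < 1)

# For any two-point distribution the two measures coincide,
# which is one sense in which extropy is a "complementary dual".
p = (0.3, 0.7)
print(shannon_entropy(p), extropy(p))  # equal for n = 2

# Counterexample sketch for absolutely continuous X: let the density f
# put mass w uniformly on the tiny interval (0, eps) and mass 1 - w
# uniformly on (1, 2).  The differential entropy is computed exactly:
eps, w = 1e-3, 0.9
h_full = -w * math.log(w / eps) - (1 - w) * math.log(1 - w)

# Conditioned on X lying in the *smaller* interval (1, 2), X is
# uniform on a unit-length interval, so its differential entropy is 0.
h_cond = 0.0
print(h_full, h_cond)  # h_full is strongly negative, h_cond = 0
```

Here h_cond > h_full: learning that the outcome lies in the smaller interval (1, 2) raises the differential entropy, because it rules out the sharp spike near 0. This is the phenomenon that motivates studying partial monotonicity for cumulative (rather than differential) entropies.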
References
Ash, R.B.: Information Theory. Dover Publications Inc., New York (1990)
Bagnoli, M., Bergstrom, T.: Log-concave probability and its applications. Econ. Theory 26(2), 445–469 (2005)
Chen, J.: A partial order on uncertainty and information. J. Theor. Probab. 26(2), 349–359 (2013)
Cover, T.M., Thomas, J.A.: Elements of Information Theory. Wiley, Hoboken (2012)
Di Crescenzo, A., Longobardi, M.: On cumulative entropies. J. Stat. Plann. Infer. 139(12), 4072–4087 (2009)
Frank, L., Sanfilippo, G., Agro, G.: Extropy: complementary dual of entropy. Stat. Sci. 30(1), 40–58 (2015)
Gupta, N., Bajaj, R.K.: On partial monotonic behaviour of some entropy measures. Stat. Probab. Lett. 83(5), 1330–1338 (2013)
Hooda, D.: A coding theorem on generalized r-norm entropy. Korean J. Comput. Appl. Math. 8(3), 657–664 (2001)
Jose, J., Abdul Sathar, E.: Residual extropy of k-record values. Stat. Probab. Lett. 146, 1–6 (2019)
Kapur, J.N.: Generalized entropy of order \(\alpha \) and type \(\beta \). In: The Math. Seminar, vol. 4, pp. 78–82 (1967)
Kovačević, M., Stanojević, I., Šenk, V.: Some properties of Rényi entropy over countably infinite alphabets. Probl. Inf. Transm. 49, 99–110 (2013)
Maszczyk, T., Duch, W.: Comparison of Shannon, Renyi and Tsallis entropy used in decision trees. In: International Conference on Artificial Intelligence and Soft Computing, vol. 5097, pp. 643–651 (2008)
Qiu, G.: The extropy of order statistics and record values. Stat. Probab. Lett. 120, 52–60 (2017)
Qiu, G., Jia, K.: The residual extropy of order statistics. Stat. Probab. Lett. 133, 15–22 (2018)
Raqab, M.Z., Qiu, G.: On extropy properties of ranked set sampling. Statistics 53(1), 1–15 (2018)
Rényi, A.: On measures of entropy and information. Tech. rep., Hungarian Academy of Sciences, Budapest, Hungary (1961)
Sadek, S., Al-Hamadi, A., Michaelis, B., Sayed, U.: An efficient method for noisy cell image segmentation using generalized \(\alpha \)-entropy. In: Signal Processing, Image Processing and Pattern Recognition, pp. 33–40 (2009)
Sati, M.M., Gupta, N.: On partial monotonic behaviour of Varma entropy and its application in coding theory. J. Indian Stat. Assoc. (2015)
Shangari, D., Chen, J.: Partial monotonicity of entropy measures. Stat. Probab. Lett. 82(11), 1935–1940 (2012)
Shannon, C.E.: A mathematical theory of communication. Bell Syst. Tech. J. 27, 379–423 (1948)
Sunoj, S., Sankaran, P., Maya, S.: Characterizations of life distributions using conditional expectations of doubly (interval) truncated random variables. Commun. Stat.-Theory Methods 38(9), 1441–1452 (2009)
Tsallis, C.: Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 52, 479–487 (1988)
Tuli, R.: Mean codeword lengths and their correspondence with entropy measures. Int. J. Eng. Nat. Sci. 4, 175–180 (2010)
Varma, R.: Generalizations of Renyi’s entropy of order \(\alpha \). J. Math. Sci. 1, 34–48 (1966)
Xia, W.: Partial monotonicity of entropy revisited. Stat. Probab. Lett. 145, 248–253 (2019)
Yeung, R.W.: A First Course in Information Theory. Springer, Berlin (2012)
© 2020 Springer Nature Switzerland AG
Cite this paper
Bansal, S., Gupta, N. (2020). On Partial Monotonic Behaviour of Past Entropy and Convolution of Extropy. In: Castillo, O., Jana, D., Giri, D., Ahmed, A. (eds) Recent Advances in Intelligent Information Systems and Applied Mathematics. ICITAM 2019. Studies in Computational Intelligence, vol 863. Springer, Cham. https://doi.org/10.1007/978-3-030-34152-7_16
Print ISBN: 978-3-030-34151-0
Online ISBN: 978-3-030-34152-7
eBook Packages: Intelligent Technologies and Robotics