Abstract
The Keyfitz entropy index is an indicator that measures the sensitivity of life expectancy to a change in the mortality rate. Understanding the characteristics of this indicator can significantly aid life table studies in survival analysis. In this paper, we take a closer look at some mathematical properties of the Keyfitz entropy index. First, through theoretical study we show that in some cases this index belongs to the interval [0, 1], while in other cases it is greater than 1. We also provide two inequalities for Keyfitz entropy in terms of Shannon entropy and the pth central moments of random variables. We then present an empirical value for the index; this value provides the researcher with initial information about the Keyfitz entropy, especially before the population survival function is estimated with common parametric and nonparametric methods. Second, we propose a new nonparametric method, based on information theory, for estimating the survival function in the life table; it applies existing information about the population, such as the mean and higher moments. The survival function estimated by this method attains the maximum value of the Keyfitz entropy, indicating maximum sensitivity of life expectancy to changes in age-specific mortality rates. We also demonstrate that the survival function estimated by this method can be a powerful competitor to counterparts estimated by common parametric and nonparametric methods.
References
Ananda, M.M., Dalpatadu, R.J., Singh, A.K.: Estimating parameters of the force of mortality in actuarial studies. Actuar. Res. Clear. House 1, 129–141 (1993)
Baudisch, A.: The pace and shape of ageing. Methods Ecol. Evol. 2, 375–382 (2011)
Bretschneider, C.A.: Theoriae logarithmi integralis lineamenta nova. Crelle’s J. 17, 257–285 (1837). (in Latin)
Brocket, P.: Information theoretic approach to actuarial science: a unification and extension of relevant theory and applications. Trans. Soc. Actuar. 43, 73–114 (1991)
Bulinski, A., Dimitrov, D.: Statistical estimation of the Shannon entropy. Acta Math. Sin. Engl. Ser. 35(1), 17–46 (2019)
Carriere, J.F.: Parametric models for life tables. Trans. Soc. Actuar. 44, 77–99 (1992)
Chakraborti, S., Jardim, F., Epprecht, E.: Higher order moments using the survival function: the alternative expectation formula. Am. Stat. 73, 191–194 (2017)
Ciavolino, E., Dahlgaard, J.J.: Simultaneous equation model based on the generalized maximum entropy for studying the effect of management factors on enterprise performance. J. Appl. Stat. 36(7), 801–815 (2009)
Colchero, F., Rau, R., Jones, O.R., Barthold, J.A., Conde, D.A., Lenart, A., Nemeth, L., Scheuerlein, A., Schoeley, J., Torres, C., Zarulli, V., Altmann, J., Brockman, D.K., Bronikowski, A.M., Fedigan, L.M., Pusey, A.E., Stoinski, T.S., Strier, K.B., Baudisch, A., Alberts, S.C., Vaupel, J.W.: The emergence of longevous populations. PNAS 113(48), E7681–E7690 (2016)
Cover, T.M., Thomas, J.A.: Elements of Information Theory, 2nd edn. Wiley, New York (2006)
Demetrius, L.: Demographic parameters and natural selection. Proc. Natl. Acad. Sci. U.S.A. 71, 4645–4647 (1974)
Demetrius, L.: Natural selection and age-structured populations. Genetics 79, 535–544 (1975)
Demetrius, L.: Measures of variability in age-structured populations. J. Theor. Biol 63, 397–404 (1976)
Demetrius, L.: Adaptive value, entropy and survivorship curves. Nature 275, 213–214 (1978)
Demetrius, L.: Relations between demographic parameters. Demography 16, 329–338 (1979)
Ding, Y.S., Zhang, T.L., Gu, Q., Zhao, P.Y., Chou, K.C.: Using maximum entropy model to predict protein secondary structure with single sequence. Protein Pept. Lett. 16(5), 552–560 (2009)
Mohammad-Djafari, A.: A Matlab program to calculate the maximum entropy distributions. In: Maximum Entropy and Bayesian Methods. Springer, Dordrecht, pp. 221–233 (1992)
Fernandez, O.E., Beltran-Sanchez, H.: The entropy of the life table: a reappraisal. Theor. Popul. Biol. 104, 26–45 (2015)
Forte, B., Hughes, W.: The maximum entropy principle: a tool to define new entropies. Rep. Math. Phys. 26(2), 227–235 (1988)
Goldman, N., Lord, G.: A new look at entropy and the life table. Demography 23, 275–282 (1986)
Guure, C.B., Ibrahim, N.A., Adam, M.B., Bosomprah, S., Ahmed, A.O.: Bayesian parameter and reliability estimate of Weibull failure time distribution. Bull. Malays. Math. Sci. Soc. 2(14), 611–632 (2014)
Heric, D., Zazula, D.: Reconstruction of object contours using directional wavelet transform. WSEAS Trans. Comput. 4(10), 1305–1312 (2005)
Hill, G.: The entropy of the survival curve: an alternative measure. Can. Stud. Popul. 20(1), 43–57 (1993)
Jankélévitch, V.: La mort. Flammarion, Paris (1977)
Jaynes, E.T.: Information theory and statistical mechanics. Phys. Rev. 106, 620–630 (1957)
Keyfitz, N., Caswell, H.: Applied Mathematical Demography. Springer, New York (2005)
Keyfitz, N.: What difference would it make if cancer were eradicated? An examination of the Taeuber paradox. Demography 14, 411–418 (1977)
Khodabin, M., Ahmadabadi, A.: Some properties of generalized gamma distribution. J. Sci. (Islam. Azad Univ.) 4(1), 9–28 (2010)
Liu, J.: Information theoretic content and probability. Ph.D. thesis, University of Florida, USA (2007)
Meyer, P., Ponthière, G.: Human lifetime entropy in a historical perspective (1750–2014). Cliometrica 14, 1–39 (2019)
Noorollahi, T.: Making Annual Life Table for Iran. The Statistical Center of Iran, Tehran (2013)
Noyer, A., Coleman, C.: A Universal Pattern of the Evolution of Life Table Entropy and Life Expectancy. Mimeo, Huntingdon (2013)
Pasha, E.A., Khodabin, M., Mohtashami, B.G.: Entropy in exponential families. J. Sci. (Islam. Azad Univ.) 16, 1–9 (2006)
Pierce, J.: An Introduction to Information Theory. Symbols, Signals and Noise. Dover, London (1980)
Preston, S.H., Heuveline, P., Guillot, M.: Demography: Measuring and Modeling Population Processes. Blackwell, Oxford (2000)
Rao, M.: More on a new concept of entropy and information. J. Theor. Probab. 18, 967–981 (2005)
Rao, M., Chen, Y., Vemuri, B.C.: Cumulative residual entropy: a new measure of information. IEEE Trans. Inf. Theory 50(6), 1220–1228 (2004)
Rowland, D.T.: Demographic Methods and Concepts. Oxford University Press, Oxford (2003)
Sahragard, H.P., Ajorlo, M.: A comparison of logistic regression and maximum entropy for distribution modeling of range plant species (a case study in rangelands of western Taftan, southeastern Iran). Turk. J. Bot. 42(1), 28–37 (2018)
Smith, C.R., Grandy Jr., W.T.: Maximum-Entropy and Bayesian Methods in Inverse Problems. Springer, Berlin (2013)
Singapore Department of Statistics: Complete Life Tables 2003–2006 for Singapore Resident Population. Singapore Department of Statistics, Singapore (2008)
Shannon, C.: A mathematical theory of communication. Bell Syst. Tech. J. 27(3), 379–423 (1948)
Teh, C.S., Lim, C.P.: A probabilistic SOM-KMER model for intelligent data analysis. WSEAS Trans. Syst. 5(4), 825–832 (2006)
Tuba, M.: Maximum entropy method and underdetermined systems applied to computer network topology and routing. In: Proceedings of the 9th WSEAS International Conference on Applied Informatics and Communications, pp. 127–132. WSEAS (2009)
United Nations: Demographic Yearbook. Tech. Rep. United Nations, Statistical Office, New York, United States (2012)
Vaupel, J.W., Canudas Romo, V.: Decomposing change in life expectancy: a bouquet of formulas in honor of Nathan Keyfitz’s 90th birthday. Demography 40(2), 201–216 (2003)
Vaupel, J.W.: How change in age-specific mortality affects life expectancy. Popul. Stud. 40, 147–157 (1986)
Vaupel, J.W., Zhang, Z., van Raalte, A.: Life expectancy and disparity: an international comparison of life table data. BMJ Open 1, e000128 (2011)
Wang, F., Vemuri, B.C., Rao, M., Chen, Y.: Cumulative residual entropy, a new measure of information and its application to image alignment. In: Proceedings of the Ninth IEEE International Conference on Computer Vision (ICCV), vol. 2 (2003)
Wiener, N.: Cybernetics or Control and Communication in the Animal and the Machine. MIT Press, Cambridge (1965)
Wrycza, T.F.: Entropy of the Gompertz–Makeham mortality model. Demogr. Res. 30, 1397–1404 (2014). https://doi.org/10.4054/DemRes.2014.30.49
Wrycza, T.F., Baudisch, A.: How life-expectancy varies with perturbations in age-specific mortality. Demogr. Res. 27(13), 365–376 (2012). https://doi.org/10.4054/DemRes.2012.27.13
Yari, Gh, Mirhabibi, A., Saghafi, A.: Estimation of the Weibull parameters by Kullback–Leibler divergence of survival functions. Appl. Math. Inf. Sci. 7(1), 187–192 (2013)
Zhang, Z., Vaupel, J.W.: The threshold between compression and expansion of mortality. Paper presented at the Population Association of America Annual Meeting (2008)
Zhang, Z.: The age separating early deaths from late deaths. Demogr. Res. 20, 721–730 (2009)
Zukang, Z.: Limit theorems for the ratio of the Kaplan–Meier estimator or the Altshuler estimator to the true survival function. Acta Math. Sin. Engl. Ser. 10(4), 337–347 (1994)
Appendices
Appendix 1
In this appendix, we review some of the indicators that have been proposed in studies of life table uncertainty (drawing on [30]).
Among the indicators of uncertainty associated with survival studies are the following:
Wiener’s entropy index (1965) Wiener’s entropy index measures the amount of information revealed by the occurrence of a single event whose probability is known [50]. When a person of age k dies, and the probability of death at age k (conditional on reaching age k) is \(d_{k}\), the amount of information learnt from the event “death at age k” is given by Wiener’s entropy \(W(d_{k})=-\log _{2}(d_{k})\).
Hill’s entropy index (1993) Hill’s entropy of the age at death is defined as \({\mathrm{HI}}_{k}=-\sum _{i=k}^{\omega }p_{i,k}{\mathrm{ln}}(p_{i,k})\) ([23]).
Shannon’s lifetime entropy index (Meyer’s entropy) (2019) Life tables allowed humans to shift from uncertainty to risk about the duration of life. Meyer and Ponthière develop an indicator of risk about the duration of life whose metric has a concrete counterpart for the layman and makes that risk commensurable with the risk involved in more common situations (see [30]). They propose to measure risk about the duration of life by means of Shannon’s lifetime entropy index, defined to the base 2 as \(H_{\mathrm{M}}=-\sum _{i=k}^{\omega }p_{i,k}\log _{2} p_{i,k},\) where \(p_{i,k}\) is the probability of a life of (remaining) duration i for an individual of age k [34, 42]. \(H_{\mathrm{M}}\) measures the mathematical expectation, along the life cycle, of the amount of information learnt from the event “death at a particular age \(i \ge k\).” This index quantifies the risk relative to the duration of life (or, equivalently, the risk about the age at death) in terms of bits. Therefore, \(H_{\mathrm{M}}={\mathrm{HI}}_{k}\log _{2}e\).
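The three indices above can be computed directly from a distribution of ages at death. The following Python sketch uses a hypothetical five-point distribution (the probabilities are illustrative, not taken from any life table) to evaluate Wiener’s, Hill’s, and Meyer’s entropies and to check the change-of-base identity \(H_{\mathrm{M}}={\mathrm{HI}}_{k}\log _{2}e\):

```python
import math

# Hypothetical distribution of remaining lifetimes for an individual of age k:
# p[i] = probability of dying i years from now (illustrative values only).
p = [0.1, 0.2, 0.4, 0.2, 0.1]
assert abs(sum(p) - 1.0) < 1e-12

def wiener(d_k):
    # Wiener's entropy of a single event with known probability d_k (in bits).
    return -math.log2(d_k)

# Hill's entropy (natural log) and Meyer's Shannon lifetime entropy (base 2).
hill = -sum(pi * math.log(pi) for pi in p)
meyer = -sum(pi * math.log2(pi) for pi in p)

# The two differ only by a change of base: H_M = HI_k * log2(e).
assert abs(meyer - hill * math.log2(math.e)) < 1e-12
```

The base merely fixes the unit of information (bits versus nats); the ranking of populations by lifetime risk is the same under either index.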
Appendix 2
In the following, we indicate how the entropy of the life table is formed. For this purpose, let \(\mu (x)=\frac{f(x)}{S(x)}\) be the force of mortality at age x, where f(x) is the density function and S(x) is the survival function of the age at death. From this ratio and the properties of f(x) and S(x), the probability of surviving from birth to age x can also be expressed as \(S_{x}[\mu (s)]={\mathrm{exp}}(-\int _{0}^{x}\mu (s){\mathrm{d}}s)\). With S(x) in this form, the remaining life expectancy at age x is obtained as \(e_{x}[\mu (s)]=\int _{x}^{\infty }{\mathrm{exp}}(-\int _{0}^{a}\mu (s){\mathrm{d}}s){\mathrm{d}}a\), and \(e_{0}[\mu (s)]\) is the life expectancy at birth (see [18] for details). Now, following Keyfitz, consider a relative increase \(\epsilon >0\) in \(\mu\) at all ages [27], so that the new mortality function is \((1+\epsilon )\mu (s)\) (i.e., \(\frac{\delta \mu }{\mu }=\epsilon\)). The new probability of surviving from birth to age x, the new life expectancy at age x, and the new life expectancy at birth are, respectively, \(S_{x}[(1+\epsilon )\mu (s)]=(S_{x}[\mu (s)])^{1+\epsilon }\), \(e_{x}[(1+\epsilon )\mu (s)]=\int _{x}^{\infty }S(a)^{1+\epsilon }{\mathrm{d}}a\), and \(e_{0}[(1+\epsilon )\mu (s)]=\int _{0}^{\infty }S(a)^{1+\epsilon }{\mathrm{d}}a\). Keyfitz calculated \(\frac{\mathrm{d}e_{0}}{\mathrm{d}\epsilon }\rfloor _{\epsilon =0}\) [26] because a relative increase in mortality was expected to result in a relative reduction in life expectancy; taking \(\epsilon\) finite but small yields the following approximation:
Because \(0 \le S(x) \le 1\), the expression in parentheses is negative. Accordingly, the negative of that expression is known as the entropy of the life table and is customarily denoted by \(H_{\mathrm{K}}\).
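This derivation can be checked numerically. The sketch below evaluates the standard expression \(H_{\mathrm{K}}=-\int _{0}^{\infty }S(a)\ln S(a)\,{\mathrm{d}}a / \int _{0}^{\infty }S(a)\,{\mathrm{d}}a\) for an exponential survival curve (our illustrative choice, for which \(H_{\mathrm{K}}=1\) in closed form) and verifies by finite differences that \(\frac{\mathrm{d}e_{0}}{\mathrm{d}\epsilon }\rfloor _{\epsilon =0}=-H_{\mathrm{K}}\,e_{0}\); the rate, truncation point, and step sizes are our assumptions:

```python
import math

# Exponential survival curve S(a) = exp(-lam * a): constant force of mortality,
# chosen only because H_K = 1 exactly for this case.
lam = 0.05
S = lambda a: math.exp(-lam * a)

def integrate(f, a, b, n=100000):
    # Composite trapezoidal rule on [a, b].
    h = (b - a) / n
    return h * (0.5 * f(a) + 0.5 * f(b) + sum(f(a + i * h) for i in range(1, n)))

B = 400  # truncation point; S(400) = exp(-20) is negligible
e0 = integrate(S, 0, B)                                   # life expectancy at birth
e_dagger = -integrate(lambda a: S(a) * math.log(S(a)), 0, B)
H_K = e_dagger / e0                                       # Keyfitz entropy

# Finite-difference check: de0/deps at eps = 0 should equal -H_K * e0,
# using e0(eps) = ∫ S(a)^(1+eps) da from the perturbed mortality (1+eps)*mu.
eps = 1e-4
e0_eps = integrate(lambda a: S(a) ** (1 + eps), 0, B)
deriv = (e0_eps - e0) / eps
```

For this curve \(e_{0}=1/\lambda =20\) and the derivative is \(-H_{\mathrm{K}}e_{0}=-20\): a one-percent proportional rise in mortality at all ages lowers life expectancy by about one percent, which is exactly the sensitivity that \(H_{\mathrm{K}}\) quantifies.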
Appendix 3: discrete approximations
The following approximation formulas can be used to calculate \(e_{0}\) and \(e^{\dagger }\) [18]:
where l(0, t), L(x, t), d(x, t), and e(x, t) denote the following life table values at age x and time t: the radix at age 0, person-years lived, deaths, and life expectancy, respectively. We also have:
Also, Brocket has shown that \(H_{\mathrm{Sh}}=E(-{\mathrm{ln}}f_{X}(X)) \cong -\sum _{x=0}^{\omega }q(x){\mathrm{ln}} q(x)\) [4].
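As an illustration of these discrete formulas, the sketch below builds the age-at-death distribution q(x) from a small hypothetical life table (the l(x) column is invented for illustration) and evaluates Brocket’s approximation to \(H_{\mathrm{Sh}}\), together with \(e_{0}\) computed from person-years lived under the standard mid-interval assumption \(L(x)\approx (l(x)+l(x+1))/2\):

```python
import math

# Hypothetical abridged life table: l[x] = survivors at exact age x out of a
# radix l[0] (illustrative numbers, not from any published table).
l = [100000, 98000, 97500, 95000, 88000, 70000, 40000, 10000, 0]
radix = l[0]

# Distribution of ages at death: q(x) = d(x) / l(0), with d(x) = l(x) - l(x+1).
q = [(l[x] - l[x + 1]) / radix for x in range(len(l) - 1)]
assert abs(sum(q) - 1.0) < 1e-12

# Discrete Shannon entropy of the age-at-death distribution,
# H_Sh ≅ -Σ q(x) ln q(x), skipping empty age classes.
H_Sh = -sum(qx * math.log(qx) for qx in q if qx > 0)

# Life expectancy at birth from person-years lived, assuming deaths occur on
# average at mid-interval: L(x) ≈ (l(x) + l(x+1)) / 2.
L = [(l[x] + l[x + 1]) / 2 for x in range(len(l) - 1)]
e0 = sum(L) / radix
```

The entropy is bounded above by \(\ln \omega\) for \(\omega\) age classes, attained only when deaths are spread uniformly over ages; a strongly rectangularized survival curve concentrates q(x) and pushes \(H_{\mathrm{Sh}}\) well below that bound.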
Appendix 4
The principle of maximum entropy (ME) Among all probability densities f satisfying \(\int f(x){\mathrm{d}}x=1\) together with the moment conditions \(\varLambda _{i}=\int _{s} f(x) x^{i}{\mathrm{d}}x,\ 1\le i<m\), the density with maximum entropy is found by the method of Lagrange multipliers: form the functional \(J(f)=-\int f(x){\mathrm{ln}}f(x)+\varLambda _{0}\int f(x)+\sum _{i=1}^{m}\varLambda _{i}\int f(x)x^{i}\), differentiate J(f) with respect to f(x), and set the result equal to zero. This yields the unique maximizer \(f(x)={\mathrm{exp}}(\varLambda _{0}-1+\sum _{i=1}^{m}\varLambda _{i}x^{i})\) [19].
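A minimal numerical sketch of the ME principle with a single moment constraint (m = 1): on a discrete grid the maximizer has the form \(f(x)\propto \exp (\varLambda _{1}x)\), a (truncated, discretized) exponential, and \(\varLambda _{1}\) can be found by bisection so that the mean constraint holds. The grid, target mean, and bracketing interval below are our illustrative choices:

```python
import math

# Maximum-entropy density on a grid, subject only to a fixed mean.
# By the ME principle, f(x) = exp(Λ0 - 1 + Λ1*x); Λ0 follows from
# normalization, and Λ1 is tuned so that the mean matches the constraint.
xs = [i * 0.01 for i in range(10001)]   # grid on [0, 100]
target_mean = 20.0

def mean_for(lam1):
    # Mean of the normalized density f(x) ∝ exp(lam1 * x) on the grid.
    w = [math.exp(lam1 * x) for x in xs]
    z = sum(w)
    return sum(wi * x for wi, x in zip(w, xs)) / z

# mean_for is increasing in lam1, so bisection on the bracket [-1, 0) works.
lo, hi = -1.0, -1e-6
for _ in range(100):
    mid = (lo + hi) / 2
    if mean_for(mid) < target_mean:
        lo = mid          # mean too small: move lam1 toward zero
    else:
        hi = mid
lam1 = (lo + hi) / 2      # near -1/target_mean = -0.05 (truncation shifts it slightly)
```

With additional moment constraints the same scheme generalizes, but the single bisection must be replaced by a multivariate solve for \((\varLambda _{1},\ldots ,\varLambda _{m})\); this is the construction the paper’s survival-function estimator builds on.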
Cite this article
Rezaei, R., Yari, G. Keyfitz entropy: investigating some mathematical properties and its application for estimating survival function in life table. Math Sci 15, 229–240 (2021). https://doi.org/10.1007/s40096-020-00354-5