Residual and Past Entropy in Actuarial Science and Survival Models

  • Athanasios Sachlas
  • Takis Papaioannou
Article

Abstract

The best policy for an insurance company is one that lasts for a long period of time and carries little uncertainty with respect to its claims. In information theory, entropy is a measure of the uncertainty associated with a random variable; it is a descriptive quantity, belonging to the class of measures of variability alongside the variance and the standard deviation. The purpose of this paper is to investigate the effect of inflation, of truncation or censoring from below (use of a deductible), and of truncation or censoring from above (use of a policy limit) on the entropy of losses of insurance policies. Losses are differentiated between per-payment and per-loss (franchise deductible). In this context we study the properties of the resulting entropies, such as the residual loss entropy and the past loss entropy, which result from the use of a deductible and a policy limit, respectively. Interesting relationships between these entropies are presented, and the combined effect of a deductible and a policy limit is also studied. We also investigate residual and past entropies for survival models. Finally, an application is presented involving the well-known Danish data set on fire losses.
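The residual loss entropy mentioned above is the entropy of the loss beyond a deductible t, i.e. of the conditional density f(x)/S(t) for x > t, where S is the survival function. As a minimal numerical sketch (not code from the paper), the following computes this quantity for an exponential loss, whose memoryless property makes the residual entropy coincide with the entropy of the full distribution, 1 - log(lam); the function name and integration settings are illustrative choices.

```python
import math

def residual_entropy_exp(lam, t, upper=50.0, n=200000):
    """Residual entropy H(X; t) = -integral_t^inf (f/S(t)) log(f/S(t)) dx
    for X ~ Exp(lam), approximated by the midpoint rule on [t, upper]."""
    S_t = math.exp(-lam * t)                 # survival function S(t)
    h, total = (upper - t) / n, 0.0
    for i in range(n):
        x = t + (i + 0.5) * h                # midpoint of subinterval
        g = lam * math.exp(-lam * x) / S_t   # density of the residual loss
        total -= g * math.log(g) * h
    return total

lam = 2.0
# By memorylessness, the residual entropy past any deductible equals
# the entropy of the whole Exp(lam) distribution, 1 - log(lam).
print(residual_entropy_exp(lam, t=1.0))
print(1 - math.log(lam))
```

For a heavy-tailed loss model such as the Pareto (more realistic for the Danish fire data), the residual entropy does depend on the deductible, which is what the paper's comparisons exploit.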

Keywords

Entropy · Loss distributions · Truncation and censoring · Residual and past entropy · Proportional hazards · Proportional reversed hazards · Frailty models

AMS 2000 Subject Classifications

62B10 · 62P05

Copyright information

© Springer Science+Business Media, LLC 2012

Authors and Affiliations

  1. Department of Statistics & Insurance Science, University of Piraeus, Piraeus, Greece