The best policy for an insurance company is one that lasts for a long period of time and carries little uncertainty in its claims. In information theory, entropy is a measure of the uncertainty associated with a random variable; it is a descriptive quantity, belonging to the class of measures of variability alongside the variance and the standard deviation. The purpose of this paper is to investigate the effect of inflation, of truncation or censoring from below (use of a deductible), and of truncation or censoring from above (use of a policy limit) on the entropy of losses of insurance policies. Losses are differentiated between the per-payment and per-loss (franchise deductible) bases. In this context we study the properties of the resulting entropies, such as the residual loss entropy and the past loss entropy, which arise from the use of a deductible and of a policy limit, respectively. Interesting relationships between these entropies are presented, and the combined effect of a deductible and a policy limit is also studied. We further investigate residual and past entropies for survival models. Finally, an application is presented involving the well-known Danish data set on fire losses.
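The entropy and the residual entropy induced by a deductible can be sketched numerically. The following is a minimal illustration, not the paper's method: the exponential loss model, the rate, the deductible value, and the helper names are all assumptions made for the example. For an exponential loss the residual entropy past any deductible equals the full entropy, by memorylessness, which gives a convenient check.

```python
import math

def diff_entropy(pdf, lo, hi, n=200000):
    """Midpoint-rule approximation of the differential entropy -∫ f(x) log f(x) dx."""
    h = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * h
        f = pdf(x)
        if f > 0.0:
            total -= f * math.log(f) * h
    return total

# Illustrative exponential loss X with rate lam; closed-form entropy is 1 - log(lam).
lam = 0.5
pdf = lambda x: lam * math.exp(-lam * x)
H = diff_entropy(pdf, 0.0, 60.0)  # upper limit 60 makes the truncated tail negligible

# Residual loss entropy past a deductible d: entropy of the conditional
# density f(x) / S(d) on (d, ∞), where S is the survival function.
d = 2.0
S_d = math.exp(-lam * d)
H_res = diff_entropy(lambda x: pdf(x) / S_d, d, d + 60.0)

print(H, H_res)  # both close to 1 - log(0.5) ≈ 1.6931
```

The same numerical scheme applies to the past loss entropy under a policy limit u, by integrating the conditional density f(x)/F(u) over (0, u) instead.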
Keywords: Entropy · Loss distributions · Truncation and censoring · Residual and past entropy · Proportional hazards · Proportional reversed hazards · Frailty models