Bayesian Weighted Information Measures
Following Ebrahimi et al. (J Stat Res Iran 3:113–137, 2006), we study weighted information measures in the univariate case. In particular, we address model comparison based on information measures, focusing on the Kullback–Leibler discrimination measure. The main result establishes the relationship between the weighted mutual information measure and weighted entropy. We also examine the behavior of the Weibull distribution family under both weighted Kullback–Leibler information and ordinary Kullback–Leibler information, which is useful for model comparison. As a notable application of the result, we study normal distributions, which illustrate the motivation for these measures.
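To make the weighted discrimination measure concrete, the following is a minimal numerical sketch, assuming the common definition KL_w(f‖g) = ∫ w(x) f(x) log(f(x)/g(x)) dx (the chapter's exact normalization may differ). The function names, the grid, and the choice of weight w(x) = |x| are illustrative assumptions, not taken from the chapter; the unweighted case is checked against the closed form (μ₁ − μ₂)²/2 for two normals with equal variance.

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def weighted_kl(mu1, sigma1, mu2, sigma2, weight=None, lo=-20.0, hi=20.0, n=200001):
    """Approximate KL_w(f || g) = int w(x) f(x) log(f(x)/g(x)) dx
    for two normal densities f, g, by a Riemann sum on a fine grid.
    With weight = None (i.e. w(x) = 1) this is the classical KL divergence.
    The definition of KL_w used here is an assumption following the
    Belis-Guiasu style weighted measures, not necessarily the chapter's."""
    grid = np.linspace(lo, hi, n)
    dx = grid[1] - grid[0]
    f = normal_pdf(grid, mu1, sigma1)
    g = normal_pdf(grid, mu2, sigma2)
    w = np.ones_like(grid) if weight is None else weight(grid)
    return float(np.sum(w * f * np.log(f / g)) * dx)

# Unweighted case: for N(0,1) vs N(1,1) the closed form is (0 - 1)^2 / 2 = 0.5.
kl_plain = weighted_kl(0.0, 1.0, 1.0, 1.0)

# Illustrative weight w(x) = |x|, emphasizing observations far from the origin.
kl_weighted = weighted_kl(0.0, 1.0, 1.0, 1.0, weight=np.abs)
```

The weight function lets the discrimination measure stress regions of the support that matter for the application, which is the motivation behind comparing models through weighted rather than ordinary Kullback–Leibler information.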
Keywords: Mutual Information · Weibull Distribution · Information Measure · Conditional Entropy · Kullback–Leibler Divergence
The author is grateful to the referees for their useful comments, and to Professor Adriano Polpo for valuable comments that helped improve this chapter.
- 5. Ebrahimi, N., Kirmani, S.N.U.A., Soofi, E.S.: Dynamic Bayesian information measures. J. Stat. Res. Iran 3, 113–137 (2006)
- 14. Zellner, A.: Maximal data information prior distributions. In: Aykac, A., Brumat, C. (eds.) New Developments in the Application of Bayesian Methods, pp. 211–232. North Holland, Amsterdam (1977)