
Bayesian Weighted Information Measures

  • Salimeh Yasaei Sekeh
Conference paper
Part of the Springer Proceedings in Mathematics & Statistics book series (PROMS, volume 118)

Abstract

Following Ebrahimi et al. (J Stat Res Iran 3:113–137, 2006), we study weighted information measures in the univariate case. In particular, we address the comparison of models based on information measures, focusing on the Kullback–Leibler discrimination measure. The main result establishes the relationship between the weighted mutual information measure and weighted entropy. The role of the Weibull distribution family in weighted Kullback–Leibler information and in ordinary Kullback–Leibler information is examined in detail, which is useful for comparing models. As a notable application of the result, we study normal distributions, which illustrate the motivation for the approach.
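
To make the weighted discrimination measure concrete, here is a minimal numerical sketch, assuming the Belis–Guiasu-style length-biased weight w(x) = x (this choice of weight and the Weibull parameters below are illustrative assumptions, not the chapter's exact notation). It evaluates the weighted Kullback–Leibler divergence

    KL^w(f || g) = ∫ w(x) f(x) log( f(x) / g(x) ) dx

by quadrature for two Weibull densities, alongside the ordinary (unweighted) divergence obtained with w(x) = 1:

    # Minimal sketch: weighted Kullback-Leibler divergence by numerical
    # integration. The weight w(x) = x and the Weibull shape/scale
    # parameters are illustrative assumptions, not taken from the paper.
    import numpy as np
    from scipy.integrate import quad
    from scipy.stats import weibull_min

    def weighted_kl(f, g, w, lower=0.0, upper=np.inf):
        """Integrate w(x) f(x) log(f(x)/g(x)) over [lower, upper]."""
        def integrand(x):
            fx, gx = f(x), g(x)
            if fx <= 0.0:          # 0 * log(0/g) contributes nothing
                return 0.0
            return w(x) * fx * np.log(fx / gx)
        value, _abs_err = quad(integrand, lower, upper)
        return value

    f = weibull_min(c=1.5, scale=1.0).pdf   # hypothetical "true" model
    g = weibull_min(c=2.5, scale=1.2).pdf   # hypothetical alternative

    print("KL^w(f||g), w(x)=x:", weighted_kl(f, g, w=lambda x: x))
    print("KL(f||g),   w(x)=1:", weighted_kl(f, g, w=lambda x: 1.0))

Setting w(x) = 1 recovers the ordinary Kullback–Leibler measure, so the same routine supports the kind of weighted-versus-unweighted model comparison described above.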

Keywords

Mutual information · Weibull distribution · Information measure · Conditional entropy · Kullback–Leibler divergence

Notes

Acknowledgments

The author is grateful to the referees for their useful comments, and to Professor Adriano Polpo for his valuable comments that helped improve this chapter.

References

  1. Belis, M., Guiasu, S.: A quantitative-qualitative measure of information in cybernetic systems. IEEE Trans. Inf. Theory IT-14, 593–594 (1968)
  2. Bernardo, J.M.: Expected information as expected utility. Ann. Stat. 7, 686–690 (1979)
  3. Cover, T.M., Thomas, J.A.: Elements of Information Theory. Wiley, New York (2006)
  4. Di Crescenzo, A., Longobardi, M.: On weighted residual and past entropies. Scient. Math. Jpn. 64, 255–266 (2006)
  5. Ebrahimi, N., Kirmani, S.N.U.A., Soofi, E.S.: Dynamic Bayesian information measures. J. Stat. Res. Iran 3, 113–137 (2006)
  6. Guiasu, S.: Grouping data by using the weighted entropy. J. Stat. Plan. Inference 15, 63–69 (1986)
  7. Kullback, S., Leibler, R.A.: On information and sufficiency. Ann. Math. Stat. 22, 79–86 (1951)
  8. Lindley, D.V.: On a measure of the information provided by an experiment. Ann. Math. Stat. 27, 986–1005 (1956)
  9. Retzer, J.J., Soofi, E.S., Soyer, R.: Information importance of predictors: concepts, measures, Bayesian inference, and application. Comput. Stat. Data Anal. 53, 2363–2377 (2009)
  10. Shannon, C.E.: A mathematical theory of communication. Bell Syst. Tech. J. 27, 379–423 (1948)
  11. Soofi, E.S.: Capturing the intangible concept of information. J. Am. Stat. Assoc. 89, 1243–1254 (1994)
  12. Soofi, E.S.: Principal information theoretic approaches. J. Am. Stat. Assoc. 95, 1349–1353 (2000)
  13. Zellner, A.: An Introduction to Bayesian Inference in Econometrics. Wiley, New York (1971)
  14. Zellner, A.: Maximal data information prior distributions. In: Aykac, A., Brumat, C. (eds.) New Developments in the Applications of Bayesian Methods, pp. 211–232. North-Holland, Amsterdam (1977)

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Department of Statistics, UFSCar, São Carlos, Brazil
