
A Survey of Reverse Inequalities for f-Divergence Measure in Information Theory


Abstract

In this paper we survey some discrete inequalities for the f-divergence measure in Information Theory, obtained by the use of recent reverses of the celebrated Jensen's inequality. Applications in connection with Hölder's inequality, and for particular measures such as the Kullback–Leibler divergence, the Hellinger discrimination, the χ²-distance and the variation distance, are provided as well.
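For reference, the measures named in the abstract admit the following standard formulation (a sketch added here for orientation, not part of the original abstract): for probability distributions $p=(p_1,\dots,p_n)$ and $q=(q_1,\dots,q_n)$ and a convex function $f:(0,\infty)\to\mathbb{R}$ with $f(1)=0$, the f-divergence is commonly defined by

```latex
I_f(p,q) := \sum_{i=1}^{n} q_i \, f\!\left(\frac{p_i}{q_i}\right),
```

with the particular measures above generated by $f(t)=t\ln t$ (Kullback–Leibler divergence), $f(t)=\tfrac{1}{2}\left(\sqrt{t}-1\right)^2$ (Hellinger discrimination), $f(t)=(t-1)^2$ (χ²-distance) and $f(t)=|t-1|$ (variation distance).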

Keywords

  • Convex functions
  • Jensen’s inequality
  • Reverse of Jensen’s inequality
  • Reverse of Hölder’s inequality
  • f-Divergence measure
  • Kullback–Leibler divergence measure
  • Hellinger discrimination
  • χ²-Distance
  • Variation distance
  • Grüss’ type inequality



Author information


Corresponding author

Correspondence to S. S. Dragomir.



Copyright information

© 2015 Springer International Publishing Switzerland


Cite this chapter

Dragomir, S.S. (2015). A Survey of Reverse Inequalities for f-Divergence Measure in Information Theory. In: Daras, N., Rassias, M. (eds) Computation, Cryptography, and Network Security. Springer, Cham. https://doi.org/10.1007/978-3-319-18275-9_9
