
Acta Mathematica Vietnamica, Volume 38, Issue 3, pp 429–446

Reverses of the Jensen inequality in terms of first derivative and applications

  • S. S. Dragomir

Abstract

Two new reverses of the celebrated Jensen integral inequality for convex functions are given, with applications to means, the Hölder inequality, and f-divergence measures in information theory.
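For context, the inequality being reversed is the classical Jensen integral inequality; the sketch below gives its standard textbook formulation and the quantity (the Jensen gap) that a "reverse" bounds from above. The exact hypotheses and bounds proved in the paper may differ from this generic statement.

```latex
% Jensen's integral inequality: for a probability measure \mu on \Omega,
% a \mu-integrable function g with values in [m, M], and a convex
% function \Phi : [m, M] \to \mathbb{R},
\Phi\!\left(\int_\Omega g \, d\mu\right) \le \int_\Omega \Phi(g) \, d\mu .
% A "reverse" of the inequality is an upper bound for the nonnegative
% Jensen gap
\int_\Omega \Phi(g) \, d\mu - \Phi\!\left(\int_\Omega g \, d\mu\right) \ge 0,
% obtained here (per the title) in terms of the first derivative \Phi'.
```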

Keywords

Jensen’s inequality · Measurable functions · Lebesgue integral · Divergence measures · f-Divergence

Mathematics Subject Classification (2000)

26D15 · 26D20 · 94A05


Acknowledgements

The author would like to thank the anonymous referee for carefully reading the manuscript and providing suggestions that have been implemented in the final version of the paper.


Copyright information

© Institute of Mathematics, Vietnam Academy of Science and Technology (VAST) and Springer Science+Business Media Singapore 2013

Authors and Affiliations

  1. Mathematics, School of Engineering & Science, Victoria University, Melbourne, Australia
  2. School of Computational & Applied Mathematics, University of the Witwatersrand, Johannesburg, South Africa
