Abstract
In this paper we survey discrete inequalities for the f-divergence measure in Information Theory, obtained by means of recent reverses of the celebrated Jensen inequality. Applications in connection with Hölder's inequality, and for particular measures such as the Kullback–Leibler divergence, the Hellinger discrimination, the χ²-distance and the variation distance, are provided as well.
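The measures named in the abstract are all instances of the discrete Csiszár f-divergence, I_f(p, q) = Σᵢ qᵢ f(pᵢ/qᵢ), each obtained from a different convex generator f with f(1) = 0. The sketch below illustrates this (the names `f_divergence`, `kl`, `chi2`, `hellinger` and `variation` are illustrative, not taken from the chapter; qᵢ > 0 is assumed throughout):

```python
import math

def f_divergence(p, q, f):
    """Discrete Csiszar f-divergence I_f(p, q) = sum_i q_i * f(p_i / q_i).

    p and q are probability vectors of equal length with q_i > 0.
    """
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

# Standard convex generators with f(1) = 0:
kl        = lambda t: t * math.log(t) if t > 0 else 0.0  # Kullback-Leibler divergence
chi2      = lambda t: (t - 1.0) ** 2                     # chi^2-distance
hellinger = lambda t: 0.5 * (math.sqrt(t) - 1.0) ** 2    # Hellinger discrimination
variation = lambda t: abs(t - 1.0)                       # variation distance

p = [0.2, 0.5, 0.3]
q = [0.3, 0.4, 0.3]
print(f_divergence(p, q, kl))         # D(p || q) >= 0, with equality iff p = q
print(f_divergence(p, q, variation))  # equals sum_i |p_i - q_i|
```

Jensen's inequality applied to the convex generator f yields I_f(p, q) ≥ f(1) = 0; the reverses surveyed in the chapter bound I_f(p, q) from above.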
Copyright information
© 2015 Springer International Publishing Switzerland
About this chapter
Cite this chapter
Dragomir, S.S. (2015). A Survey of Reverse Inequalities for f-Divergence Measure in Information Theory. In: Daras, N., Rassias, M. (eds) Computation, Cryptography, and Network Security. Springer, Cham. https://doi.org/10.1007/978-3-319-18275-9_9
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-18274-2
Online ISBN: 978-3-319-18275-9
eBook Packages: Mathematics and Statistics (R0)