Abstract
We prove a global log-convexity property of Csiszár's f-divergence. As a consequence, the relative s-information measure is log-convex for all s ∈ ℝ, from which some new inequalities connecting the Kullback–Leibler divergence with the χ² and Hellinger distances are derived.
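For orientation, the quantities named above have the following standard definitions (a sketch only; the exact normalizations used in the paper may differ):

```latex
% Csiszár's f-divergence of discrete probability distributions
% P = (p_i), Q = (q_i), generated by a convex function f on
% (0, \infty) with f(1) = 0:
\[
  D_f(P \,\|\, Q) \;=\; \sum_i q_i \, f\!\left(\frac{p_i}{q_i}\right).
\]
% Standard special cases:
% f(t) = t \log t gives the Kullback--Leibler divergence
\[
  D(P \,\|\, Q) \;=\; \sum_i p_i \log \frac{p_i}{q_i};
\]
% f(t) = (t-1)^2 gives the chi-squared distance
\[
  \chi^2(P \,\|\, Q) \;=\; \sum_i \frac{(p_i - q_i)^2}{q_i};
\]
% f(t) = \tfrac12 (\sqrt{t} - 1)^2 gives the squared Hellinger distance
\[
  h^2(P, Q) \;=\; \tfrac12 \sum_i \bigl(\sqrt{p_i} - \sqrt{q_i}\bigr)^2 .
\]
```

Log-convexity of a one-parameter family K_s in s means K_{(s₁+s₂)/2}² ≤ K_{s₁} K_{s₂}, which is the mechanism by which such interpolation inequalities between the special cases arise.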
References
G. H. Hardy, J. E. Littlewood and G. Pólya, Inequalities, Cambridge University Press (Cambridge, 1978).
H. Jeffreys, An invariant form for the prior probability in estimation problems, Proc. Roy. Soc. Lon., Ser. A, 186 (1946), 453–461.
S. Kullback and R. A. Leibler, On information and sufficiency, Annals of Math. Stat., 22 (1951), 79–86.
A. Rényi, On measures of entropy and information, in: Proc. IV Berkeley Symp. Math. Statist. Prob., Vol. 1, University of California Press (Berkeley, 1961).
I. Csiszár, Information-type measures of difference of probability distributions and indirect observations, Studia Sci. Math. Hungar., 2 (1967), 299–318.
I. J. Taneja, New developments in generalized information measures, Advances in Imaging and Electron Physics, 91 (1995), 37–135.
I. Vajda, Theory of Statistical Inference and Information, Kluwer Academic Press (London, 1989).
Simic, S. On certain new inequalities in information theory. Acta Math Hung 124, 353–361 (2009). https://doi.org/10.1007/s10474-009-8205-z
Key words and phrases
- logarithmic convexity
- Csiszár's f-divergence
- relative s-information
- Hellinger distance
- α-order divergence