Abstract
We refine Shannon’s inequality, in its discrete and integral forms, by presenting upper estimates of the difference between its two sides. Applications to some bounds in information theory are given.
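The inequality in question is the classical Shannon inequality: for probability distributions p and q, the entropy H(p) never exceeds the cross-entropy −∑ pᵢ log qᵢ, and the difference between the two sides is the Kullback–Leibler divergence D(p‖q) ≥ 0. A minimal numerical sketch (the distributions `p` and `q` are illustrative choices, not taken from the chapter):

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """Cross-entropy -sum p_i log q_i; Shannon's inequality
    states H(p) <= cross_entropy(p, q), with equality iff p == q."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

# Shannon's inequality: the two sides differ by D(p||q) >= 0
gap = cross_entropy(p, q) - entropy(p)
assert gap >= 0
```

The chapter's refinements bound this gap from above; the sketch only verifies the basic lower bound of zero.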
Copyright information
© 2000 Springer Science+Business Media Dordrecht
Cite this chapter
Matić, M., Pearce, C.E.M., Pečarić, J. (2000). Shannon’s and Related Inequalities in Information Theory. In: Survey on Classical Inequalities. Mathematics and Its Applications, vol 517. Springer, Dordrecht. https://doi.org/10.1007/978-94-011-4339-4_5
Print ISBN: 978-94-010-5868-1
Online ISBN: 978-94-011-4339-4