Abstract
The problem of determining the maximum mutual information I(X; Y) and the minimum joint entropy H(X, Y) of a pair of discrete random variables X and Y is considered under the condition that the probability distribution of X is fixed and the error probability Pr{Y ≠ X} takes a given value ε, 0 ≤ ε ≤ 1. Precise values of these quantities are found, which in several cases allows explicit formulas for both the maximum mutual information and the minimum joint entropy to be obtained in terms of the probability distribution of X and the parameter ε.
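As a rough numerical illustration of the extremal problem (not the paper's method), the sketch below brute-forces the case of binary X. With P(X = 0) = q fixed and Pr{Y ≠ X} = ε, the joint distribution of (X, Y) has one free parameter t = P(X = 0, Y = 1); scanning t over its feasible range yields the maximum of I(X; Y) and the minimum of H(X, Y). All function names here are illustrative.

```python
import math

def entropy(ps):
    """Shannon entropy (bits) of a probability vector; 0*log 0 is taken as 0."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

def extremes_binary(q, eps, steps=2000):
    """Grid search over joint pmfs of (X, Y) on {0,1}x{0,1} with
    P(X=0) = q fixed and P(Y != X) = eps.
    Returns (max over joints of I(X;Y), min over joints of H(X,Y))."""
    # Feasible range for t = P(X=0, Y=1): the off-diagonal masses
    # t and eps - t must fit inside the marginals q and 1 - q.
    lo = max(0.0, eps - (1.0 - q))
    hi = min(eps, q)
    best_i, best_h = -1.0, float("inf")
    for k in range(steps + 1):
        t = lo + (hi - lo) * k / steps
        # Joint pmf ordered as (p00, p01, p10, p11).
        joint = [q - t, t, eps - t, 1.0 - q - (eps - t)]
        joint = [max(p, 0.0) for p in joint]  # clip rounding noise
        px = [joint[0] + joint[1], joint[2] + joint[3]]
        py = [joint[0] + joint[2], joint[1] + joint[3]]
        h_xy = entropy(joint)
        i_xy = entropy(px) + entropy(py) - h_xy  # I(X;Y) = H(X)+H(Y)-H(X,Y)
        best_i = max(best_i, i_xy)
        best_h = min(best_h, h_xy)
    return best_i, best_h
```

For ε = 0 the only feasible joint is the diagonal one, so the search recovers max I(X; Y) = min H(X, Y) = H(X), as expected; for 0 < ε ≤ 1 it gives a numerical picture of the quantities the paper determines exactly.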
Additional information
Original Russian Text © V.V. Prelov, 2016, published in Problemy Peredachi Informatsii, 2016, Vol. 52, No. 4, pp. 3–13.
Supported in part by the Russian Foundation for Basic Research, project no. 15-01-08051.
Cite this article
Prelov, V.V. On some extremal problems for mutual information and entropy. Probl Inf Transm 52, 319–328 (2016). https://doi.org/10.1134/S0032946016040013