
On some extremal problems for mutual information and entropy


Abstract

The problem of determining the maximum mutual information I(X; Y) and the minimum joint entropy H(X, Y) of a pair of discrete random variables X and Y is considered under the condition that the probability distribution of X is fixed and the error probability Pr{Y ≠ X} takes a given value ε, 0 ≤ ε ≤ 1. Precise values of these extrema are found, which in a number of special cases allows explicit formulas for both the maximum mutual information and the minimum entropy to be obtained in terms of the probability distribution of X and the parameter ε.
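To make the extremal problem concrete, the following minimal Python sketch numerically approximates both extrema for a small alphabet by local optimization over the polytope of joint distributions with the prescribed X-marginal and error probability. This is not the paper's method: the function names, the SciPy SLSQP solver, and the multi-start strategy are illustrative assumptions, and since minimizing the concave joint entropy over a polytope is a nonconvex problem, the solver is restarted from several random feasible-looking points.

    import numpy as np
    from scipy.optimize import minimize

    def mutual_information(P):
        # I(X;Y) in bits for a joint probability matrix P[x, y]
        px = P.sum(axis=1, keepdims=True)
        py = P.sum(axis=0, keepdims=True)
        mask = P > 1e-12  # avoid log of zero entries
        return float(np.sum(P[mask] * np.log2(P[mask] / (px @ py)[mask])))

    def joint_entropy(P):
        # H(X,Y) in bits
        mask = P > 1e-12
        return float(-np.sum(P[mask] * np.log2(P[mask])))

    def extremize(p_x, eps, objective, n_starts=20, seed=0):
        """Locally minimize `objective` over joint matrices P(x, y) with
        fixed X-marginal p_x and error probability Pr{Y != X} = eps."""
        n = len(p_x)
        off_diag = 1.0 - np.eye(n)
        rng = np.random.default_rng(seed)
        cons = (
            # each row of P must sum to the prescribed p_x[i]
            [{'type': 'eq',
              'fun': (lambda v, i=i: v.reshape(n, n)[i].sum() - p_x[i])}
             for i in range(n)]
            # total off-diagonal mass must equal the error probability eps
            + [{'type': 'eq',
                'fun': lambda v: (v.reshape(n, n) * off_diag).sum() - eps}]
        )
        best = None
        for _ in range(n_starts):
            # random start with the correct X-marginal (rows scaled by p_x)
            v0 = (rng.dirichlet(np.ones(n), size=n) * p_x[:, None]).ravel()
            res = minimize(lambda v: objective(v.reshape(n, n)), v0,
                           method='SLSQP', bounds=[(0.0, 1.0)] * (n * n),
                           constraints=cons)
            if res.success and (best is None or res.fun < best.fun):
                best = res
        return best

    p_x = np.array([0.5, 0.3, 0.2])  # hypothetical distribution of X
    eps = 0.1                        # given error probability
    max_I = extremize(p_x, eps, lambda P: -mutual_information(P))
    min_H = extremize(p_x, eps, joint_entropy)
    print('max I(X;Y) ~', -max_I.fun, 'bits')
    print('min H(X,Y) ~', min_H.fun, 'bits')

In the special cases where the paper gives explicit formulas in terms of the distribution of X and the parameter ε, such numerical estimates can serve as a sanity check against the closed-form values.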




Author information

Correspondence to V. V. Prelov.

Additional information

Original Russian Text © V.V. Prelov, 2016, published in Problemy Peredachi Informatsii, 2016, Vol. 52, No. 4, pp. 3–13.

Supported in part by the Russian Foundation for Basic Research, project no. 15-01-08051.


About this article


Cite this article

Prelov, V.V. On some extremal problems for mutual information and entropy. Probl Inf Transm 52, 319–328 (2016). https://doi.org/10.1134/S0032946016040013

