
Relations Between Information and Estimation in the Presence of Feedback

  • Himanshu Asnani
  • Kartik Venkat
  • Tsachy Weissman
Part of the Lecture Notes in Control and Information Sciences book series (LNCIS, volume 450)

Abstract

We discuss some of the recent literature on relations between information- and estimation-theoretic quantities. We begin by exploring the connections between mutual information and causal/non-causal, matched/mismatched estimation in the setting of a continuous-time source corrupted by white Gaussian noise. The relations involving mutual information and causal estimation, in both the matched and mismatched cases, persist in the presence of feedback. We present a new unified framework, based on Girsanov theory and Itô calculus, for deriving these relations, and conclude by using it to derive some new results.
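
For concreteness, here is a minimal sketch of the central relations alluded to above, stated in standard notation; the continuous-time AWGN channel model and the symbols below are the usual conventions, assumed for illustration rather than quoted from the chapter.

    % Channel model (standard continuous-time AWGN, assumed for illustration):
    %   dY_t = X_t dt + dW_t,  0 <= t <= T,
    % with W a standard Brownian motion independent of the noise.
    % Duncan's theorem: mutual information equals half the time-integrated
    % causal mean-square estimation error,
    \[
      I\bigl(X_0^T;\,Y_0^T\bigr)
        = \frac{1}{2}\int_0^T
          \mathbb{E}\!\left[\bigl(X_t - \mathbb{E}[X_t \mid Y_0^t]\bigr)^2\right]\mathrm{d}t .
    \]
    % Kadota, Zakai and Ziv showed that this identity persists with feedback:
    % if X_t = \phi_t(W, Y_0^t) for a message W, the right-hand side equals
    % I(W; Y_0^T). The mismatched counterpart: when the true source law is P
    % but the causal filter is matched to a law Q,
    \[
      D\bigl(P_{Y_0^T} \,\big\|\, Q_{Y_0^T}\bigr)
        = \frac{1}{2}\int_0^T
          \bigl(\mathrm{cmse}_{P,Q}(t) - \mathrm{cmse}_{P}(t)\bigr)\,\mathrm{d}t ,
    \]
    % where cmse_{P,Q}(t) = E_P[(X_t - E_Q[X_t | Y_0^t])^2] is the causal loss
    % of the Q-matched filter under P, and cmse_P(t) is the matched causal MMSE.

These are the identities whose feedback-robust versions the chapter derives via Girsanov's theorem and Itô's formula.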

Keywords

Mutual information · Relative entropy · Estimation loss · Standard Brownian motion · Gaussian channel

Acknowledgement

This research was supported by LCCC (Linnaeus Grant VR 2007-8646, Swedish Research Council).

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Himanshu Asnani
  • Kartik Venkat
  • Tsachy Weissman

  Electrical Engineering Department, Stanford University, Stanford, USA
