Communication with Contextual Uncertainty

Abstract

We introduce a simple model illustrating the utility of context in compressing communication and the challenge posed by uncertainty of knowledge of context. We consider a variant of distributional communication complexity where Alice gets some information \({X \in \{0,1\}^n}\) and Bob gets \({Y \in \{0,1\}^n}\), where (X, Y) is drawn from a known distribution, and Bob wishes to compute some function g(X, Y) or some close approximation to it (i.e., the output is g(X, Y) with high probability over (X, Y)). In our variant, Alice does not know g, but only knows some function f which is a very close approximation to g. Thus, the function being computed forms the context for the communication. It is an enormous implicit input, potentially described by a truth table of size \({2^n}\). Imprecise knowledge of this function models the (mild) uncertainty in this context.
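
To make the model concrete, the following is a minimal Python sketch (ours, not the paper's) of a one-way protocol harness in this setting: Alice sees only X and her function f, Bob sees only Y and Alice's message, and success is measured against g over samples from \({\mu}\). The distribution, the functions g and f, and the one-bit strategy alice_msg/bob_out are all hypothetical choices made purely for illustration.

```python
import random

n = 10  # input length; small enough to simulate quickly

def sample_mu():
    # Toy joint distribution mu: X is uniform over {0,1}^n and Y is a
    # noisy copy of X (each bit flips independently with probability 0.1).
    x = [random.randint(0, 1) for _ in range(n)]
    y = [xi ^ (random.random() < 0.1) for xi in x]
    return x, y

def g(x, y):
    # Bob's target function: the parity of x XOR y.
    return sum(xi ^ yi for xi, yi in zip(x, y)) % 2

def f(x, y):
    # Alice's function: agrees with g except on a single input pair,
    # so Pr[f(X,Y) != g(X,Y)] is o(1) under mu.
    if x == [0] * n and y == [0] * n:
        return 1 - g(x, y)
    return g(x, y)

def alice_msg(x):
    # Alice knows f but not g; this illustrative one-bit message is
    # f evaluated at (x, 0^n), which equals parity(x) except at x = 0^n.
    return f(x, [0] * n)

def bob_out(y, msg):
    # Bob combines Alice's bit with the parity of his own input.
    return (msg + sum(y)) % 2

trials = 100_000
correct = sum(
    bob_out(y, alice_msg(x)) == g(x, y)
    for x, y in (sample_mu() for _ in range(trials))
)
print(f"empirical success probability: {correct / trials:.4f}")
```

With these toy choices Bob errs only when X = 0^n (probability \({2^{-n}}\) under the uniform marginal), illustrating how a single disagreement between f and g can corrupt Alice's message for those inputs.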

We show that uncertainty can lead to a huge cost in communication. Specifically, we construct a distribution \({\mu}\) over \({(X,Y)\in \{0,1\}^n \times \{0,1\}^n}\) and a class of function pairs (f, g) which are very close (i.e., disagree with o(1) probability when (X, Y) are sampled according to \({\mu}\)), for which the communication complexity of f or g in the standard setting is one bit, whereas the (two-way) communication complexity in the uncertain setting is at least \({\Omega(\sqrt{n})}\) bits even when allowing a constant probability of error.

It turns out that this blow-up in communication complexity can be attributed in part to the mutual information between X and Y. In particular, we give an efficient protocol for communication under contextual uncertainty that incurs only a small blow-up in communication if this mutual information is small. Namely, we show that if g has a communication protocol with complexity k in the standard setting and the mutual information between X and Y is I, then g has a one-way communication protocol with complexity \({O((1+I)\cdot 2^k)}\) in the uncertain setting. This result is an immediate corollary of an even stronger result which shows that if g has one-way communication complexity k, then it has one-way uncertain-communication complexity at most \({O((1+I)\cdot k)}\). In the particular case where the input distribution is a product distribution (and so I = 0), the protocol in the uncertain setting only incurs a constant factor blow-up in one-way communication and error.
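
For reference, the two upper bounds just described can be written compactly as follows; the shorthand CC, owCC and owCC_U is introduced here for readability and is not notation from the paper.

```latex
% Shorthand (introduced here, not the paper's notation):
%   CC(g)       two-way communication complexity of g in the standard setting
%   owCC(g)     one-way communication complexity of g in the standard setting
%   owCC_U(g)   one-way communication complexity of g under contextual uncertainty
%   I = I(X;Y)  mutual information of the input distribution
\[
  \mathrm{owCC}(g) \le k \;\Longrightarrow\; \mathrm{owCC}_U(g) = O\!\left((1+I)\cdot k\right),
\]
\[
  \mathrm{CC}(g) \le k \;\Longrightarrow\; \mathrm{owCC}_U(g) = O\!\left((1+I)\cdot 2^{k}\right).
\]
```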

Author information

Correspondence to Madhu Sudan.

About this article

Cite this article

Ghazi, B., Komargodski, I., Kothari, P.K. et al. Communication with Contextual Uncertainty. comput. complex. 27, 463–509 (2018). https://doi.org/10.1007/s00037-017-0161-3
