Theory of Computing Systems, Volume 62, Issue 3, pp 583–599

On Slepian–Wolf Theorem with Interaction



Abstract

In this paper we study interactive “one-shot” analogues of the classical Slepian–Wolf theorem. Alice receives a value of a random variable X, Bob receives a value of another random variable Y that is jointly distributed with X. Alice’s goal is to transmit X to Bob (with some error probability ε). Instead of one-way transmission we allow them to interact; they may also use shared randomness. We show that for every natural number r Alice can transmit X to Bob using \(\left(1 + \frac{1}{r}\right)H(X|Y) + r + O\left(\log_2 \frac{1}{\varepsilon}\right)\) bits on average, in \(\frac{2H(X|Y)}{r} + 2\) rounds on average. Setting \(r = \lceil\sqrt{H(X|Y)}\rceil\) and using a result of Braverman and Garg [2], we conclude that every one-round protocol π with information complexity I can be compressed to a (many-round) protocol with expected communication about \(I + 2\sqrt{I}\) bits. This improves a result of Braverman and Rao [3], who obtained \(I + 5\sqrt{I}\). Further, we show (by setting r = ⌈H(X|Y)⌉) how to solve this problem (transmitting X) using \(2H(X|Y) + O\left(\log_2 \frac{1}{\varepsilon}\right)\) bits and 4 rounds on average. This improves a result of Brody et al. [4], who obtained \(4H(X|Y) + O\left(\log_2 \frac{1}{\varepsilon}\right)\) bits and 10 rounds on average. At the end of the paper we discuss how many bits Alice and Bob may need to communicate on average beyond H(X|Y). The main question is whether the upper bounds mentioned above are tight. We provide an example of a pair (X, Y) such that transmission of X from Alice to Bob with error probability ε requires \(H(X|Y) + {\Omega}\left(\log_2 \frac{1}{\varepsilon}\right)\) bits on average.
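As a quick sanity check of the two parameter choices quoted above, write H for H(X|Y) and ignore the ceilings and the \(O\left(\log_2 \frac{1}{\varepsilon}\right)\) overhead; the following arithmetic is ours, not taken from the paper:

\[
r = \sqrt{H}:\quad \left(1 + \tfrac{1}{\sqrt{H}}\right)H + \sqrt{H} = H + 2\sqrt{H}, \qquad \text{rounds: } \tfrac{2H}{\sqrt{H}} + 2 = 2\sqrt{H} + 2;
\]
\[
r = H:\quad \left(1 + \tfrac{1}{H}\right)H + H = 2H + 1, \qquad \text{rounds: } \tfrac{2H}{H} + 2 = 4.
\]

The first choice, combined with the Braverman–Garg result [2], yields the \(I + 2\sqrt{I}\) compression bound; the second yields the 4-round protocol with \(2H(X|Y) + O\left(\log_2 \frac{1}{\varepsilon}\right)\) bits.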


Keywords: Slepian–Wolf theorem · Communication complexity · Information complexity · Interactive compression


References

  1. Bauer, B., Moran, S., Yehudayoff, A.: Internal compression of protocols to entropy. In: Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques, p. 481 (2015)
  2. Braverman, M., Garg, A.: Public vs private coin in bounded-round information. In: Automata, Languages, and Programming, pp. 502–513. Springer (2014)
  3. Braverman, M., Rao, A.: Information equals amortized communication. In: IEEE 52nd Annual Symposium on Foundations of Computer Science (FOCS 2011), pp. 748–757. IEEE (2011)
  4. Brody, J., Buhrman, H., Koucky, M., Loff, B., Speelman, F., Vereshchagin, N.: Towards a reverse Newman’s theorem in interactive information complexity. In: IEEE Conference on Computational Complexity (CCC 2013), pp. 24–33. IEEE (2013)
  5. Kushilevitz, E., Nisan, N.: Communication Complexity. Cambridge University Press (2006)
  6. Newman, I.: Private vs. common random bits in communication complexity. Inf. Process. Lett. 39(2), 67–71 (1991)
  7. Orlitsky, A.: Average-case interactive communication. IEEE Trans. Inf. Theory 38(5), 1534–1547 (1992)
  8. Shannon, C.E.: A mathematical theory of communication. ACM SIGMOBILE Mobile Computing and Communications Review 5(1), 3–55 (2001)
  9. Slepian, D., Wolf, J.K.: Noiseless coding of correlated information sources. IEEE Trans. Inf. Theory 19(4), 471–480 (1973)
  10. Weinstein, O.: Information complexity and the quest for interactive compression. ACM SIGACT News 46(2), 41–64 (2015)
  11. Yeung, R.W.: Information Theory and Network Coding. Springer (2008)

Copyright information

© Springer Science+Business Media New York 2017

Authors and Affiliations

  1. National Research University Higher School of Economics, Moscow State University, Moscow, Russian Federation
