Min-Entropy Leakage of Channels in Cascade

  • Barbara Espinoza
  • Geoffrey Smith
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7140)


Theories of quantitative information flow offer an attractive framework for analyzing confidentiality in practical systems, which often cannot avoid “small” leaks of confidential information. Recently there has been growing interest in the theory of min-entropy leakage, which measures uncertainty based on a random variable’s vulnerability to being guessed in one try by an adversary. Here we contribute to this theory by studying the min-entropy leakage of systems formed by cascading two channels together, using the output of the first channel as the input to the second channel. After considering the semantics of cascading carefully and exposing some technical subtleties, we prove that the min-entropy leakage of a cascade of two channels cannot exceed the leakage of the first channel; this result is a min-entropy analogue of the classic data-processing inequality. We show however that a comparable bound does not hold for the second channel. We then consider the min-capacity, or maximum leakage over all a priori distributions, showing that the min-capacity of a cascade of two channels cannot exceed the min-capacity of either channel.
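The quantities discussed above can be illustrated concretely. The sketch below, using standard definitions of min-entropy leakage (the log-ratio of posterior to prior vulnerability) and min-capacity (the log of the sum of column maxima of the channel matrix), checks the paper's two bounds numerically on small hypothetical channel matrices `A` and `B`; the cascade is the matrix product of the two channels. The specific matrices and prior are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def min_entropy_leakage(prior, C):
    """Min-entropy leakage of channel matrix C under the given prior on X.

    prior: 1-D array, p(x); C: matrix with C[x, y] = p(y | x).
    """
    v_prior = prior.max()                 # prior vulnerability: best one-try guess of X
    joint = prior[:, None] * C            # joint distribution p(x, y)
    v_post = joint.max(axis=0).sum()      # posterior vulnerability: guess X after seeing Y
    return np.log2(v_post / v_prior)

def min_capacity(C):
    """Maximum min-entropy leakage over all priors: log2 of the sum of column maxima."""
    return np.log2(C.max(axis=0).sum())

# Hypothetical channels: A maps X to Y, B maps Y to Z.
A = np.array([[0.8, 0.2],
              [0.3, 0.7]])
B = np.array([[0.9, 0.1],
              [0.4, 0.6]])
cascade = A @ B                           # the cascade uses A's output as B's input
prior = np.array([0.5, 0.5])

# Data-processing-style bound: the cascade leaks no more than the first channel.
assert min_entropy_leakage(prior, cascade) <= min_entropy_leakage(prior, A) + 1e-12
# Min-capacity of the cascade is bounded by the min-capacity of *either* channel.
assert min_capacity(cascade) <= min(min_capacity(A), min_capacity(B)) + 1e-12
```

Note that the leakage bound holds only with respect to the first channel; as the abstract states, no comparable per-prior bound holds for the second channel, which is why the capacity bound over both channels is stated separately.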


Keywords: Conditional Probability, Mutual Information, Joint Distribution, Channel Matrix, Differential Privacy




Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Barbara Espinoza (1)
  • Geoffrey Smith (1)
  1. School of Computing and Information Sciences, Florida International University, Miami, USA
