TCC 2016: Theory of Cryptography pp 183-203

# Pseudoentropy: Lower-Bounds for Chain Rules and Transformations

Conference paper

DOI: 10.1007/978-3-662-53641-4_8

Volume 9985 of the book series Lecture Notes in Computer Science (LNCS)
Cite this paper as:
Pietrzak K., Skórski M. (2016) Pseudoentropy: Lower-Bounds for Chain Rules and Transformations. In: Hirt M., Smith A. (eds) Theory of Cryptography. TCC 2016. Lecture Notes in Computer Science, vol 9985. Springer, Berlin, Heidelberg

## Abstract

Computational notions of entropy have recently found many applications, including leakage-resilient cryptography, deterministic encryption and memory delegation. The two main types of results which make computational notions so useful are (1) chain rules, which quantify by how much the computational entropy of a variable decreases if conditioned on some other variable, and (2) transformations, which quantify to what extent one type of entropy implies another.

Such chain rules and transformations typically lose a significant amount in the quality of the entropy, and are the reason why one gets rather weak quantitative security bounds when applying these results. In this paper we prove, for the first time, lower bounds in this context, showing that existing results for transformations are, unfortunately, basically optimal for non-adaptive black-box reductions (and it’s hard to imagine how non-black-box reductions or adaptivity could be useful here).

A variable X has k bits of HILL entropy of quality $$(\epsilon ,s)$$ if there exists a variable Y with k bits min-entropy which cannot be distinguished from X with advantage $$\epsilon$$ by distinguishing circuits of size s. A weaker notion is Metric entropy, where we switch quantifiers, and only require that for every distinguisher of size s, such a Y exists.
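The quantifier swap between the two notions can be stated formally. The following is an illustrative sketch using standard notation not taken verbatim from this page: $$H_\infty$$ denotes min-entropy, and we write $$H^{\mathrm{HILL}}_{\epsilon,s}$$ and $$H^{\mathrm{Metric}}_{\epsilon,s}$$ for the two entropy notions of quality $$(\epsilon,s)$$.

```latex
% HILL entropy: a single Y must fool all distinguishers of size s
H^{\mathrm{HILL}}_{\epsilon,s}(X) \ge k \;\iff\;
  \exists\, Y,\ H_\infty(Y) \ge k :\ \forall D \text{ of size } s,\
  \bigl|\Pr[D(X)=1] - \Pr[D(Y)=1]\bigr| \le \epsilon

% Metric entropy: quantifiers swapped, Y may depend on the distinguisher D
H^{\mathrm{Metric}}_{\epsilon,s}(X) \ge k \;\iff\;
  \forall D \text{ of size } s :\ \exists\, Y,\ H_\infty(Y) \ge k,\
  \bigl|\Pr[D(X)=1] - \Pr[D(Y)=1]\bigr| \le \epsilon
```

Since the single Y witnessing HILL entropy works against every distinguisher in particular, HILL trivially implies Metric with the same parameters; the converse direction is what requires a lossy transformation.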

We first describe our result concerning transformations. By definition, HILL implies Metric without any loss in quality. Metric entropy often comes up in applications, but must be transformed to HILL for meaningful security guarantees. The best known result states that if a variable X has k bits of Metric entropy of quality $$(\epsilon ,s)$$, then it has k bits of HILL entropy with quality $$(2\epsilon ,s\cdot \epsilon ^2)$$. We show that this loss of a factor $${\varOmega }(\epsilon ^{-2})$$ in circuit size is necessary. In fact, we show the stronger result that this loss is already necessary when transforming so-called deterministic real-valued Metric entropy to randomised boolean Metric entropy (both these variants of Metric entropy are implied by HILL without loss in quality).
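In symbols, the known transformation and the lower bound shown here read as follows (a sketch; we write $$H^{\mathrm{Metric}}_{\epsilon,s}$$ and $$H^{\mathrm{HILL}}_{\epsilon,s}$$ for Metric and HILL entropy of quality $$(\epsilon,s)$$, notation introduced here for illustration):

```latex
% Known upper bound (Metric-to-HILL transformation):
H^{\mathrm{Metric}}_{\epsilon,\,s}(X) \ge k
  \;\Longrightarrow\;
H^{\mathrm{HILL}}_{2\epsilon,\; s\cdot\epsilon^2}(X) \ge k

% Lower bound of this paper: for non-adaptive black-box reductions,
% the multiplicative loss in circuit size must be at least
\varOmega(\epsilon^{-2})
```

So for the transformation question the upper and lower bounds match up to constants: the $$\epsilon^2$$ shrinkage of the circuit size is unavoidable.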

The chain rule for HILL entropy states that if X has k bits of HILL entropy of quality $$(\epsilon ,s)$$, then for any variable Z of length m, X conditioned on Z has $$k-m$$ bits of HILL entropy with quality $$(\epsilon ,s\cdot \epsilon ^2/ 2^{m})$$. We show that a loss of $${\varOmega }(2^m/\epsilon )$$ in circuit size is necessary here. Note that this still leaves a gap of a factor $$\epsilon$$ between the known bound and our lower bound.
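The chain-rule situation can be summarised symbolically as follows (again a sketch; $$H^{\mathrm{HILL}}_{\epsilon,s}(X\mid Z)$$ denotes conditional HILL entropy of quality $$(\epsilon,s)$$, notation introduced here for illustration):

```latex
% Known chain rule, for Z of length m:
H^{\mathrm{HILL}}_{\epsilon,\,s}(X) \ge k
  \;\Longrightarrow\;
H^{\mathrm{HILL}}_{\epsilon,\; s\cdot\epsilon^2/2^m}(X \mid Z) \ge k - m

% Lower bound of this paper: a circuit-size loss of
\varOmega(2^m/\epsilon)
% is necessary, versus the 2^m/\epsilon^2 loss of the known bound,
% leaving a gap of a factor \epsilon.
```

Closing this remaining $$\epsilon$$ gap, in either direction, is left open.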