Predictive Models for Min-entropy Estimation

  • John Kelsey
  • Kerry A. McKay
  • Meltem Sönmez Turan
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9293)

Abstract

Random numbers are essential for cryptography. In most real-world systems, these values come from a cryptographic pseudorandom number generator (PRNG), which in turn is seeded by an entropy source. The security of the entire cryptographic system then relies on the accuracy of the claimed amount of entropy provided by the source. If the entropy source provides less unpredictability than is expected, the security of the cryptographic mechanisms is undermined, as in [5, 7, 10]. For this reason, correctly estimating the amount of entropy available from a source is critical.
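
The quantity at issue is min-entropy, the most conservative common entropy measure: it is determined entirely by the probability of the source's most likely output, and so captures an attacker's best single-guess success probability. For a sample distributed as X, the standard definition (assumed here as background, not quoted from the paper) is

    H_\infty(X) = -\log_2 \max_x \Pr[X = x]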

In this paper, we develop a set of tools for estimating entropy, based on mechanisms that attempt to predict the next sample in a sequence based on all previous samples. These mechanisms are called predictors. We develop a framework for using predictors to estimate entropy, and test them experimentally against both simulated and real noise sources. For comparison, we subject the entropy estimates defined in the August 2012 draft of NIST Special Publication 800-90B [4] to the same tests, and compare their performance.
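
To make the predictor idea concrete, the Python sketch below is illustrative only, not the paper's exact construction; the function name and the biased-bit example are ours. It implements one simple predictor, a most-common-value predictor: before each sample it predicts the value seen most often so far, and its observed accuracy p yields the per-sample min-entropy estimate -log2(p). The paper's estimators additionally apply an upper confidence bound to p and a run-based "local" estimate, both omitted here.

    import math
    import random
    from collections import Counter

    def mcv_predictor_min_entropy(samples):
        """Per-sample min-entropy estimate from a most-common-value predictor.

        Before seeing each sample, predict the value observed most often so
        far; the fraction of correct predictions estimates the source's
        predictability p, and the entropy estimate is -log2(p).
        Illustrative sketch only: the confidence-bound and run-based
        refinements used in the paper are omitted.
        """
        counts = Counter()
        correct = 0
        for i, x in enumerate(samples):
            if i > 0 and counts.most_common(1)[0][0] == x:
                correct += 1
            counts[x] += 1
        # Predictability can never fall below 1/(alphabet size).
        p = max(correct / (len(samples) - 1), 1 / len(counts))
        return -math.log2(p)

    # Biased bit source: 0 with probability 0.9. True min-entropy is
    # -log2(0.9) ~ 0.152 bits/sample, while Shannon entropy is ~0.469.
    random.seed(1)
    bits = [0 if random.random() < 0.9 else 1 for _ in range(100000)]
    print(f"estimate: {mcv_predictor_min_entropy(bits):.3f} bits/sample")

On this example the estimate should land near 0.15 bits per sample, close to the true min-entropy, whereas an estimator aimed at Shannon entropy would report roughly three times as much; this illustrates why predictor accuracy is a natural yardstick for a worst-case measure.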

Keywords

Entropy estimation · Min-entropy · Random number generation

Acknowledgments

We would like to thank Stefan Lucks for suggesting a performance metric that considers runs of correct predictions. We would also like to thank Tim Hall for his implementations of the entropy estimates in [9], and John Deneker, Tim Hall, and Sonu Sankur for providing samples from real-world noise sources for testing.

References

  1. Cryptography Research Inc.: Evaluation of VIA C3 Nehemiah random number generator. Technical report. http://www.cryptography.com/public/pdf/VIA_rng.pdf. Accessed 27 Feb 2003
  2. Antos, A., Kontoyiannis, I.: Convergence properties of functional estimates for discrete distributions. Random Struct. Algorithm. 19(3–4), 163–193 (2001). http://dx.doi.org/10.1002/rsa.10019
  3. Barak, B., Halevi, S.: A model and architecture for pseudo-random generation with applications to /dev/random. In: Proceedings of the 12th ACM Conference on Computer and Communications Security, CCS 2005, pp. 203–212. ACM, New York (2005). http://doi.acm.org/10.1145/1102120.1102148
  4. Barker, E., Kelsey, J.: NIST draft special publication 800-90B: recommendation for the entropy sources used for random bit generation, August 2012. http://csrc.nist.gov/publications/drafts/800-90/draft-sp800-90b.pdf
  5. Dorrendorf, L., Gutterman, Z., Pinkas, B.: Cryptanalysis of the random number generator of the Windows operating system. ACM Trans. Inf. Syst. Secur. 13(1), 10:1–10:32 (2009)
  6. Feller, W.: An Introduction to Probability Theory and Its Applications, vol. 1, Chap. 13. Wiley, New York (1950)
  7. Gutterman, Z., Pinkas, B., Reinman, T.: Analysis of the Linux random number generator. In: Proceedings of the 2006 IEEE Symposium on Security and Privacy, SP 2006, pp. 371–385. IEEE Computer Society, Washington, DC (2006). http://dx.doi.org/10.1109/SP.2006.5
  8. Hagerty, P.: Presentation of non-IID tests. In: NIST Random Bit Generation Workshop (2012). http://csrc.nist.gov/groups/ST/rbg_workshop_2012/hagerty.pdf
  9. Hagerty, P., Draper, T.: Entropy bounds and statistical tests. In: NIST Random Bit Generation Workshop (2012). http://csrc.nist.gov/groups/ST/rbg_workshop_2012/hagerty_entropy_paper.pdf
  10. Heninger, N., Durumeric, Z., Wustrow, E., Halderman, J.A.: Mining your Ps and Qs: detection of widespread weak keys in network devices. In: Proceedings of the 21st USENIX Security Symposium, August 2012
  11. Kontoyiannis, I., Algoet, P., Suhov, Y.M., Wyner, A.: Nonparametric entropy estimation for stationary processes and random fields, with applications to English text. IEEE Trans. Inf. Theory 44, 1319–1327 (1998)
  12. Lauradoux, C., Ponge, J., Roeck, A.: Online entropy estimation for non-binary sources and applications on iPhone. Research report RR-7663, INRIA, June 2011. https://hal.inria.fr/inria-00604857
  13. Maurer, U.M.: A universal statistical test for random bit generators. J. Cryptol. 5(2), 89–105 (1992). http://dx.doi.org/10.1007/BF00193563
  14. FIPS PUB 140-2: Security requirements for cryptographic modules. U.S. Department of Commerce/National Institute of Standards and Technology (2002)
  15. Paninski, L.: Estimation of entropy and mutual information. Neural Comput. 15(6), 1191–1253 (2003). http://dx.doi.org/10.1162/089976603321780272
  16.
  17. Sayood, K.: Introduction to Data Compression, 3rd edn., Chap. 5. Morgan Kaufmann, San Francisco (2006)
  18. Shannon, C.: Prediction and entropy of printed English. Bell Syst. Tech. J. 30, 50–64 (1951). https://archive.org/details/bstj30-1-50
  19.
  20. Wyner, A.D., Ziv, J.: Some asymptotic properties of the entropy of a stationary ergodic data source with applications to data compression. IEEE Trans. Inf. Theory 35(6), 1250–1258 (1989)

Copyright information

© International Association for Cryptologic Research 2015

Authors and Affiliations

  • John Kelsey (1)
  • Kerry A. McKay (1)
  • Meltem Sönmez Turan (1, 2)

  1. National Institute of Standards and Technology, Gaithersburg, USA
  2. Dakota Consulting Inc., Silver Spring, USA
