Mutual Information Analysis: How, When and Why?

  • Nicolas Veyrat-Charvillon
  • François-Xavier Standaert
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5747)

Abstract

Mutual Information Analysis (MIA) is a generic side-channel distinguisher that was introduced at CHES 2008. This paper brings three contributions with respect to its practical applicability. First, we emphasize that the MIA principle can be seen as a toolbox into which different (more or less effective) statistical methods can be plugged. Doing so, we introduce interesting alternatives to the original proposal. Second, we discuss the contexts in which the MIA can lead to successful key recoveries with a lower data complexity than classical attacks such as those using Pearson’s correlation coefficient. We show that such contexts exist in practically meaningful situations and analyze them statistically. Finally, we study the connections and differences between the MIA and a framework for the analysis of side-channel key recovery published at Eurocrypt 2009. We show that the MIA can be used to compare two leaking devices only if the discrete models used by an adversary to mount an attack perfectly correspond to the physical leakages.
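As a rough illustration of the distinguisher principle summarized above (not the authors' implementation), the following Python sketch scores key guesses by the mutual information between a discrete Hamming-weight model of an S-box output and one leakage sample per trace, estimated with simple histograms as in the original CHES 2008 proposal. The names (`mia_scores`, `SBOX`), the bin count and the leakage model are illustrative assumptions; variants of the attack simply plug other density estimators into the same loop.

```python
import numpy as np

# Hypothetical stand-in for an S-box: a fixed random permutation of 0..255,
# used only to keep the sketch self-contained.
rng = np.random.default_rng(0)
SBOX = rng.permutation(256)

def hamming_weight(x):
    """Hamming weight of an 8-bit value: a common discrete leakage model."""
    return bin(int(x)).count("1")

def mia_scores(traces, plaintexts, n_bins=8):
    """Score every key guess by the estimated mutual information between a
    discrete leakage model (Hamming weight of the S-box output) and a single
    leakage sample per trace, using plain histograms as density estimators."""
    traces = np.asarray(traces, dtype=float)
    scores = np.zeros(256)
    # Fixed bin edges over the whole leakage range, shared by all key guesses.
    edges = np.histogram_bin_edges(traces, bins=n_bins)
    p_l = np.histogram(traces, bins=edges)[0] / len(traces)
    h_l = -np.sum(p_l[p_l > 0] * np.log2(p_l[p_l > 0]))  # marginal entropy H(L)
    for k in range(256):
        model = np.array([hamming_weight(SBOX[p ^ k]) for p in plaintexts])
        h_l_given_m = 0.0
        for m in np.unique(model):
            subset = traces[model == m]
            p = np.histogram(subset, bins=edges)[0] / len(subset)
            h = -np.sum(p[p > 0] * np.log2(p[p > 0]))
            h_l_given_m += len(subset) / len(traces) * h  # weight by Pr[M = m]
        scores[k] = h_l - h_l_given_m  # I(L; M) = H(L) - H(L | M)
    return scores  # the correct key should maximize the estimated MI
```

For comparison, a classical correlation attack would keep the same key-guess loop but replace the mutual information estimate by Pearson's correlation coefficient between the model values and the leakage samples.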

References

  1. Ali, S.M., Silvey, S.D.: A general class of coefficients of divergence of one distribution from another. Journal of the Royal Statistical Society, Series B (Methodological) 28(1), 131–142 (1966)
  2. Anderson, T.W.: On the distribution of the two-sample Cramér–von Mises criterion. The Annals of Mathematical Statistics 33(3), 1148–1159 (1962)
  3. Aumonier, S.: Generalized correlation power analysis. In: ECRYPT Workshop on Tools for Cryptanalysis, Kraków, Poland (September 2007)
  4. Bickel, P., Levina, E.: The Earth Mover’s distance is the Mallows distance: some insights from statistics. In: IEEE International Conference on Computer Vision (ICCV 2001), vol. 2, pp. 251–256 (2001)
  5. Brier, E., Clavier, C., Olivier, F.: Correlation power analysis with a leakage model. In: Joye, M., Quisquater, J.-J. (eds.) CHES 2004. LNCS, vol. 3156, pp. 16–29. Springer, Heidelberg (2004)
  6. Chari, S., Rao, J., Rohatgi, P.: Template attacks. In: Kaliski Jr., B.S., Koç, Ç.K., Paar, C. (eds.) CHES 2002. LNCS, vol. 2523, pp. 13–28. Springer, Heidelberg (2003)
  7. Cover, T.M., Thomas, J.A.: Elements of Information Theory. Wiley, Chichester (1991)
  8. Csiszár, I.: Information-type measures of difference of probability distributions and indirect observation. Studia Sci. Math. Hungar. 2, 229–318 (1967)
  9. Csiszár, I., Shields, P.C.: Information theory and statistics: a tutorial. Commun. Inf. Theory 1(4), 417–528 (2004)
  10. DPA Contest 2008/2009, http://www.dpacontest.org/
  11. Freedman, D., Diaconis, P.: On the histogram as a density estimator. Probability Theory and Related Fields 57(4), 453–476 (1981)
  12. Gierlichs, B., Batina, L., Tuyls, P., Preneel, B.: Mutual information analysis. In: Oswald, E., Rohatgi, P. (eds.) CHES 2008. LNCS, vol. 5154, pp. 426–442. Springer, Heidelberg (2008)
  13. Härdle, W.: Smoothing Techniques: With Implementation in S. Springer Series in Statistics. Springer (1990)
  14. Kocher, P., Jaffe, J., Jun, B.: Differential power analysis. In: Wiener, M. (ed.) CRYPTO 1999. LNCS, vol. 1666, pp. 388–412. Springer, Heidelberg (1999)
  15. Lemke-Rust, K., Paar, C.: Gaussian mixture models for higher-order side channel analysis. In: Paillier, P., Verbauwhede, I. (eds.) CHES 2007. LNCS, vol. 4727, pp. 14–27. Springer, Heidelberg (2007)
  16. Dempster, A.P., Laird, N.M., Rubin, D.B.: Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society, Series B (Methodological) 39(1), 1–38 (1977)
  17. Mallows, C.L.: A note on asymptotic joint normality. The Annals of Mathematical Statistics 43(2), 508–515 (1972)
  18. Messerges, T.S.: Using second-order power analysis to attack DPA resistant software. In: Paar, C., Koç, Ç.K. (eds.) CHES 2000. LNCS, vol. 1965, pp. 238–251. Springer, Heidelberg (2000)
  19. Prouff, E., Rivain, M.: Theoretical and practical aspects of mutual information based side channel analysis. In: Abdalla, M., Pointcheval, D., Fouque, P.-A., Vergnaud, D. (eds.) ACNS 2009. LNCS, vol. 5536, pp. 499–518. Springer, Heidelberg (2009)
  20. Scott, D.W.: On optimal and data-based histograms. Biometrika 66(3), 605–610 (1979)
  21. Silverman, B.W.: Density Estimation for Statistics and Data Analysis. Chapman & Hall/CRC, Boca Raton (1986)
  22. Standaert, F.-X., Malkin, T.G., Yung, M.: A unified framework for the analysis of side-channel key recovery attacks (extended version). Cryptology ePrint Archive, Report 2006/139 (2006), http://eprint.iacr.org/
  23. Turlach, B.A.: Bandwidth selection in kernel density estimation: a review. Discussion paper, CORE and Institut de Statistique (1993)
  24. Zhang, M.H., Cheng, Q.S.: Determine the number of components in a mixture model by the extended KS test. Pattern Recognition Letters 25(2), 211–216 (2004)

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Nicolas Veyrat-Charvillon (1)
  • François-Xavier Standaert (1)
  1. UCL Crypto Group, Université catholique de Louvain, Louvain-la-Neuve, Belgium
