
Differentially Private Precision Matrix Estimation

Published in Acta Mathematica Sinica, English Series

Abstract

In this paper, we study precision matrix estimation when the dataset contains sensitive information. Within the differential privacy framework, we develop a differentially private ridge estimator by perturbing the sample covariance matrix, and then a differentially private graphical lasso estimator based on the alternating direction method of multipliers (ADMM) algorithm. Furthermore, we prove that the differentially private ridge estimator of the precision matrix is consistent under fixed-dimension asymptotics, and we establish a convergence rate for the differentially private graphical lasso estimator in the Frobenius norm as both the data dimension p and the sample size n are allowed to grow. Empirical results demonstrating the utility of the proposed methods are also provided.
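As a rough illustration of the two mechanisms the abstract describes, the sketch below perturbs the sample covariance with symmetric Gaussian noise and then either ridge-inverts it or feeds it to a standard ADMM graphical-lasso solver (the updates follow Boyd et al., reference 5). All function names, the row-norm assumption, and the noise calibration (the usual sqrt(2 log(1.25/δ))/ε Gaussian-mechanism scale with sensitivity 2/n) are illustrative assumptions, not the paper's exact algorithm or constants.

```python
import numpy as np

def perturb_covariance(X, epsilon, delta, rng):
    """Gaussian-mechanism perturbation of the sample covariance.

    Assumes each row of X has l2 norm at most 1, so replacing one row
    changes X.T @ X / n by at most 2/n in Frobenius norm.
    """
    n, p = X.shape
    S = X.T @ X / n  # sample covariance (data assumed centered)
    sigma = (2.0 / n) * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    E = rng.normal(0.0, sigma, size=(p, p))
    return S + (E + E.T) / np.sqrt(2.0)  # symmetrized noise

def dp_ridge_precision(X, epsilon, delta, lam, rng=None):
    """Ridge-type private precision estimate: invert the perturbed
    covariance after shifting its eigenvalues up by lam."""
    rng = np.random.default_rng() if rng is None else rng
    p = X.shape[1]
    S_priv = perturb_covariance(X, epsilon, delta, rng)
    return np.linalg.inv(S_priv + lam * np.eye(p))

def dp_graphical_lasso(X, epsilon, delta, lam, rho=1.0, n_iter=100, rng=None):
    """Private graphical lasso: run a standard ADMM solver on the
    perturbed covariance (one-shot input perturbation)."""
    rng = np.random.default_rng() if rng is None else rng
    p = X.shape[1]
    S = perturb_covariance(X, epsilon, delta, rng)
    Theta, Z, U = np.eye(p), np.eye(p), np.zeros((p, p))
    for _ in range(n_iter):
        # Theta update has a closed form via eigendecomposition.
        w, Q = np.linalg.eigh(rho * (Z - U) - S)
        Theta = Q @ np.diag((w + np.sqrt(w**2 + 4.0 * rho)) / (2.0 * rho)) @ Q.T
        # Z update: elementwise soft-thresholding (lasso proximal step).
        A = Theta + U
        Z = np.sign(A) * np.maximum(np.abs(A) - lam / rho, 0.0)
        U = U + Theta - Z  # dual update
    return Z
```

Because the noise is added once to the covariance, both downstream solvers inherit (ε, δ)-differential privacy by post-processing; the ridge shift lam also restores positive definiteness that the perturbation may destroy.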


References

  1. Abadi, M., Chu, A., Goodfellow, I., et al.: Deep learning with differential privacy. In: Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security, 308–318, 2016

  2. Banerjee, O., El Ghaoui, L., d'Aspremont, A.: Model selection through sparse maximum likelihood estimation for multivariate Gaussian or binary data. J. Mach. Learn. Res., 9, 485–516 (2008)


  3. Bickel, P. J., Levina, E.: Regularized estimation of large covariance matrices. Ann. Statist., 36(1), 199–227 (2008)


  4. Bien, J., Tibshirani, R.: Sparse estimation of a covariance matrix. Biometrika, 98(4), 807–820 (2011)


  5. Boyd, S., Parikh, N., Chu, E., et al.: Distributed optimization and statistical learning via the alternating direction method of multipliers. Found. Trends Mach. Learn., 3(1), 1–122 (2011)


  6. Cai, T., Liu, W. D., Luo, X.: A constrained l1 minimization approach to sparse precision matrix estimation. J. Amer. Statist. Assoc., 106(494), 594–607 (2011)


  7. Chaudhuri, K., Monteleoni, C., Sarwate, A. D.: Differentially private empirical risk minimization. J. Mach. Learn. Res., 12, 1069–1109 (2011)


  8. Combettes, P. L., Wajs, V. R.: Signal recovery by proximal forward-backward splitting. Multiscale Model. Simul., 4(4), 1168–1200 (2005)


  9. Dua, D., Graff, C.: UCI Machine Learning Repository, 2017

  10. Deng, X. W., Tsui, K.: Penalized covariance matrix estimation using a matrix-logarithm transformation. J. Comput. Graph. Statist., 22(2), 494–512 (2013)


  11. Dwork, C., McSherry, F., Nissim, K., et al.: Calibrating noise to sensitivity in private data analysis. In: Theory of Cryptography Conference, Springer, Berlin, Heidelberg, 2006


  12. Dwork, C., Roth, A.: The algorithmic foundations of differential privacy. Found. Trends Theor. Comput. Sci., 9(3–4), 211–407 (2014)


  13. Dwork, C., Talwar, K., Thakurta, A., et al.: Analyze Gauss: optimal bounds for privacy-preserving principal component analysis. In: Proceedings of the Forty-sixth Annual ACM Symposium on Theory of Computing, ACM, 11–20, 2014

  14. Eckstein, J., Bertsekas, D. P.: On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators. Math. Program., 55(1–3), 293–318 (1992)


  15. Friedman, J., Hastie, T., Tibshirani, R.: Sparse inverse covariance estimation with the graphical lasso. Biostatistics, 9(3), 432–441 (2008)


  16. Gabay, D., Mercier, B.: A dual algorithm for the solution of nonlinear variational problems via finite element approximation. Comput. Math. Appl., 2, 17–40 (1976)


  17. Guillot, D., Rajaratnam, B., Rolfs, B., et al.: Iterative thresholding algorithm for sparse inverse covariance estimation. In: Advances in Neural Information Processing Systems, 2012

  18. Kaplan, H., Stemmer, U.: Differentially private k-means with constant multiplicative error. In: Advances in Neural Information Processing Systems, 2018

  19. Kuismin, M. O., Kemppainen, J. T., Sillanpää, M. J.: Precision matrix estimation with ROPE. J. Comput. Graph. Statist., 26(3), 682–694 (2017)


  20. Lauritzen, S. L.: Graphical Models, Clarendon Press, 1996

  21. Ledoit, O., Wolf, M.: A well-conditioned estimator for large-dimensional covariance matrices. J. Multivariate Anal., 88(2), 365–411 (2004)


  22. Lions, P. L., Mercier, B.: Splitting algorithms for the sum of two nonlinear operators. SIAM J. Numer. Anal., 16(6), 964–979 (1979)


  23. Liu, W. D., Luo, X.: Fast and adaptive sparse precision matrix estimation in high dimensions. J. Multivariate Anal., 135, 153–162 (2015)


  24. Martin, N., Maes, H.: Multivariate Analysis, Academic Press, London, 1979


  25. Rothman, A. J., Bickel, P. J., Levina, E., et al.: Sparse permutation invariant covariance estimation. Electron. J. Stat., 2, 494–515 (2008)


  26. Sachs, K., Perez, O., Pe’er, D., et al.: Causal protein-signaling networks derived from multiparameter single-cell data. Science, 308(5721), 523–529 (2005)


  27. van Wieringen, W. N., Peeters, C. F. W.: Ridge estimation of inverse covariance matrices from high-dimensional data. Comput. Statist. Data Anal., 103, 284–303 (2016)


  28. Wang, D., Huai, M. D., Xu, J. H.: Differentially private sparse inverse covariance estimation. In: 2018 IEEE Global Conference on Signal and Information Processing (GlobalSIP), IEEE, 2018

  29. Warton, D. I.: Penalized normal likelihood and ridge regularization of correlation and covariance matrices. J. Amer. Statist. Assoc., 103(481), 340–349 (2008)


  30. Yuan, M., Lin, Y.: Model selection and estimation in the Gaussian graphical model. Biometrika, 94(1), 19–35 (2007)


  31. Yuan, T., Wang, J. H.: A coordinate descent algorithm for sparse positive definite matrix estimation. Stat. Anal. Data Min., 6(5), 431–442 (2013)



Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Hai Zhang.

Additional information

Supported by the National Natural Science Foundation of China (Grant Nos. 11571011 and U1811461), the Open Research Fund of KLATASDS-MOE, and the Fundamental Research Funds for the Central Universities


About this article


Cite this article

Su, W.Q., Guo, X. & Zhang, H. Differentially Private Precision Matrix Estimation. Acta Math. Sin.-English Ser. 36, 1107–1124 (2020). https://doi.org/10.1007/s10114-020-9370-9


