Journal of Computer Science and Technology, Volume 34, Issue 2, pp. 272–286

Improving Data Utility Through Game Theory in Personalized Differential Privacy

  • Lei Cui
  • Youyang Qu
  • Mohammad Reza Nosouhi
  • Shui Yu
  • Jian-Wei Niu
  • Gang Xie (corresponding author)
Regular Paper


Owing to the dramatic increase in the amount of information published in social networks, privacy issues have raised public concern. Although differential privacy provides privacy protection with theoretical foundations, the trade-off between privacy and data utility still demands improvement. Moreover, most existing studies do not consider the quantitative impact of the adversary when measuring data utility. In this paper, we first propose a personalized differential privacy method based on social distance. Then, we analyze the maximum data utility when users and adversaries are blind to each other's strategy sets. We formalize all the payoff functions in the differential privacy sense, and then establish a static Bayesian game. The trade-off is calculated by deriving the Bayesian Nash equilibrium with a modified reinforcement learning algorithm. The proposed method achieves fast convergence by reducing the cardinality from n to 2. In addition, the in-place trade-off can maximize the user's data utility if the action sets of the user and the adversary are public while the strategy sets remain unrevealed. Extensive experiments on a real-world dataset demonstrate that the proposed model is effective and feasible.
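The full mechanism in the paper rests on a Bayesian game and a modified reinforcement learning algorithm, which the abstract only summarizes. As a minimal illustrative sketch of the personalization idea alone (not the authors' algorithm), the standard Laplace mechanism of differential privacy can be parameterized by a per-user privacy budget derived from social distance: closer social contacts get a smaller epsilon (stronger protection), more distant ones a larger one. The linear mapping in `personalized_epsilon` and all function names below are hypothetical.

```python
import math
import random


def laplace_noise(scale):
    # Draw one sample from Laplace(0, scale) via inverse-CDF sampling.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))


def personalized_epsilon(base_epsilon, social_distance):
    # Hypothetical mapping: a small social distance (close contact) yields a
    # small epsilon, i.e., stronger protection; a large distance relaxes it.
    return base_epsilon * social_distance


def perturb(value, sensitivity, base_epsilon, social_distance):
    # Standard Laplace mechanism: noise scale = sensitivity / epsilon,
    # with epsilon chosen per user from the social distance.
    eps = personalized_epsilon(base_epsilon, social_distance)
    return value + laplace_noise(sensitivity / eps)
```

For instance, with `base_epsilon = 1.0`, a query answer released to a contact at social distance 1 is noisier (epsilon 1.0) than the same answer released at distance 3 (epsilon 3.0). The game-theoretic part of the paper then tunes this privacy/utility trade-off against the adversary's strategy.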


Keywords: personalized privacy protection; game theory; trade-off; reinforcement learning



Supplementary material

ESM 1: 11390_2019_1910_MOESM1_ESM.pdf (145 kb)



Copyright information

© Springer Science+Business Media, LLC & Science Press, China 2019

Authors and Affiliations

  • Lei Cui 1,2
  • Youyang Qu 2
  • Mohammad Reza Nosouhi 3
  • Shui Yu 3
  • Jian-Wei Niu 4
  • Gang Xie 1,5 (corresponding author)

  1. College of Information and Computer, Taiyuan University of Technology, Taiyuan, China
  2. School of Information Technology, Deakin University, Melbourne, Australia
  3. School of Software, University of Technology Sydney, Sydney, Australia
  4. School of Computer Science and Engineering, Beihang University, Beijing, China
  5. Shanxi Key Laboratory of Advanced Control and Intelligent Information System, Taiyuan University of Science and Technology, Taiyuan, China
