Take It or Leave It: Running a Survey When Privacy Comes at a Cost
In this paper, we consider the problem of estimating a potentially sensitive (individually stigmatizing) statistic on a population. In our model, individuals are concerned about their privacy, and experience some cost as a function of their privacy loss. Nevertheless, they would be willing to participate in the survey if they were compensated for their privacy cost. These cost functions are not publicly known, however, nor do we make Bayesian assumptions about their form or distribution. Individuals are rational and will misreport their costs for privacy if doing so is in their best interest. Ghosh and Roth recently showed that in this setting, when costs for privacy loss may be correlated with private types, and individuals value differential privacy, no individually rational direct revelation mechanism can compute any non-trivial estimate of the population statistic. In this paper, we circumvent this impossibility result by proposing a modified notion of how individuals experience cost as a function of their privacy loss, and by giving a mechanism that does not operate by direct revelation. Instead, our mechanism randomly approaches individuals from a population and makes each of them a take-it-or-leave-it offer. This is intended to model the abilities of a surveyor who may stand on a street corner and approach passers-by.
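As a rough illustration of the survey model described above (not the paper's actual mechanism or analysis), the following sketch randomly approaches a subsample of the population with a fixed take-it-or-leave-it payment, lets individuals whose private cost is below the offer participate, and releases the participants' count with Laplace noise for differential privacy. The dictionary fields `cost` and `bit`, and all parameter names, are illustrative assumptions.

```python
import random


def take_it_or_leave_it_survey(population, m, offer, epsilon):
    """Toy sketch of a take-it-or-leave-it survey mechanism.

    population: list of dicts with a private participation cost ('cost')
                and a sensitive bit ('bit') -- both names are illustrative.
    m:          number of individuals approached uniformly at random.
    offer:      fixed payment offered to each approached individual.
    epsilon:    differential-privacy parameter for the released count.
    """
    # Randomly approach m individuals, as a street-corner surveyor might.
    approached = random.sample(population, m)
    # An individual accepts the offer iff it covers her privacy cost.
    participants = [p for p in approached if p["cost"] <= offer]
    true_count = sum(p["bit"] for p in participants)
    # Laplace(1/epsilon) noise, generated as a difference of two
    # exponentials with rate epsilon.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise, len(participants)


# Example: four individuals, an offer of 1.0, privacy parameter 1.0.
pop = [{"cost": 0.1, "bit": 1}, {"cost": 0.5, "bit": 0},
       {"cost": 2.0, "bit": 1}, {"cost": 0.3, "bit": 1}]
noisy_count, n_participants = take_it_or_leave_it_survey(
    pop, m=len(pop), offer=1.0, epsilon=1.0)
```

Individuals with cost above the offer simply decline, which is what creates the sample-selection issue (cf. Heckman [1]) that the paper's pricing scheme must handle.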
Keywords: Private Data · Laplace Distribution · Privacy Cost · Participation Decision · Differential Privacy
- 1. Heckman, J.: Sample selection bias as a specification error. Econometrica: Journal of the Econometric Society, 153–161 (1979)
- 2. Ghosh, A., Roth, A.: Selling privacy at auction. In: EC 2011: Proceedings of the 12th ACM Conference on Electronic Commerce, pp. 199–208. ACM (2011)
- 3. McSherry, F., Talwar, K.: Mechanism design via differential privacy. In: Proceedings of the 48th Annual Symposium on Foundations of Computer Science (2007)
- 4. Chen, Y., Chong, S., Kash, I., Moran, T., Vadhan, S.: Truthful mechanisms for agents that value privacy. ArXiv preprint arXiv:1111.5472 (2011)
- 7. Gupta, A., Ligett, K., McSherry, F., Roth, A., Talwar, K.: Differentially private combinatorial optimization. In: Proceedings of the ACM-SIAM Symposium on Discrete Algorithms (2010)
- 8. Nissim, K., Smorodinsky, R., Tennenholtz, M.: Approximately optimal mechanism design via differential privacy. In: ITCS 2012: Proceedings of the 3rd Innovations in Computer Science Conference (2012)
- 9. Xiao, D.: Is privacy compatible with truthfulness? Manuscript (2011)
- 10. Nissim, K., Orlandi, C., Smorodinsky, R.: Privacy-aware mechanism design. In: ACM Conference on Electronic Commerce, pp. 774–789 (2012)
- 11. Huang, Z., Kannan, S.: The exponential mechanism for social welfare: Private, truthful, and nearly optimal. In: FOCS 2012 (2012)
- 12. Feigenbaum, J., Jaggard, A., Schapira, M.: Approximate privacy: foundations and quantification. In: Proceedings of the 11th ACM Conference on Electronic Commerce, pp. 167–178. ACM (2010)
- 13. Fleischer, L., Lyu, Y.H.: Approximately optimal auctions for selling privacy when costs are correlated with data. In: ACM Conference on Electronic Commerce, pp. 568–585 (2012)
- 14. Roth, A., Schoenebeck, G.: Conducting truthful surveys, cheaply. In: ACM Conference on Electronic Commerce, pp. 826–843 (2012)