
Bootstrapping Online Trust: Timeline Activity Proofs

  • Constantin Cătălin Drăgan
  • Mark Manulis
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11025)

Abstract

Establishing initial trust between a new user and an online service is typically facilitated by centralized social media platforms, e.g., Facebook or Google: users prove their “trustworthiness” to a new service through their social profiles, and the service applies some verification policy to the information it retrieves from those profiles. Typically, only static information, e.g., name, age, contact details, or number of friends, is used to establish initial trust. However, such information provides only weak trust guarantees, as (malicious) users can trivially create new profiles and quickly populate them with static data to convince the new service.

We argue that the way profiles are used over (longer) periods of time should play a more prominent role in initial trust establishment. Intuitively, in addition to static data, verification policies could check whether a profile is used on a regular basis and has a convincing footprint of activities over various periods of time before it is perceived as more trustworthy.
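
As a toy illustration, such a policy could be expressed as a simple check over activity timestamps. The following Python sketch uses made-up parameters (at least four activities in every full 30-day period of the past year) and is not part of the construction described in the paper:

from datetime import date, timedelta

def satisfies_footprint_policy(activity_dates, window_days=365,
                               period_days=30, min_per_period=4):
    # Hypothetical policy: require at least `min_per_period` activities in
    # every full `period_days`-day period within the last `window_days` days.
    today = date.today()
    period_start = today - timedelta(days=window_days)
    while period_start + timedelta(days=period_days) <= today:
        period_end = period_start + timedelta(days=period_days)
        count = sum(1 for d in activity_dates if period_start <= d < period_end)
        if count < min_per_period:
            return False
        period_start = period_end
    return True

# A profile with one activity roughly every 5 days over the past year passes.
profile = [date.today() - timedelta(days=i) for i in range(0, 365, 5)]
print(satisfies_footprint_policy(profile))  # True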

In this paper, we introduce Timeline Activity Proofs (TAP) as a new trust factor. TAP allows online users to manage their timeline activities in a privacy-preserving way and to use them to bootstrap online trust, e.g., as part of registration with a new service. Our model does not rely on any centralized social media platform. Instead, users are given full control over the activities that they wish to use as part of TAP proofs. A distributed public ledger provides the crucial integrity guarantee that activities cannot be tampered with retrospectively. Our TAP construction adopts standard cryptographic techniques to enable authorized access to encrypted activities of a user for the purpose of policy verification, and is proven to provide data confidentiality, protecting the privacy of the user’s activities, and authenticated policy compliance, protecting verifiers from users who cannot show the required footprint of past activities.
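
For intuition only, the following Python sketch illustrates that workflow at a very high level; it is not the construction from the paper. The ledger is simulated by an in-memory list, Fernet symmetric encryption (from the third-party cryptography package) stands in for the scheme’s actual cryptographic techniques, and all names and parameters are illustrative:

import hashlib
import json
from datetime import date
from cryptography.fernet import Fernet  # third-party 'cryptography' package

ledger = []  # stand-in for a distributed public ledger

def record_activity(fernet, activity):
    # User side: encrypt an activity, keep the ciphertext off-chain,
    # and anchor its hash on the ledger so it cannot be altered later.
    ciphertext = fernet.encrypt(json.dumps(activity).encode())
    ledger.append(hashlib.sha256(ciphertext).hexdigest())
    return ciphertext

def verify_activity(key, ciphertext):
    # Verifier side: check the ledger anchor, then decrypt the activity
    # with the key the user disclosed for policy verification.
    if hashlib.sha256(ciphertext).hexdigest() not in ledger:
        raise ValueError("activity is not anchored on the ledger")
    return json.loads(Fernet(key).decrypt(ciphertext).decode())

# The user records encrypted activities over several months...
key = Fernet.generate_key()
fernet = Fernet(key)
stored = [record_activity(fernet, {"date": str(date(2018, m, 1)), "type": "post"})
          for m in range(1, 7)]

# ...and, when registering with a new service, authorizes the verifier by
# disclosing the key; the verifier checks integrity and applies its policy.
activities = [verify_activity(key, c) for c in stored]
print(len(activities), "activities verified")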

Notes

Acknowledgements

Constantin Cătălin Drăgan and Mark Manulis were supported by the EPSRC project TAPESTRY (EP/N02799X).


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Surrey Centre for Cyber Security, University of Surrey, Guildford, UK
