Understanding the Role of Privacy and Trust in Intelligent Personal Assistant Adoption

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11420)


Voice-controlled intelligent personal assistants (IPAs) have seen tremendous growth in recent years, both on smartphones and as standalone devices in people’s homes. While research has examined the potential benefits and drawbacks of these devices for IPA users, few studies have empirically evaluated the role of privacy and trust in individuals’ decisions to adopt IPAs. In this study, we present findings from a survey of IPA users and non-users (N = 1160) to understand (1) the motivations and barriers to adopting IPAs and (2) how concerns about data privacy and trust in companies’ compliance with the social contract surrounding IPA data affect the acceptance and use of IPAs. We discuss our findings in light of social contract theory and frameworks of technology acceptance.


Keywords: Intelligent personal assistant · Internet of Things · Privacy · Technology adoption · Amazon Alexa · Google Home · Social contract theory



This publication is based upon work supported by the National Science Foundation under grants No. 1640640 and 1640697.



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. University of Maryland, College Park, USA
  2. University of Wisconsin—Milwaukee, Milwaukee, USA
