Prestige Questions, Online Agents, and Gender-Driven Differences in Disclosure
This work considers the use of virtual agents to encourage the disclosure of sensitive information. In particular, this research used “prestige questions,” which asked participants to disclose information relevant to their socioeconomic status, such as their credit limit, university attendance, and the mortgage or rent payment they could afford. We explored the potential for agents to enhance disclosure relative to conventional web forms, given their ability to serve as relational agents that build rapport. To test this possibility, agents were framed either as artificially intelligent or as avatars controlled by a real human, and both conditions were compared against a version of the financial questionnaire with no agent. In this way, both the perceived agency of the agent and its ability to generate rapport were tested. Additionally, we examined differences in disclosure between men and women across these conditions. Analyses revealed that agents (whether AI- or human-framed) evoked greater disclosure than the no-agent condition. However, there was some evidence that human-framed agents also evoked more lying. Thus, users in general responded more socially to the presence of a human- or AI-framed agent, making both the benefits and the costs of this approach apparent. The results are discussed in terms of rapport and anonymity.
Keywords: Virtual agents · Human-agent experimentation · Disclosure