To Tell the Truth: Virtual Agents and Morning Morality

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10498)

Abstract

This paper investigates the impact of time of day on truthfulness in human-agent interactions. Time of day has important implications for moral behavior in human-human interaction: the morning morality effect shows that people are more likely to act ethically (i.e., tell fewer lies) in the morning than in the afternoon. Based on previous work on disclosure and virtual agents, we propose that this effect will not bear out in human-agent interactions. A preliminary evaluation of multi-issue bargaining tasks with the Conflict Resolution Agent, a semi-automated virtual human, shows that in the afternoon, individuals who lie tell more lies to human negotiation partners than to virtual agent partners, and that participants who believe they are negotiating with a human are more likely to lie in the afternoon than in the morning. Time of day has no significant effect on the number of lies told to the virtual agent during the multi-issue bargaining task.

Keywords

Morning morality · Virtual humans · Honest responding · Multi-issue bargaining



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Institute for Creative Technologies, University of Southern California, Los Angeles, USA
