Human-Computer Interaction – INTERACT 2015, pp. 239–258

It’s Not the Way You Look, It’s How You Move: Validating a General Scheme for Robot Affective Behaviour

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9298)

Abstract

In the emerging world of human-robot interaction, people and robots will work together to achieve joint objectives. This paper discusses the design and validation of a general scheme for creating emotionally expressive behaviours for robots, so that people might better interpret how a robot collaborator is succeeding or failing in its work. It exemplifies a unified approach to creating robot behaviours for two very different robot forms, based on combinations of four groups of design parameters (approach/avoidance, energy, intensity and frequency). Fifty-nine people rated video clips of robots performing expressive behaviours, both for emotional expressivity on the Valence-Arousal-Dominance dimensions and for the judged successfulness of the robots' work. Results are discussed in terms of the utility of expressive behaviour for facilitating human understanding of robot intentions and the design of cues for basic emotional states.

Keywords

Human-robot interaction · Social robotics · Nonverbal communication · Artificial emotions · Body language


Copyright information

© IFIP International Federation for Information Processing 2015

Authors and Affiliations

1. Department of Computer Science, University of Bath, Bath, UK
