International Journal of Social Robotics

Volume 4, Issue 2, pp 163–180

The Role of Affective Touch in Human-Robot Interaction: Human Intent and Expectations in Touching the Haptic Creature

Abstract

Affective touch is a crucial element of early human development, social bonding, and emotional support. Because it is technically and socially difficult to study, it has received little research attention. Our approach employs animal models instantiated by the Haptic Creature, a touch-centric social robot. In this paper, we examine how humans communicate emotional state to the Haptic Creature through touch, and what reactions they expect from it. We present a user study in which participants selected and performed the gestures they would most likely use to convey nine different emotions to the Haptic Creature. We report the touch dictionary compiled for our research, the gestures participants chose from it, and a video analysis of their enactment. Our principal findings concern patterns of gesture use for emotional expression; physical properties of the likely gestures; participants' expectation that the Haptic Creature's response would mirror the emotion communicated; and an analysis of the human's higher-level intent in communication. From the latter finding, we propose five tentative categories of intent that overlap with emotion states: protective, comforting, restful, affectionate, and playful. These results can inform the future design of social robots by illuminating the details of one direction in affective touch interaction.

Keywords

Affective touch · Socially interactive robots · Affect display · Human-robot interaction (HRI) · Affect · Haptics · Emotion · Touch · Robot pets

Copyright information

© Springer Science & Business Media BV 2011

Authors and Affiliations

1. SPIN Research Group, Department of Computer Science, University of British Columbia, Vancouver, Canada