International Journal of Social Robotics, Volume 5, Issue 1, pp 35–51

Perception and Generation of Affective Hand Movements

  • Ali-Akbar Samadani
  • Eric Kubica
  • Rob Gorbet
  • Dana Kulić

Abstract

Perception and generation of affective movements are essential for achieving the expressivity required for a fully engaging human-machine interaction. This paper develops a computational model for recognizing and generating affective hand movements for display on anthropomorphic and non-anthropomorphic structures. First, the time-series features of these movements are aligned and converted to fixed-length vectors using piecewise linear resampling. Next, a feature transformation best capable of discriminating between the affective movements is obtained using functional principal component analysis (FPCA). The resulting low-dimensional feature transformation is used for classification and regeneration. A dataset consisting of a single movement type, closing and opening the hand, is considered in this study. Three expressions (sadness, happiness, and anger) were conveyed by a demonstrator through the same general movement. The performance of the developed model is evaluated objectively using leave-one-out cross-validation and subjectively through a user study in which participants evaluated the regenerated affective movements as well as the original affective movements, each reproduced both on a human-like model and on a non-anthropomorphic structure. The proposed approach achieves zero leave-one-out cross-validation errors on both the training and testing sets. No significant difference is observed between participants' evaluations of the regenerated and the original movements, confirming successful regeneration of the affective movements. Furthermore, a significant effect of structure on the perception of affective movements is observed.
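The pipeline summarized above (fixed-length resampling, an FPCA feature transformation, classification, and regeneration from the low-dimensional scores) can be illustrated with a short sketch. The code below is not the authors' implementation, which uses a basis-expansion functional-data formulation; it is a minimal NumPy approximation in which FPCA is computed on discretized curves via SVD, and all function names and parameters are illustrative.

```python
import numpy as np

def resample_piecewise_linear(trajectory, n_samples=200):
    """Resample a (T, n_channels) movement to (n_samples, n_channels)
    by piecewise linear interpolation over a normalized time axis."""
    trajectory = np.asarray(trajectory, dtype=float)
    t_old = np.linspace(0.0, 1.0, len(trajectory))
    t_new = np.linspace(0.0, 1.0, n_samples)
    return np.column_stack([np.interp(t_new, t_old, ch) for ch in trajectory.T])

def fit_fpca(curves, n_components=3):
    """Discretized functional PCA. `curves` is (n_movements, n_samples),
    one flattened, resampled movement per row. Returns the mean curve,
    the principal component curves, and the per-movement scores."""
    mean = curves.mean(axis=0)
    centered = curves - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    components = vt[:n_components]      # principal component curves (rows)
    scores = centered @ components.T    # low-dimensional features
    return mean, components, scores

def regenerate(mean, components, scores):
    """Reconstruct movements from their FPCA scores (generation step)."""
    return mean + scores @ components

def loocv_error(scores, labels):
    """Leave-one-out nearest-neighbour classification error in score space.
    (A full evaluation would refit the FPCA transform in every fold.)"""
    labels = np.asarray(labels)
    errors = 0
    for i in range(len(scores)):
        d = np.linalg.norm(scores - scores[i], axis=1)
        d[i] = np.inf                   # hold out movement i
        errors += labels[np.argmin(d)] != labels[i]
    return errors / len(scores)
```

In this sketch, each recorded hand open/close movement would be resampled, flattened across its joint-angle channels into one row of `curves`, mapped to FPCA scores for classification, and regenerated from those scores for display on either structure.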

Keywords

Affective hand movements · Affective computing · Functional data analysis · Feature extraction · Human affect perception · Affective movement generation


Copyright information

© Springer Science+Business Media Dordrecht 2012

Authors and Affiliations

  • Ali-Akbar Samadani (1)
  • Eric Kubica (2)
  • Rob Gorbet (3)
  • Dana Kulić (1)

  1. Department of Electrical and Computer Engineering, University of Waterloo, Waterloo, Canada
  2. Department of Systems Design Engineering, University of Waterloo, Waterloo, Canada
  3. Centre for Knowledge Integration, University of Waterloo, Waterloo, Canada
