
Modeling the Interactions of Context and Style on Affect in Motion Perception: Stylized Gaits Across Multiple Environmental Contexts

  • Madison Heimerdinger
  • Amy LaViers

Abstract

As more and more robots move into social settings, humans will monitor the external motion profiles of these robotic counterparts in order to make judgments about their internal states. This means that generating motion with an understanding of how humans will interpret it is paramount. This paper investigates the connection between environmental context, stylized gaits, and perception via a model of affect parameterized by valence and arousal. The proposed predictive model indicates that, for the motion stimuli used, environmental context has a larger influence on valence and style of walking has a larger influence on arousal. This work expands on previous research in affect recognition by exploring the critical relationship between environmental context, stylized gait, and affective perception. The results indicate that the social behavior of robots may be informed by environmental context for improved performance.
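To make the valence-arousal framing concrete, the sketch below fits a separate linear predictor for each affect dimension from dummy-coded context and style factors, then compares the spread of each factor's coefficients per dimension. This is a minimal illustration only, not the authors' model: the factor levels (four hypothetical contexts, three hypothetical gait styles) and all ratings are fabricated placeholders.

```python
# Illustrative sketch only: a linear valence/arousal predictor over
# dummy-coded context and style factors, fit by ordinary least squares.
# Factor levels and ratings below are fabricated, not the paper's data.
import numpy as np

rng = np.random.default_rng(0)
n = 200
context = rng.integers(0, 4, size=n)  # hypothetical: 4 environmental contexts
style = rng.integers(0, 3, size=n)    # hypothetical: 3 stylized gaits

# Design matrix: intercept plus dummy codes with one baseline level dropped.
X = np.column_stack([
    np.ones(n),
    np.eye(4)[context][:, 1:],  # 3 context columns
    np.eye(3)[style][:, 1:],    # 2 style columns
])

# Placeholder affect ratings on a [-1, 1] scale for each dimension.
valence = rng.uniform(-1, 1, size=n)
arousal = rng.uniform(-1, 1, size=n)

# Separate least-squares fit per affect dimension.
beta_v, *_ = np.linalg.lstsq(X, valence, rcond=None)
beta_a, *_ = np.linalg.lstsq(X, arousal, rcond=None)

# A factor's influence can be summarized by its coefficient range (peak to peak).
print("context coefficient spread, valence:", np.ptp(beta_v[1:4]))
print("style coefficient spread, arousal:  ", np.ptp(beta_a[4:6]))
```

Under this kind of analysis, the paper's finding would appear as context coefficients dominating the valence fit while style coefficients dominate the arousal fit.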

Keywords

Affect · Style · Gait · Environment · Context · HRI

Notes

Acknowledgements

This work was conducted under IRB #17697 and supported by NSF Grants #1701295 and #1528036. Training activities funded by DARPA Grant #D16AP00001 and led by Catherine Maguire, Catie Cuan, and Riley Watts were critical to the paper. The authors also thank Lisa LaViers, a researcher in the Accounting Department at Emory University, for help in designing the implementation of the third study.

Compliance with Ethical Standards

Conflict of interest

A. LaViers owns stock in AE Machines, an automation software company.


Copyright information

© Springer Nature B.V. 2019

Authors and Affiliations

  1. Mechanical Science and Engineering Department, University of Illinois Urbana-Champaign, Urbana, USA
