
Imitation of Human Motion by Low Degree-of-Freedom Simulated Robots and Human Preference for Mappings Driven by Spinal, Arm, and Leg Activity

  • Roshni Kaushik
  • Amy LaViers

Abstract

Robots cannot exactly replicate human motion, especially low degree-of-freedom (DOF) robots, but perceptual imitation has been accomplished. Nevertheless, the many possible mappings between human and robot bodies raise the question of which aspects of human motion a robot should preserve. In this vein, this paper presents a methodology for mapping human motion capture data to the motion of a low-DOF simulated robot; further, empirical experiments conducted on Amazon Mechanical Turk illuminate human preference across several such mappings. Users preferred motion-capture-driven robot motion over artificially generated robot motion, suggesting that imitation was successfully accomplished by the proposed mappings. Moreover, one mapping, based on the leaning of the spine, was preferred over arm- and leg-based mappings by a significant subgroup of respondents who were both loyal to that mapping across multiple stimuli and more engaged in the survey than other respondents. These results reconfirm the ability of simple robots to imitate human behavior and indicate that monitoring human spinal activity may be especially useful in this pursuit. Parallel work in psychology and human behavior analysis suggests that successful imitation of the motion of human counterparts is necessary for robots to integrate into human-facing environments.
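To make the mapping idea above concrete, the following is a minimal illustrative sketch, in Python, of how a spinal lean measured from two motion capture markers could drive a single "lean" degree of freedom of a simulated robot. The marker names, coordinate frame, joint limits, and linear rescaling are assumptions made for illustration only; they are not the mappings used in the paper.

    # Illustrative assumptions: z-up, x-forward mocap frame; two markers (pelvis, chest);
    # the robot has one symmetric "lean" joint with limits of +/- robot_max radians.
    # This is NOT the paper's mapping, only a hypothetical example of the general idea.
    import numpy as np

    def signed_spine_lean(pelvis_xyz: np.ndarray, chest_xyz: np.ndarray) -> float:
        """Signed sagittal lean (radians): positive when the chest is forward of the pelvis."""
        spine = chest_xyz - pelvis_xyz
        return float(np.arctan2(spine[0], spine[2]))  # angle of the spine vector from vertical

    def lean_to_robot_command(theta: float,
                              human_max: float = np.radians(45.0),
                              robot_max: float = 0.5) -> float:
        """Linearly rescale the human lean into the robot's symmetric joint range."""
        return float(np.clip(theta / human_max, -1.0, 1.0) * robot_max)

    # One motion capture frame (metres): chest slightly forward of the pelvis.
    pelvis = np.array([0.00, 0.00, 1.00])
    chest = np.array([0.10, 0.00, 1.45])
    print(lean_to_robot_command(signed_spine_lean(pelvis, chest)))  # ~0.14 rad lean command

Arm- or leg-driven mappings of the kind compared in the study could, in the same spirit, replace the spinal lean feature with a comparable scalar derived from arm or leg markers.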

Keywords

Imitation · Human preference · Mobile robots · Motion capture

Notes

Acknowledgements

We would like to thank Erin Berl for providing the human movement used to generate the mappings for this study.

Funding

This work was supported by DARPA Grant #D16AP00001.

Compliance with Ethical Standards

Conflict of interest

Amy LaViers owns stock in AE Machines, Inc. and caali, LLC.

Research Involving Human Participants and/or Animals

Studies with human subjects were governed by IRB #16225.


Copyright information

© Springer Nature B.V. 2019

Authors and Affiliations

  1. Department of Mechanical Science and Engineering, University of Illinois at Urbana-Champaign, Urbana, USA
