
Correlation Analysis for Predictive Models of Robot User’s Impression: A Study on Visual Medium and Mechanical Noise

  • Takamune Izui
  • Gentiane Venture

Abstract

Service robots are increasingly common, and studies show that a robot's appearance and behavior influence users' impressions of it. Developers therefore need to know whether users' impressions of a robot match the impression it was designed to convey. Traditionally, assessing such impressions requires participants to meet the robot in a real-world setting, which is costly and time-consuming, whereas experiments conducted over the internet are far easier to run. However, there is no evidence that impressions obtained from a video recording or from an animated avatar are comparable to impressions obtained in a real-world setting. In this study, we hypothesized that there are trade-offs between the ease of collecting impressions and their real-world applicability. Using correlation analysis, we examined whether impressions obtained in "Recorded" and "Avatar" settings can predict impressions in a "Real" setting. We also included a muted real-world setting, "Soundproof", to evaluate how the motor noise of robot motion influences users' impressions. In the experiment, two humanoid robots each performed five motions, and participants rated their impressions quantitatively under four conditions: real-world, video recording, video avatar, and muted real-world. Our study thus accounts for the effect of motor noise in addition to the medium through which the robot is seen. The results show correlations between the "Soundproof", "Recorded", and "Avatar" settings; motor noise affects participants' impressions of the robot, and trade-off relationships exist between the different conditions.
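
The sketch below illustrates the kind of correlation analysis described above; it is an example under assumptions, not the authors' implementation. It computes a Pearson correlation between mean impression ratings collected under a mediated setting ("Recorded" or "Avatar") and under the "Real" setting, then fits a least-squares line as a simple predictive model. The data values, variable names, and the use of NumPy/SciPy are assumed for illustration only.

```python
# Minimal sketch of a correlation-based predictive model of impressions.
# All data values and variable names are hypothetical; the original study's
# questionnaire items, scales, and statistics may differ.
import numpy as np
from scipy import stats

# Hypothetical mean impression scores for 10 robot/motion combinations,
# rated under three viewing conditions (e.g., on a 7-point scale).
real     = np.array([5.1, 4.2, 3.8, 6.0, 2.9, 4.5, 5.6, 3.3, 4.9, 5.2])
recorded = np.array([4.8, 4.0, 3.5, 5.7, 3.1, 4.2, 5.3, 3.6, 4.6, 5.0])
avatar   = np.array([4.2, 3.9, 3.0, 5.1, 3.4, 4.0, 4.7, 3.8, 4.1, 4.5])

for name, mediated in [("Recorded", recorded), ("Avatar", avatar)]:
    # Pearson correlation between the mediated setting and the real setting.
    r, p = stats.pearsonr(mediated, real)
    # Least-squares fit real ≈ slope * mediated + intercept, usable as a
    # simple predictive model of the "Real" impression from mediated ratings.
    slope, intercept, _, _, _ = stats.linregress(mediated, real)
    print(f"{name}: r = {r:.2f} (p = {p:.3f}), "
          f"predicted Real = {slope:.2f} * {name} + {intercept:.2f}")
```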

Keywords

HRI · Humanoid robot · Impression

Notes

Funding

Not applicable.

Compliance with Ethical Standards

Conflict of interest

The authors declare that they have no competing interests.

Availability of Data and Materials

Available.

Consent for Publication

Not applicable.


Copyright information

© Springer Nature B.V. 2019

Authors and Affiliations

  1. Tokyo University of Agriculture and Technology, Koganei, Japan
