Implicit Nonverbal Behaviors Expressing Closeness by 3D Agents

  • Hiroko Kamide
  • Mihoko Niitsuma
  • Tatuo Arai
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9388)

Abstract

The goal of the current study was to extract natural nonverbal behaviors that are implicit but specific to strangers and friends, and to test how well these behaviors express two different levels of closeness using 3D agents. An experiment was conducted in which 48 pairs of participants (48 strangers and 48 friends) had casual conversations about recent events for 10 min. Their body movements were recorded by a motion-capture system, and 13 vectors were defined on the upper body; the cosine similarity of each vector was computed for every frame in order to extract the motions. The motions specific to strangers and to friends were identified, and two scenarios were created from them. The scenarios were implemented on 3D agents of a female human and a humanoid robot, and 400 respondents were asked to evaluate the closeness that each agent seemed to express toward its counterpart. The results showed that the human agent performing friend motions was rated higher in expressed closeness than when performing stranger motions, and that both the human agent and the robot agent performing friend motions were rated lower in strangeness than when performing stranger motions. In future work, we aim to improve the scenarios and implement them on humanoid robots.
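The per-frame cosine-similarity step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline: the marker names, positions, and the shoulder-to-elbow vector are hypothetical stand-ins for the paper's 13 upper-body vectors.

```python
import numpy as np

def cosine_similarity(u, v):
    # Cosine of the angle between two 3D vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical motion-capture frames: each body vector is defined by a pair
# of upper-body markers (here, shoulder -> elbow). Comparing the same vector
# across frames yields a per-frame similarity profile from which distinctive
# motions could be extracted.
frame_a = {"shoulder": np.array([0.0, 1.4, 0.0]), "elbow": np.array([0.2, 1.1, 0.1])}
frame_b = {"shoulder": np.array([0.0, 1.4, 0.0]), "elbow": np.array([0.3, 1.0, 0.1])}

vec_a = frame_a["elbow"] - frame_a["shoulder"]
vec_b = frame_b["elbow"] - frame_b["shoulder"]

sim = cosine_similarity(vec_a, vec_b)
print(round(sim, 3))  # close to 1.0: the arm barely moved between frames
```

A similarity near 1 indicates the body vector kept its orientation between frames; a drop in similarity flags a movement, which is one plausible way such a measure separates motion segments from stillness.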

Keywords

Humanoid Robot · Nonverbal Behavior · Cosine Similarity · Experimental Social Psychology · Interpersonal Distance
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

Copyright information

© Springer International Publishing Switzerland 2015

Open Access This chapter is distributed under the terms of the Creative Commons Attribution Noncommercial License, which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.

Authors and Affiliations

  1. Research Institute of Electrical Communication, Tohoku University, Aoba-ku, Sendai, Japan
  2. Faculty of Science and Engineering, Department of Precision Mechanics, Chuo University, Bunkyo-ku, Tokyo, Japan
  3. Department of Engineering Science, Osaka University, Toyonaka, Osaka, Japan
