Multivariate evaluation of interactive robot systems

  • Published in: Autonomous Robots

Abstract

In designing robot systems for human interaction, designers draw on aspects of human behavior that help them achieve specific design goals. For instance, the designer of an educational robot system may use speech, gaze, and gesture cues in a way that enhances its students’ learning. But what set of behaviors improves such outcomes? How might the designers of such a robot system determine this set? Conventional approaches to answering such questions primarily involve a series of experiments in which designers manipulate a small number of design variables and measure the effects of these manipulations on specific interaction outcomes. However, these methods become infeasible when the design space is large and the designer needs to understand the extent to which each variable contributes to the desired effects. In this paper, we present a novel multivariate method for evaluating which behaviors of interactive robot systems improve interaction outcomes. We illustrate the use of this method in a case study that explores how different types of narrative gestures by a storytelling robot improve users’ recall of the robot’s story, their ability to retell it, their perceptions of and rapport with the robot, and their overall engagement in the experiment.
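The core idea the abstract describes — regressing interaction outcomes on many behavior variables jointly, so that each coefficient estimates one variable’s contribution with the others held constant — can be sketched as follows. This is an illustrative reconstruction, not the authors’ implementation: the gesture categories, per-session counts, and recall scores below are synthetic, and ordinary least squares stands in for whatever model the full paper specifies.

```python
import numpy as np

# Hypothetical data: each row is one participant's session with the robot.
# Columns count how often the robot produced each gesture type in that session.
rng = np.random.default_rng(0)
n = 200
X = rng.poisson(lam=5.0, size=(n, 3)).astype(float)  # deictic, beat, iconic

# Synthetic ground truth: deictic and iconic gestures aid recall; beats do not.
true_coefs = np.array([0.8, 0.0, 0.5])
recall = X @ true_coefs + rng.normal(scale=0.5, size=n)

# Multivariate evaluation: fit all predictors in one model so each coefficient
# reflects that gesture type's contribution, controlling for the others.
A = np.column_stack([np.ones(n), X])  # prepend an intercept column
coefs, *_ = np.linalg.lstsq(A, recall, rcond=None)
intercept, effects = coefs[0], coefs[1:]

for name, b in zip(["deictic", "beat", "iconic"], effects):
    print(f"{name}: {b:+.2f}")
```

The payoff over one-variable-at-a-time experiments is that a single fitted model ranks all candidate behaviors by estimated effect, which is what makes the approach tractable when the design space is large.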

[Figures 1–8 appear in the full article.]

References

  • Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50(2), 179–211.

  • Argyle, M., & Cook, M. (1976). Gaze and mutual gaze. Cambridge: Cambridge University Press.

  • Arthur, J. B. (1994). Effects of human resource systems on manufacturing performance and turnover. Academy of Management Journal, 37(3), 670–687.

  • Bayes, M. A. (1972). Behavioral cues of interpersonal warmth. Journal of Consulting and Clinical Psychology, 39(2), 333.

  • Bremner, P., Pipe, A. G., Melhuish, C., Fraser, M., & Subramanian, S. (2011). The effects of robot-performed co-verbal gesture on listener behaviour. In 2011 11th IEEE-RAS International Conference on Humanoid Robots (Humanoids) (pp. 458–465). IEEE.

  • Brendgen, M., Bowen, F., Rondeau, N., & Vitaro, F. (1999). Effects of friends’ characteristics on children’s social cognitions. Social Development, 8(1), 41–51.

  • Chidambaram, V., Chiang, Y. H., & Mutlu, B. (2012). Designing persuasive robots: How robots might persuade people using vocal and nonverbal cues. In 2012 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 293–300). IEEE.

  • Cohen, J., Cohen, P., West, S. G., & Aiken, L. S. (2013). Applied multiple regression/correlation analysis for the behavioral sciences (3rd ed.). London: Routledge.

  • Ekman, P., & Friesen, W. (1969). The repertoire of nonverbal behavioral categories. Semiotica, 1, 49–98.

  • Fasola, J., & Mataric, M. J. (2012). Using socially assistive human-robot interaction to motivate physical exercise for older adults. Proceedings of the IEEE, 100(8), 2512–2526.

  • Foster, M. E., Giuliani, M., & Knoll, A. (2009). Comparing objective and subjective measures of usability in a human-robot dialogue system. In Proceedings of the Joint Conference of the 47th Annual Meeting of the ACL and the 4th International Joint Conference on Natural Language Processing of the AFNLP (pp. 879–887). Association for Computational Linguistics.

  • Goldin-Meadow, S. (1999). The role of gesture in communication and thinking. Trends in Cognitive Sciences, 3(11), 419–429.

  • Goldin-Meadow, S. (2003). Hearing gesture: How our hands help us think. Cambridge: Harvard University Press.

  • Graham, J. (1975). A cross-cultural study of the communication of extra-verbal meaning by gestures. International Journal of Psychology, 10, 57–67.

  • Ham, J., Bokhorst, R., Cuijpers, R., van der Pol, D., & Cabibihan, J. J. (2011). Making robots persuasive: The influence of combining persuasive strategies (gazing and gestures) by a storytelling robot on its persuasive power. In Social robotics (pp. 71–83). Berlin: Springer.

  • Huang, C. M., & Mutlu, B. (2012). Robot behavior toolkit: Generating effective social behaviors for robots. In 2012 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 25–32).

  • Huang, C. M., & Mutlu, B. (2013). Modeling and evaluating narrative gestures for humanlike robots. In Proceedings of the Robotics: Science and Systems Conference, RSS’13.

  • Huang, C. M., Iio, T., Satake, S., & Kanda, T. (2014). Modeling and controlling friendliness for an interactive museum robot. In Proceedings of the Robotics: Science and Systems Conference, RSS’14.

  • Kendon, A. (1994). Do gestures communicate? A review. Research on Language and Social Interaction. Edmonton: Boreal Scholarly Publishers & Distributors.

  • Kendon, A. (2004). Gesture: Visible action as utterance. Cambridge: Cambridge University Press.

  • Kidd, C. D. (2008). Designing for long-term human-robot interaction and application to weight loss. PhD thesis, Cambridge.

  • Krauss, R., Chen, Y., & Gottesman, R. (2000). Lexical gestures and lexical access: A process model (pp. 261–283). Cambridge: Cambridge University Press.

  • Landis, J., & Koch, G. (1977). The measurement of observer agreement for categorical data. Biometrics, 33(1), 159–174.

  • Lee, M. K., Kiesler, S., Forlizzi, J., Srinivasa, S., & Rybski, P. (2010). Gracefully mitigating breakdowns in robotic services. In 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 203–210). IEEE.

  • Lozano, S., & Tversky, B. (2006). Communicative gestures facilitate problem solving for both communicators and recipients. Journal of Memory and Language, 55(1), 47–63.

  • McGrath, J. E. (1995). Methodology matters: Doing research in the behavioral and social sciences. In Human-computer interaction (pp. 152–169). San Francisco: Morgan Kaufmann Publishers Inc.

  • McNeill, D. (1992). Hand and mind. Chicago: University of Chicago Press.

  • Mutlu, B., Forlizzi, J., & Hodgins, J. (2006). A storytelling robot: Modeling and evaluation of human-like gaze behavior. In 2006 6th IEEE-RAS International Conference on Humanoid Robots (pp. 518–523). IEEE.

  • Narahara, H., & Maeno, T. (2007). Factors of gestures of robots for smooth communication with humans. In Proceedings of the 1st International Conference on Robot Communication and Coordination (p. 44). IEEE Press.

  • Neggers, S., & Bekkering, H. (2001). Gaze anchoring to a pointing target is present during the entire pointing movement and is driven by a non-visual signal. Journal of Neurophysiology, 86(2), 961–970.

  • Okuno, Y., Kanda, T., Imai, M., Ishiguro, H., & Hagita, N. (2009). Providing route directions: Design of robot’s utterance, gesture, and timing. In 2009 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 53–60). IEEE.

  • Peltason, J., Riether, N., Wrede, B., & Lütkebohle, I. (2012). Talking with robots about objects: A system-level evaluation in HRI. In Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction (pp. 479–486). ACM.

  • Persson, S., Wei, H., Milne, J., Page, G. P., & Somerville, C. R. (2005). Identification of genes required for cellulose synthesis by regression analysis of public microarray data sets. Proceedings of the National Academy of Sciences of the United States of America, 102(24), 8633–8638.

  • Richmond, V. (2002). Teacher nonverbal immediacy: Use and outcomes (pp. 65–82). Boston: Allyn and Bacon.

  • Riek, L. D., Rabinowitch, T. C., Bremner, P., Pipe, A. G., Fraser, M., & Robinson, P. (2010). Cooperative gestures: Effective signaling for humanoid robots. In 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 61–68). IEEE.

  • Roth, W. M. (2001). Gestures: Their role in teaching and learning. Review of Educational Research, 71(3), 365–392.

  • Salem, M., Kopp, S., Wachsmuth, I., Rohlfing, K., & Joublin, F. (2012). Generation and evaluation of communicative robot gesture. International Journal of Social Robotics, 4(2), 201–217.

  • Sauppé, A., & Mutlu, B. (2014). How social cues regulate task coordination and communication. In Proceedings of the ACM 2014 Conference on Computer Supported Cooperative Work. ACM.

  • Schegloff, E. (1984). On some gestures’ relation to speech (pp. 266–296). Cambridge: Cambridge University Press.

  • Shrout, P. E., & Fleiss, J. L. (1979). Intraclass correlations: Uses in assessing rater reliability. Psychological Bulletin, 86(2), 420.

  • Sidnell, J. (2006). Coordinating gesture, talk, and gaze in reenactments. Research on Language and Social Interaction, 39(4), 377–409.

  • Stanovich, K. E., & Siegel, L. S. (1994). Phenotypic performance profile of children with reading disabilities: A regression-based test of the phonological-core variable-difference model. Journal of Educational Psychology, 86(1), 24.

  • Streeck, J. (1988). The significance of gestures: How it is established. Papers in Pragmatics, 2(1/2), 60–83.

  • Streeck, J. (1993). Gesture as communication I: Its coordination with gaze and speech. Communication Monographs, 60(4), 275–299.

  • Sugiyama, O., Kanda, T., Imai, M., Ishiguro, H., & Hagita, N. (2007). Natural deictic communication with humanoid robots. In 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (pp. 1441–1448). IEEE.

  • Szafir, D., & Mutlu, B. (2012). Pay attention! Designing adaptive agents that monitor and improve user engagement. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 11–20). ACM.

  • Tanaka, F., & Cicourel, A. (2007). Socialization between toddlers and robots at an early childhood education center. Proceedings of the National Academy of Sciences, 104(46), 17954–17958.

  • Terrell, A., & Mutlu, B. (2012). A regression-based approach to modeling addressee backchannels. In Proceedings of the 13th Annual Meeting of the Special Interest Group on Discourse and Dialogue (pp. 280–289). Association for Computational Linguistics.

  • Thompson, L., Driscoll, D., & Markson, L. (1998). Memory for visual-spoken language in children and adults. Journal of Nonverbal Behavior, 22, 167–187.

  • Walker, M. A., Litman, D. J., Kamm, C. A., & Abella, A. (1997). PARADISE: A framework for evaluating spoken dialogue agents. In Proceedings of the Eighth Conference on European Chapter of the Association for Computational Linguistics (pp. 271–280). Association for Computational Linguistics.

  • Yamazaki, A., Yamazaki, K., Kuno, Y., Burdelski, M., Kawashima, M., & Kuzuoka, H. (2008). Precision timing in human-robot interaction: Coordination of head movement and utterance. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 131–140). ACM.

Acknowledgments

The authors would like to thank Jingjing Du for her early help with this work, Jilana Boston, Brandi Hefty, and Ross Luo for their help with behavioral coding, and Catherine Steffel for her help with the editing of the paper. National Science Foundation awards 1017952 and 1149970 and an equipment loan from Mitsubishi Heavy Industries, Ltd. provided support for this work. The case study presented here was published in the Proceedings of Robotics: Science and Systems (Huang and Mutlu 2013).

Author information

Corresponding author

Correspondence to Chien-Ming Huang.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (pdf 195 KB)

About this article

Cite this article

Huang, C.-M., & Mutlu, B. Multivariate evaluation of interactive robot systems. Autonomous Robots 37, 335–349 (2014). https://doi.org/10.1007/s10514-014-9415-y
