Closing the Feedback Loop: The Relationship Between Input and Output Modalities in Human-Robot Interactions
Previous studies suggested that the communication modalities used for human control and robot feedback influence human-robot interactions. However, these studies generally focused on one side of the communication, ignoring the relationship between control and feedback modalities. We aim to understand whether the pairing of a user's control modality with a robot's feedback modality influences the quality of the interaction and, if so, to find the most compatible pairings. In a laboratory Wizard-of-Oz experiment, participants guided a robot through a maze using either hand gestures or vocal commands. Across experimental conditions, the robot provided vocal or motion feedback, forming different combinations of control-feedback modalities. We found that these combinations affected the quality of human-robot interaction (subjective experience and efficiency) in different ways. Participants reported less worry and were slower when they controlled the robot by voice and received vocal feedback, compared to controlling it by gestures and receiving vocal feedback. In addition, they felt more distress and were faster when they controlled the robot by gestures and received motion feedback, compared to vocal control with motion feedback. We also found that providing feedback improves the quality of human-robot interaction. In this paper we detail the procedure and results of this experiment.
Keywords: Human-robot interaction · Feedback loop · Navigation task · Feedback by motion cues · Stimulus-response compatibility
This research was supported by the Helmsley Charitable Trust through the Agricultural, Biological and Cognitive Robotics Center at Ben-Gurion University of the Negev. The second author (SH) is also supported by Ben-Gurion University of the Negev through the High-tech, Bio-tech and Chemo-tech Scholarship.