Closing the Feedback Loop: The Relationship Between Input and Output Modalities in Human-Robot Interactions

  • Tamara Markovich
  • Shanee Honig
  • Tal Oron-Gilad
Conference paper
Part of the Springer Proceedings in Advanced Robotics book series (SPAR, volume 12)

Abstract

Previous studies have suggested that the communication modalities used for human control and robot feedback influence human-robot interactions. However, these studies generally focused on one side of the communication, ignoring the relationship between control and feedback modalities. We aim to understand whether the relationship between a user’s control modality and a robot’s feedback modality influences the quality of the interaction and, if so, to find the most compatible pairings. In a laboratory Wizard-of-Oz experiment, participants guided a robot through a maze using either hand gestures or vocal commands. Across the experimental conditions, the robot provided vocal or motion feedback, forming different combinations of control and feedback modalities. We found that these combinations affected the quality of the human-robot interaction (subjective experience and efficiency) in different ways. Participants reported less worry and were slower when they commanded the robot by voice and received vocal feedback than when they used gestural control and received vocal feedback. They reported more distress and were faster when they commanded the robot by gestures and received motion feedback than when they used vocal control and received motion feedback. We also found that providing feedback improves the quality of human-robot interaction. In this paper we detail the procedure and results of this experiment.

Keywords

Human-robot interaction · Feedback loop · Navigation task · Feedback by motion cues · Stimulus-response compatibility

Acknowledgments

This research was supported by the Helmsley Charitable Trust through the Agricultural, Biological and Cognitive Robotics Center at Ben-Gurion University of the Negev. The second author (SH) is also supported by Ben-Gurion University of the Negev through the High-tech, Bio-tech and Chemo-tech Scholarship.

Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Tamara Markovich (1)
  • Shanee Honig (1)
  • Tal Oron-Gilad (1)

  1. Ben-Gurion University of the Negev, Beer-Sheva, Israel