
Using Human Eye Gaze Patterns as Indicators of Need for Assistance from a Socially Assistive Robot

Part of the Lecture Notes in Computer Science book series (LNAI, volume 11876)

Abstract

With the current growth in social robotics comes a need for well-developed, finely tuned agents that respond to the user in a seamless and intuitive manner. Socially assistive robots in particular have become popular in care for older adults, for example in supporting medication adherence and socializing. Since eye gaze cues are important mediators in human-human interaction, we hypothesize that gaze patterns can be applied to human-robot interaction to identify when the user may need assistance. We reviewed videos (N = 16) of robot-supported collaborative work to explore how recognizing gaze patterns can help an assistive robot predict, in the context of a medication management task, when a user needs assistance. We found that mutual gaze is a better predictor than confirmatory request, gaze away, and goal reference. While eye gaze is an important indicator of the need for assistance, it should be combined with other indicators, such as verbal cues or facial expressions, to sufficiently capture the assistance needed in the interaction and provide timely assistance.
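
As an illustration of how such gaze patterns might be operationalized, the sketch below shows a minimal, rule-based detector that flags a possible need for assistance when a mutual-gaze episode lasts long enough or coincides with a verbal cue. This is not the authors' implementation: the event format, labels, and the 1.5-second threshold are assumptions made purely for the example.

# Illustrative sketch only: flag a possible need for assistance from annotated
# gaze events. Labels, thresholds, and the event format are hypothetical and
# are not taken from the paper.

from dataclasses import dataclass
from typing import Iterable, List

# Hypothetical gaze-pattern labels, loosely following the categories named in
# the abstract (mutual gaze, gaze away, confirmatory request, goal reference).
MUTUAL_GAZE = "mutual_gaze"
GAZE_AWAY = "gaze_away"

@dataclass
class GazeEvent:
    label: str                # one of the gaze-pattern labels above
    start: float              # seconds from the start of the session
    end: float
    verbal_cue: bool = False  # True if a help-related utterance overlaps the event

def needs_assistance(events: Iterable[GazeEvent],
                     min_mutual_gaze_s: float = 1.5) -> List[float]:
    """Return start times of events that suggest the user may need help.

    Assumed rule: a sufficiently long mutual-gaze episode, or any mutual gaze
    accompanied by a verbal cue, is treated as a request for assistance.
    """
    flagged = []
    for ev in events:
        if ev.label != MUTUAL_GAZE:
            continue
        duration = ev.end - ev.start
        if duration >= min_mutual_gaze_s or ev.verbal_cue:
            flagged.append(ev.start)
    return flagged

if __name__ == "__main__":
    session = [
        GazeEvent(GAZE_AWAY, 0.0, 2.0),
        GazeEvent(MUTUAL_GAZE, 2.0, 2.5, verbal_cue=True),  # brief look plus "is this right?"
        GazeEvent(MUTUAL_GAZE, 10.0, 12.5),                 # long, silent look at the robot
    ]
    print(needs_assistance(session))  # -> [2.0, 10.0]

In a deployed system such a rule would likely be one signal among several, combined with verbal and facial cues as the abstract suggests.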

Keywords

  • Gaze detection
  • Gaze patterns
  • Assistive agents

Author information

Correspondence to Ulyana Kurylo.

Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Kurylo, U., Wilson, J.R. (2019). Using Human Eye Gaze Patterns as Indicators of Need for Assistance from a Socially Assistive Robot. In: Social Robotics. ICSR 2019. Lecture Notes in Computer Science, vol. 11876. Springer, Cham. https://doi.org/10.1007/978-3-030-35888-4_19

  • DOI: https://doi.org/10.1007/978-3-030-35888-4_19

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-35887-7

  • Online ISBN: 978-3-030-35888-4

  • eBook Packages: Computer Science, Computer Science (R0)