
GazeLens: Guiding Attention to Improve Gaze Interpretation in Hub-Satellite Collaboration

Part of the Lecture Notes in Computer Science book series (LNISA, volume 11747)

Abstract

In hub-satellite collaboration using video, interpreting gaze direction is critical for communication between hub coworkers sitting around a table and their remote satellite colleague. However, 2D video distorts images and makes this interpretation inaccurate. We present GazeLens, a video conferencing system that improves hub coworkers’ ability to interpret the satellite worker’s gaze. A \(360^{\circ }\) camera captures the hub coworkers and a ceiling camera captures artifacts on the hub table. The system combines these two video feeds in an interface whose lens widgets strategically guide the satellite worker’s attention toward specific areas of her/his screen, allowing hub coworkers to clearly interpret her/his gaze direction. Our evaluation shows that GazeLens (1) increases hub coworkers’ overall gaze interpretation accuracy by \(25.8\%\) compared to a conventional video conferencing system, (2) is particularly effective for physical artifacts on the hub table, and (3) improves hub coworkers’ ability to distinguish between gazes toward people and artifacts. We discuss how screen space can be leveraged to improve gaze interpretation.
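
To make the lens-widget mechanism concrete, the sketch below illustrates one way such an interface could place a widget horizontally on the satellite worker’s screen so that looking at it produces a head/eye rotation that hub coworkers can read as a distinct gaze direction. This is a minimal illustration in Python, not the authors’ implementation; the screen size, viewing distance, flat-screen trigonometry, and all names are assumptions made for the example.

```python
# Hedged sketch of lens-widget placement (illustrative only, not the
# GazeLens implementation). Assumption: a flat screen at a known viewing
# distance, and one gaze target per hub coworker or table artifact.
import math
from dataclasses import dataclass

@dataclass
class Target:
    name: str              # hub coworker or artifact on the hub table
    gaze_angle_deg: float  # desired apparent gaze direction, as seen at the hub

# Assumed satellite-side viewing geometry (illustrative values).
SCREEN_WIDTH_M = 0.60    # physical width of the satellite worker's display
VIEW_DISTANCE_M = 0.65   # eye-to-display distance of the satellite worker

def widget_x_for_angle(gaze_angle_deg: float) -> float:
    """Horizontal offset (metres from screen centre) at which a lens widget
    would elicit roughly the desired head/eye rotation, using simple
    flat-screen trigonometry; clamped so the widget stays on screen."""
    x = VIEW_DISTANCE_M * math.tan(math.radians(gaze_angle_deg))
    return max(-SCREEN_WIDTH_M / 2, min(SCREEN_WIDTH_M / 2, x))

if __name__ == "__main__":
    targets = [
        Target("coworker, far left seat", -30.0),
        Target("artifact on hub table", -5.0),
        Target("coworker, right seat", 25.0),
    ]
    for t in targets:
        x = widget_x_for_angle(t.gaze_angle_deg)
        print(f"{t.name:24s} -> lens widget at {x:+.3f} m from screen centre")
```

Spacing widget positions this way is what would let hub coworkers tell gazes toward people apart from gazes toward table artifacts; in the paper’s interface, the two camera feeds (360° for people, ceiling for table artifacts) provide the regions toward which the lens widgets draw attention.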

Keywords

  • Remote collaboration
  • Telepresence
  • Gaze
  • Lens widgets


Notes

  1. https://docs.microsoft.com/en-us/dotnet/framework/.

  2. https://www.kjell.com/se/sortiment/dator-natverk/datortillbehor/webbkameror/plexgear-720p-webbkamera-p61271.


Author information

Correspondence to Khanh-Duy Le.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (mp4 14940 KB)


Copyright information

© 2019 IFIP International Federation for Information Processing

About this paper

Cite this paper

Le, K.D., Avellino, I., Fleury, C., Fjeld, M., Kunz, A. (2019). GazeLens: Guiding Attention to Improve Gaze Interpretation in Hub-Satellite Collaboration. In: Lamas, D., Loizides, F., Nacke, L., Petrie, H., Winckler, M., Zaphiris, P. (eds) Human-Computer Interaction – INTERACT 2019. INTERACT 2019. Lecture Notes in Computer Science, vol 11747. Springer, Cham. https://doi.org/10.1007/978-3-030-29384-0_18

  • DOI: https://doi.org/10.1007/978-3-030-29384-0_18

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-29383-3

  • Online ISBN: 978-3-030-29384-0

  • eBook Packages: Computer Science, Computer Science (R0)