Looking for Info: Evaluation of Gaze Based Information Retrieval in Augmented Reality

  • Conference paper
  • In: Human-Computer Interaction – INTERACT 2021 (INTERACT 2021)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 12932)

Abstract

This paper presents the results of an empirical study and a real-world deployment of a gaze-adaptive UI for Augmented Reality (AR). AR introduces an attention dilemma between focusing on reality and focusing on AR content. Past work has suggested eye gaze as a technique for opening information interfaces; however, there is little empirical work on it. We present an empirical study comparing a gaze-adaptive interface to an always-on interface in tasks that vary the focus between reality and virtual content. Across tasks, most participants prefer the gaze-adaptive UI and find it less distracting. When focusing on reality, the gaze UI is faster and is perceived as easier and more intuitive. When focusing on virtual content, always-on is faster, but user preferences are split. We conclude with the design and deployment of an interactive application in a public museum, demonstrating the approach's potential in the real world.
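
For readers unfamiliar with the technique under study, the sketch below illustrates the core mechanism the abstract describes: an information interface that opens after sustained gaze on a real-world target and closes after sustained gaze away. This is a minimal illustration only, not the authors' implementation; the class name, dwell thresholds, and update loop are hypothetical.

```python
# Minimal sketch of a dwell-based gaze-adaptive panel (illustrative only).
# The thresholds are assumed values, not those used in the paper.
from dataclasses import dataclass


@dataclass
class GazeAdaptivePanel:
    """Opens after sustained gaze on a target, closes after sustained gaze away.

    The dwell thresholds act as hysteresis, so brief glances do not make the
    panel flicker open and closed.
    """
    open_dwell_s: float = 0.5    # sustained on-target gaze needed to open
    close_dwell_s: float = 1.0   # sustained off-target gaze needed to close
    visible: bool = False
    _timer: float = 0.0

    def update(self, gaze_on_target: bool, dt: float) -> bool:
        """Advance the state machine by dt seconds; return panel visibility."""
        if gaze_on_target != self.visible:
            # Gaze disagrees with the current panel state: accumulate dwell.
            self._timer += dt
            threshold = self.open_dwell_s if gaze_on_target else self.close_dwell_s
            if self._timer >= threshold:
                self.visible = gaze_on_target
                self._timer = 0.0
        else:
            # Gaze agrees with the panel state: reset the dwell timer.
            self._timer = 0.0
        return self.visible


# Example: a 60 Hz loop in which the user fixates an exhibit for 1.5 s and
# then looks away. The panel opens at 0.5 s and closes 1.0 s after gaze leaves.
panel = GazeAdaptivePanel()
for frame in range(180):
    looking = frame < 90
    panel.update(gaze_on_target=looking, dt=1 / 60)
```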

Notes

  1. Deutsches Museum: http://www.deutsches-museum.de/.
  2. ZED Mini: https://www.stereolabs.com/zed-mini/.


Acknowledgments

We thank the Deutsches Museum in Munich, and in particular Claus Henkensiefken, for the collaboration in the context of this project, as well as all visitors who participated in our research. The presented work was funded by the German Research Foundation (DFG), project no. 425869382, and by dtec.bw – Digitalization and Technology Research Center of the Bundeswehr [MuQuaNet].

Author information

Corresponding author: Robin Piening.

Copyright information

© 2021 IFIP International Federation for Information Processing

About this paper

Cite this paper

Piening, R., et al. (2021). Looking for Info: Evaluation of Gaze Based Information Retrieval in Augmented Reality. In: Ardito, C., et al. (eds.) Human-Computer Interaction – INTERACT 2021. INTERACT 2021. Lecture Notes in Computer Science, vol 12932. Springer, Cham. https://doi.org/10.1007/978-3-030-85623-6_32

  • DOI: https://doi.org/10.1007/978-3-030-85623-6_32

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-85622-9

  • Online ISBN: 978-3-030-85623-6

  • eBook Packages: Computer Science, Computer Science (R0)
