Personal and Ubiquitous Computing, Volume 19, Issue 5–6, pp 967–981

Eye tracking for public displays in the wild

  • Yanxia Zhang
  • Ming Ki Chong
  • Jörg Müller
  • Andreas Bulling
  • Hans Gellersen
Original Article

Abstract

In public display contexts, interactions are spontaneous and have to work without preparation. We propose gaze as a modality for such contexts, as gaze is always at the ready and a natural indicator of the user's interest. We present GazeHorizon, a system that demonstrates spontaneous gaze interaction, enabling users to walk up to a display and navigate content using their eyes only. GazeHorizon is optimised for instantaneous usability by any user, without prior configuration, calibration or training. The system provides interactive assistance to bootstrap gaze interaction with unaware users, employs a single off-the-shelf web camera and computer vision for person-independent tracking of horizontal gaze direction, and maps this input to rate-controlled navigation of horizontally arranged content. We evaluated GazeHorizon through a series of field studies, culminating in a four-day deployment in a public environment during which over a hundred passers-by interacted with it, unprompted and unassisted. We found that because eye movements are subtle, users cannot learn gaze interaction merely by observing others; guidance is therefore required.
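The abstract's core mapping, from a person-independent horizontal gaze estimate to rate-controlled navigation, can be illustrated with a minimal Python sketch. This is not the authors' implementation: the gaze estimator is abstracted behind a hypothetical estimate_gaze_x function returning a normalised value in [-1, 1], and the dead-zone and speed constants are illustrative assumptions rather than parameters reported in the paper.

    # Minimal sketch (assumed, not the authors' code): a normalised
    # horizontal gaze estimate drives rate-controlled scrolling of
    # horizontally arranged content.

    DEAD_ZONE = 0.15   # assumed: gaze near the display centre does not scroll
    MAX_SPEED = 600.0  # assumed: maximum scroll speed, in pixels per second

    def scroll_velocity(gaze_x: float) -> float:
        """Map gaze_x in [-1, 1] (far left .. far right of the display) to
        a scroll velocity: looking further off-centre scrolls faster, and a
        central dead zone keeps content still while the user reads."""
        if abs(gaze_x) < DEAD_ZONE:
            return 0.0
        # Rescale the remaining range to [0, 1] and apply the speed limit.
        magnitude = (abs(gaze_x) - DEAD_ZONE) / (1.0 - DEAD_ZONE)
        direction = 1.0 if gaze_x > 0 else -1.0
        return direction * magnitude * MAX_SPEED

    # Per-frame update, with a hypothetical estimator and frame time dt:
    #   offset += scroll_velocity(estimate_gaze_x(frame)) * dt

The dead zone is the key design choice here: without it, any glance at the content would set it in motion, so a neutral central region is needed to let users read what they have scrolled to.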

Keywords

Eye tracking · Gaze interaction · Public displays · Scrolling · Calibration-free · In-the-wild study · Deployment


Copyright information

© Springer-Verlag London 2015

Authors and Affiliations

  • Yanxia Zhang (1)
  • Ming Ki Chong (1)
  • Jörg Müller (2)
  • Andreas Bulling (3)
  • Hans Gellersen (1)
  1. Lancaster University, Lancaster, UK
  2. Aarhus University, Aarhus, Denmark
  3. Max Planck Institute for Informatics, Saarbrücken, Germany
