The Costs and Benefits of Combining Gaze and Hand Gestures for Remote Interaction

  • Yanxia Zhang
  • Sophie Stellmach
  • Abigail Sellen
  • Andrew Blake
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9298)

Abstract

Gaze has been proposed as an ideal modality for supporting remote target selection. We explored the potential of integrating gaze with hand gestures for remote interaction on a large display in terms of user experience and preference. We conducted a lab study comparing interaction in a photo-sorting task using either hand gestures alone or gaze combined with hand gestures. Results from the study show that combining gaze and gesture input can lead to significantly faster selection, reduced hand fatigue, and greater ease of use than hand input alone. Participants largely preferred using gaze for target selection and hand gestures for manipulation. However, gaze can cause particular kinds of errors and can incur a cost when switching between modalities.
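The division of labour the abstract describes (gaze identifies the target, the hand confirms and manipulates it) can be pictured as a simple per-frame event loop. Below is a minimal sketch under stated assumptions: the sensor stubs `gaze_point`, `hand_is_pinching`, and `hand_delta`, the snap radius, and the pinch gesture are all illustrative placeholders, not the paper's implementation or any real tracker API.

```python
import math
from dataclasses import dataclass


@dataclass
class Photo:
    name: str
    x: float  # on-screen position in pixels
    y: float


# Hypothetical sensor stubs. A real system would read these from an
# eye tracker and a hand/depth sensor; names and values are made up.
def gaze_point():
    """Return the current (x, y) gaze estimate on the display."""
    return (512.0, 380.0)


def hand_is_pinching():
    """Return True while the user holds a pinch gesture."""
    return True


def hand_delta():
    """Return (dx, dy) hand movement since the last frame."""
    return (4.0, -2.0)


def nearest_photo(photos, point, radius=80.0):
    """Snap to the photo closest to the gaze point, within a radius.

    Snapping compensates for eye-tracker jitter and limited accuracy,
    one plausible source of the gaze-specific errors the study notes.
    """
    if not photos:
        return None
    dist = lambda p: math.hypot(p.x - point[0], p.y - point[1])
    candidate = min(photos, key=dist)
    return candidate if dist(candidate) <= radius else None


def frame(photos, grabbed):
    """One iteration of the gaze-select / gesture-manipulate loop."""
    if grabbed is None:
        # Selection phase: gaze alone locates the target, and a pinch
        # confirms it, so merely looking never triggers an action.
        target = nearest_photo(photos, gaze_point())
        return target if target is not None and hand_is_pinching() else None
    # Manipulation phase: the hand, not the eyes, drags the photo.
    dx, dy = hand_delta()
    grabbed.x += dx
    grabbed.y += dy
    return grabbed if hand_is_pinching() else None
```

The handoff from gaze to hand between the two phases is where the abstract's modality-switching cost would arise in practice.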

Keywords

Hand gestural interface · Gaze interaction · Mid-air gestures · Remote interaction · Large display · Smart living room

Copyright information

© IFIP International Federation for Information Processing 2015

Authors and Affiliations

  • Yanxia Zhang (1)
  • Sophie Stellmach (2)
  • Abigail Sellen (3)
  • Andrew Blake (3)
  1. Lancaster University, Lancaster, UK
  2. Microsoft Corporation, Redmond, USA
  3. Microsoft Research, Cambridge, UK