Eye Pull, Eye Push: Moving Objects between Large Screens and Personal Devices with Gaze and Touch

  • Jayson Turner
  • Jason Alexander
  • Andreas Bulling
  • Dominik Schmidt
  • Hans Gellersen
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8118)

Abstract

Previous work has validated the combination of gaze and mobile input as a viable approach for pointing at and selecting out-of-reach objects. This work presents Eye Pull, Eye Push, a novel interaction concept for transferring content between public and personal devices using gaze and touch. We present three techniques that enable this interaction: Eye Cut & Paste, Eye Drag & Drop, and Eye Summon & Cast. We outline and discuss several scenarios in which these techniques can be used. In a user study, we found that participants responded well to the visual feedback provided by Eye Drag & Drop during object movement. In contrast, although Eye Summon & Cast significantly improved performance, participants had difficulty coordinating their hands and eyes during interaction.
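
The paper itself does not publish an implementation. As a rough illustration of the gaze-plus-touch pairing behind these techniques, the Python sketch below models Eye Cut & Paste as a small state machine: gaze indicates the object, a touch-down on the personal device "cuts" it, and a touch-up "pastes" it at the current gaze target. Every name in it (Item, EyeCutPaste, the touch callbacks) is hypothetical and assumed only for this example; it is a minimal sketch, not the authors' method.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Item:
        """A transferable object shown on the large screen (hypothetical)."""
        item_id: str

    class EyeCutPaste:
        """Minimal sketch of the Eye Cut & Paste idea: gaze points,
        touch on the personal device confirms. All APIs here are
        invented for illustration only."""

        def __init__(self) -> None:
            self.clipboard: Optional[Item] = None

        def on_touch_down(self, gazed_item: Optional[Item]) -> None:
            # "Cut": take whatever the user is fixating on the large screen.
            if gazed_item is not None:
                self.clipboard = gazed_item

        def on_touch_up(self) -> Optional[Item]:
            # "Paste": release the held object at the current gaze target
            # (here simply returned to the caller, who places it).
            item, self.clipboard = self.clipboard, None
            return item

    # Example: pull a photo from the public display onto the phone.
    transfer = EyeCutPaste()
    transfer.on_touch_down(gazed_item=Item("photo-42"))
    pasted = transfer.on_touch_up()
    print(pasted)  # Item(item_id='photo-42')

Eye Drag & Drop and Eye Summon & Cast would pair gaze and touch differently (e.g., with continuous feedback during the transfer); this sketch covers only the cut-and-paste case.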

Keywords

Eye-Based Interaction · Mobile · Cross-Device · Content Transfer · Interaction Techniques

Copyright information

© IFIP International Federation for Information Processing 2013

Authors and Affiliations

  • Jayson Turner (1)
  • Jason Alexander (1)
  • Andreas Bulling (2)
  • Dominik Schmidt (3)
  • Hans Gellersen (1)
  1. School of Computing and Communications, Lancaster University, Lancaster, United Kingdom
  2. Perceptual User Interfaces Group, Max Planck Institute for Informatics, Saarbrücken, Germany
  3. Human Computer Interaction Lab, Hasso Plattner Institute, Potsdam, Germany
