The MAGIC Touch: Combining MAGIC-Pointing with a Touch-Sensitive Mouse

  • Heiko Drewes
  • Albrecht Schmidt
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5727)

Abstract

In this paper, we show how the combination of eye gaze and a touch-sensitive mouse can ease pointing tasks in graphical user interfaces. A touch of the mouse positions the mouse pointer at the user's current gaze position, so the pointer is always where the user expects it on the screen. This approach changes the user experience in tasks that involve frequent switching between keyboard and mouse input (e.g. working with spreadsheets). In a user study, we compared the touch-sensitive mouse with a traditional mouse and observed speed improvements for pointing tasks on complex backgrounds. For pointing tasks on plain backgrounds, performance with both devices was similar, but users perceived the gaze-sensitive interaction of the touch-sensitive mouse as faster and more convenient. Our results show that a touch-sensitive mouse that positions the pointer at the user's gaze position greatly reduces the need for mouse movement in pointing tasks.
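The interaction the abstract describes can be sketched as a small state machine: a touch event warps the pointer to the gaze position, after which ordinary relative mouse motion refines it. The class and event names below are hypothetical placeholders, not the authors' actual implementation; a real system would read gaze coordinates from an eye tracker and touch events from a capacitive sensor on the mouse shell.

```python
# Minimal sketch of touch-triggered MAGIC pointing (hypothetical API).
class MagicPointer:
    def __init__(self, start=(0, 0)):
        self.pos = start  # current on-screen pointer position (x, y)

    def on_touch(self, gaze_pos):
        # Touching the mouse warps the pointer to the current gaze position,
        # so no large mouse movement is needed to reach the target area.
        self.pos = gaze_pos
        return self.pos

    def on_move(self, dx, dy):
        # Ordinary relative mouse motion still applies after the warp,
        # letting the user correct for eye-tracker inaccuracy manually.
        x, y = self.pos
        self.pos = (x + dx, y + dy)
        return self.pos
```

For example, a touch while looking near (640, 360) warps the pointer there, and a short corrective movement of (-5, +2) lands it on the exact target.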

Keywords

Eye-tracking · eye-gaze pointing · touch-sensitive mouse · MAGIC pointing

References

  1. Ashdown, M., Oka, K., Sato, Y.: Combining head tracking and mouse input for a GUI on multiple monitors. In: Extended Abstracts on Human Factors in Computing Systems, CHI 2005, pp. 1188–1191. ACM Press, New York (2005)
  2. Bolt, R.A.: Gaze-orchestrated dynamic windows. In: Proceedings of the 8th Annual Conference on Computer Graphics and Interactive Techniques, SIGGRAPH 1981, pp. 109–119. ACM Press, New York (1981)
  3. Card, S.K., English, W.K., Burr, B.J.: Evaluation of mouse, rate-controlled isometric joystick, step keys, and text keys for text selection on a CRT. Ergonomics 21, 601–613 (1978)
  4. Duchowski, A.: Eye Tracking Methodology: Theory & Practice. Springer, Heidelberg (2003)
  5. ERICA eye tracker product description, http://www.eyeresponse.com
  6. Fitts, P.M.: The information capacity of the human motor system in controlling the amplitude of movement. Journal of Experimental Psychology 47, 381–391 (1954)
  7. Hinckley, K., Sinclair, M.: Touch-sensing input devices. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI 1999, pp. 223–230. ACM Press, New York (1999)
  8. Jacob, R.J.: The Use of Eye Movements in Human-Computer Interaction Techniques: What You Look At is What You Get. ACM Trans. Inf. Syst. 9(2), 152–169 (1991)
  9. Kumar, M., Paepcke, A., Winograd, T.: EyePoint: practical pointing and selection using gaze and keyboard. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI 2007, pp. 421–430. ACM Press, New York (2007)
  10. Majaranta, P., Räihä, K.: Twenty years of eye typing: systems and design issues. In: Proceedings of the Symposium on Eye Tracking Research & Applications, ETRA 2002, pp. 15–22. ACM Press, New York (2002)
  11. Majaranta, P., Aula, A., Räihä, K.: Effects of feedback on eye typing with a short dwell time. In: Proceedings of the Symposium on Eye Tracking Research & Applications, ETRA 2004, pp. 139–146. ACM Press, New York (2004)
  12. Miniotas, D., Špakov, O., MacKenzie, I.S.: Eye gaze interaction with expanding targets. In: Extended Abstracts on Human Factors in Computing Systems, CHI 2004, pp. 1255–1258. ACM Press, New York (2004)
  13. Qprox capacitive sensor QT 113 product description, http://www.qprox.com/
  14. Salvucci, D.D., Anderson, J.R.: Intelligent gaze-added interfaces. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI 2000, pp. 273–280. ACM Press, New York (2000)
  15. Ware, C., Mikaelian, H.H.: An evaluation of an eye tracker as a device for computer input. In: Proceedings of CHI + GI 1987, pp. 183–188. ACM Press, New York (1987)
  16. Zhai, S., Morimoto, C., Ihde, S.: Manual and gaze input cascaded (MAGIC) pointing. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI 1999, pp. 246–253. ACM Press, New York (1999)
  17. Zhang, Q., Imamiya, A., Go, K., Mao, X.: Resolving ambiguities of a gaze and speech interface. In: Proceedings of the Symposium on Eye Tracking Research & Applications, ETRA 2004, pp. 85–92. ACM Press, New York (2004)

Copyright information

© IFIP International Federation for Information Processing 2009

Authors and Affiliations

  • Heiko Drewes (1)
  • Albrecht Schmidt (2)
  1. Media Informatics Group, LMU University of Munich, München, Germany
  2. Pervasive Computing Group, University of Duisburg-Essen, Essen, Germany
