User Defined Eye Movement-Based Interaction for Virtual Reality

  • Wen-jun Hou
  • Kai-xiang Chen (email author)
  • Hao Li
  • Hu Zhou
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10911)

Abstract

Most applications of eye movement-based interaction in VR are currently limited to blinking and gaze, while gaze gestures have been neglected; as a result, the potential of eye movement-based interaction in VR is far from being realized. In addition, many scholars have tried to define special eye movements as input commands, but these definitions are almost always empirical and neglect users’ habits and cultural backgrounds. In this paper, we focus on how Chinese users interact in VR using eye movements without relying on a graphical user interface. We present a guessability study of intuitive eye movement-based interaction for common commands across 30 tasks in 3 categories in VR. A total of 360 eye movements were collected from 12 users, yielding a consensus set of eye movements in VR that best matches users’ cognition. This set can be applied to the design of eye movement-based interaction in VR, helping designers develop user-centered, intuitive interactions. It can also be transferred to other interactive media and user interfaces, such as a post-WIMP interface based on eye movement-based interaction, as a design reference.
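In guessability studies of this kind, the consensus set is typically derived from an agreement score in the style of Wobbrock et al.: for each task (referent), participants’ proposals are grouped by identity, and agreement is the sum of squared group proportions. A minimal sketch (the task names and proposal labels below are illustrative, not taken from the paper):

```python
from collections import Counter

def agreement(proposals):
    """Agreement score for one task: sum over groups of identical
    proposals P_i of (|P_i| / |P_r|)^2, where P_r is all proposals."""
    total = len(proposals)
    return sum((count / total) ** 2 for count in Counter(proposals).values())

def mean_agreement(tasks):
    """Average agreement score over all tasks in the study."""
    return sum(agreement(p) for p in tasks.values()) / len(tasks)

# Example: 12 participants propose eye movements for two hypothetical tasks.
tasks = {
    "select": ["blink"] * 8 + ["dwell"] * 4,        # 8-vs-4 split
    "scroll": ["look-down"] * 6 + ["look-up"] * 6,  # even split
}
print(agreement(tasks["select"]))  # (8/12)^2 + (4/12)^2 ≈ 0.556
print(mean_agreement(tasks))       # (0.556 + 0.5) / 2 ≈ 0.528
```

The most frequent proposal per task (here, "blink" for select) then enters the consensus set, with the agreement score indicating how strongly users converged on it.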

Keywords

Eye movement-based interaction · Gaze gesture · Virtual reality · Guessability · Intuitive interaction

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Wen-jun Hou (1, 2)
  • Kai-xiang Chen (1, 2), email author
  • Hao Li (1, 2)
  • Hu Zhou (1, 2)
  1. School of Digital Media and Design Arts, Beijing University of Posts and Telecommunications, Beijing, China
  2. Beijing Key Laboratory of Network Systems and Network Culture, Beijing University of Posts and Telecommunications, Beijing, China