GazeAR: Mobile Gaze-Based Interaction in the Context of Augmented Reality Games

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9768)

Abstract

Gaze-based interaction in the gaming context offers various research opportunities. However, looking at available games supported by eye-tracking technology, it becomes apparent that this potential has not been fully exploited: the majority of gaze-based games are tailored to static settings (desktop PCs). We propose an experimental setting that transfers approaches from mobile gaze-based interaction to the domain of augmented reality (AR) games. Our main aim is to find out whether the inclusion of gaze input in an AR game has a positive impact on the user experience (UX) compared to a solely touch-based approach. In doing so, we intend to provide designers and researchers with insights into the design of gaze-based mobile AR games. To find answers, we carried out a comparative study involving two mobile game prototypes. Results show that the inclusion of gaze in AR games is very well received by players, and that this novel approach was preferred over a design without gaze interaction.

Keywords

Gaze-based interaction · Augmented reality games


Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. Department of Digital Media, University of Applied Sciences Upper Austria, Hagenberg, Austria
  2. Department of Education and Psychology, Johannes Kepler University Linz, Linz, Austria
