Abstract
Gaze-based interaction in the gaming context offers various research opportunities. However, when looking at available games supported by eye-tracking technology, it becomes apparent that this potential has not been fully exploited: the majority of gaze-based games are tailored for static settings (desktop PCs). We propose an experimental setting that transfers approaches from mobile gaze-based interaction to the domain of augmented reality (AR) games. Our main aim is to find out whether the inclusion of gaze input in an AR game has a positive impact on the user experience (UX) compared to a solely touch-based approach. In doing so, we seek to provide designers and researchers with insights into the design of gaze-based mobile AR games. To answer this question, we carried out a comparative study consisting of two mobile game prototypes. Results show that the inclusion of gaze in AR games is very well received by players, and that this novel approach was preferred over a design without gaze interaction.
Copyright information
© 2016 Springer International Publishing Switzerland
Cite this paper
Lankes, M., Stiglbauer, B. (2016). GazeAR: Mobile Gaze-Based Interaction in the Context of Augmented Reality Games. In: De Paolis, L., Mongelli, A. (eds) Augmented Reality, Virtual Reality, and Computer Graphics. AVR 2016. Lecture Notes in Computer Science(), vol 9768. Springer, Cham. https://doi.org/10.1007/978-3-319-40621-3_28
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-40620-6
Online ISBN: 978-3-319-40621-3