EyeMote – Towards Context-Aware Gaming Using Eye Movements Recorded from Wearable Electrooculography
Physical activity has emerged as a novel input modality for so-called active video games. Input devices such as musical instruments, dance mats or the Wii accessories allow for novel ways of interaction and a more immersive gaming experience. In this work we describe how eye movements recognised from electrooculographic (EOG) signals can be used for gaming purposes in three different scenarios. In contrast to common video-based systems, EOG can be implemented as a wearable and lightweight system which allows for long-term use with unconstrained simultaneous physical activity. In a stationary computer game we show that eye gestures of varying complexity can be recognised online with performance equal to that of a state-of-the-art video-based system. For pervasive gaming scenarios, we show how eye movements can be recognised in the presence of signal artefacts caused by physical activity such as walking. Finally, we describe possible future context-aware games which exploit unconscious eye movements and discuss the possibilities this new input modality may open up.
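To illustrate the kind of gesture recognition the abstract refers to, the following is a minimal sketch of threshold-based saccade detection on a two-channel EOG signal. All names, thresholds and the synthetic data are illustrative assumptions, not the paper's actual algorithm; it assumes pre-filtered horizontal and vertical channels sampled uniformly, with large step changes between fixations corresponding to saccades.

```python
def detect_saccades(h, v, threshold=50.0):
    """Map large sample-to-sample jumps in the EOG channels to direction
    symbols: L/R for the horizontal channel, U/D for the vertical one.
    (Hypothetical sketch; real EOG processing needs filtering, blink and
    motion-artefact removal first.)"""
    events = []
    for i in range(1, len(h)):
        dh = h[i] - h[i - 1]
        dv = v[i] - v[i - 1]
        if abs(dh) >= abs(dv) and abs(dh) > threshold:
            events.append('R' if dh > 0 else 'L')
        elif abs(dv) > threshold:
            events.append('U' if dv > 0 else 'D')
    # Collapse repeats so a saccade spanning several samples yields one symbol.
    collapsed = [s for i, s in enumerate(events) if i == 0 or s != events[i - 1]]
    return ''.join(collapsed)

# Synthetic signal: the eye looks right, then up, then back left.
h = [0] * 5 + [100] * 10 + [0] * 5
v = [0] * 10 + [100] * 10
print(detect_saccades(h, v))  # 'RUL'
```

The resulting symbol string could then be matched against gesture templates (e.g. 'RUL' for a triangle-like gesture) to trigger game commands, which is the general idea behind gesture-based game input.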
Keywords: Input Modality · Signal Artefact · Pervasive Game · Gaming Application · Game Interface