
Closed-Eye Gaze Gestures: Detection and Recognition of Closed-Eye Movements with Cameras in Smart Glasses

  • Rainhard Dieter Findling (corresponding author)
  • Le Ngu Nguyen
  • Stephan Sigg
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11506)

Abstract

Gaze gestures bear potential for user input with mobile devices, especially smart glasses, as they are always available and hands-free. So far, gaze gesture recognition approaches have utilized open-eye movements only and disregarded closed-eye movements. This paper presents a first investigation of the feasibility of detecting and recognizing closed-eye gaze gestures from close-up optical sources, e.g. eye-facing cameras embedded in smart glasses. We propose four different closed-eye gaze gesture protocols, which extend the alphabet of existing open-eye gaze gesture approaches. We further propose a methodology for detecting and extracting the corresponding closed-eye movements with full optical flow, time series processing, and machine learning. In the evaluation of the four protocols, we find closed-eye gaze gestures to be detected 82.8%–91.6% of the time, and extracted gestures to be recognized correctly with an accuracy of 92.9%–99.2%.
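The abstract describes the pipeline only at a high level: dense ("full") optical flow over eye-facing camera frames, time series processing, and a learned classifier. The sketch below illustrates that kind of pipeline; it is not the authors' implementation. It assumes OpenCV, NumPy, and scikit-learn are available, and the helper names (flow_time_series, gesture_features, train_classifier), the Farnebäck parameters, and the random forest classifier are illustrative choices.

```python
# Illustrative sketch only (not the paper's implementation): estimate closed-eye
# movement from eye-facing camera frames with dense optical flow, summarize the
# resulting motion time series, and classify it with an off-the-shelf model.
import cv2
import numpy as np
from sklearn.ensemble import RandomForestClassifier


def flow_time_series(frames):
    """Return one (dx, dy) sample per frame pair: the mean dense optical flow."""
    series = []
    prev = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
    for frame in frames[1:]:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Farnebäck dense optical flow between consecutive frames (standard parameters).
        flow = cv2.calcOpticalFlowFarneback(prev, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        series.append(flow.reshape(-1, 2).mean(axis=0))  # average motion vector
        prev = gray
    return np.asarray(series)  # shape: (n_frames - 1, 2)


def gesture_features(series):
    """Fixed-length summary features of a motion time series (illustrative choice)."""
    return np.concatenate([series.mean(axis=0),
                           series.std(axis=0),
                           np.abs(series).sum(axis=0)])


def train_classifier(feature_matrix, labels):
    """Fit a generic classifier on labelled gesture features (illustrative choice)."""
    return RandomForestClassifier(n_estimators=100, random_state=0).fit(feature_matrix, labels)


# Hypothetical usage, given labelled training recordings and a new recording:
# clf = train_classifier(X_train, y_train)
# label = clf.predict([gesture_features(flow_time_series(recorded_frames))])
```

Note that the paper's methodology also covers detecting when a gesture occurs before extracting it; that detection step is omitted from this sketch.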

Keywords

Closed eyes · Gaze gestures · Machine learning · Mobile computing · Recognition · Smart glasses · Time series analysis


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Ambient Intelligence Group, Department of Communications and Networking, Aalto University, Espoo, Finland
