
Exploring Eye Expressions for Enhancing EOG-Based Interaction

  • Conference paper
Human-Computer Interaction – INTERACT 2023 (INTERACT 2023)

Abstract

This paper explores the classification of eye expressions for EOG-based interaction using JINS MEME, an off-the-shelf eye-tracking device. Previous studies have demonstrated the potential of electrooculography (EOG) for hands-free human-computer interaction via eye movements (directional, smooth pursuit) and eye expressions (blinking, winking). We collected a comprehensive set of 14 eye gestures to explore how well both types of eye gesture can be classified together in a machine learning model. Using a Random Forest classifier trained on 15 engineered features extracted from our collected data, we obtained an overall classification performance of 0.77 (AUC). Our results show that eye expressions can be reliably classified, enhancing the range of eye gestures available for hands-free interaction. With continued development and refinement of EOG-based technology, our findings have long-term implications for improving its usability in general and for individuals who require a richer vocabulary of eye gestures to interact hands-free.
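The classification pipeline summarized above can be sketched as follows. Note that the specific feature set (per-channel window statistics plus a channel-correlation term) and the synthetic gesture data are illustrative assumptions for this sketch only; the paper's actual 15 engineered features and recorded EOG data are not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)

def extract_features(window):
    """Illustrative statistical features from a 2-channel EOG window
    (horizontal, vertical). Hypothetical: the paper's 15 features differ."""
    feats = []
    for ch in window:  # 7 statistics per channel
        feats += [ch.mean(), ch.std(), ch.min(), ch.max(),
                  np.ptp(ch), np.abs(np.diff(ch)).mean(), np.median(ch)]
    feats.append(np.corrcoef(window[0], window[1])[0, 1])  # cross-channel correlation
    return np.array(feats)  # 2 * 7 + 1 = 15 features

# Synthetic stand-in for segmented gesture windows: 14 gesture classes, 50 windows each
n_classes, n_per, win_len = 14, 50, 100
X, y = [], []
for c in range(n_classes):
    for _ in range(n_per):
        base = np.sin(np.linspace(0, (c + 1) * np.pi, win_len))
        window = np.stack([base + 0.3 * rng.standard_normal(win_len),
                           -base + 0.3 * rng.standard_normal(win_len)])
        X.append(extract_features(window))
        y.append(c)
X, y = np.array(X), np.array(y)

# Train a Random Forest and score with macro one-vs-rest AUC, as in the paper
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te), multi_class="ovr")
print(f"macro one-vs-rest AUC: {auc:.2f}")
```

A Random Forest is a reasonable choice here because engineered statistical features need no scaling and the model handles multi-class problems directly; AUC is computed one-vs-rest to aggregate over the 14 gesture classes.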


Notes

  1. https://jinsmeme.com/.


Acknowledgements

This work was supported by the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (Grant No. 101021229, GEMINI: Gaze and Eye Movement in Interaction).

Author information

Correspondence to Joshua Newn.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Newn, J., Quesada, S., Hou, B.J., Khan, A.A., Weidner, F., Gellersen, H. (2023). Exploring Eye Expressions for Enhancing EOG-Based Interaction. In: Abdelnour Nocera, J., Kristín Lárusdóttir, M., Petrie, H., Piccinno, A., Winckler, M. (eds) Human-Computer Interaction – INTERACT 2023. INTERACT 2023. Lecture Notes in Computer Science, vol 14145. Springer, Cham. https://doi.org/10.1007/978-3-031-42293-5_6


  • DOI: https://doi.org/10.1007/978-3-031-42293-5_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-42292-8

  • Online ISBN: 978-3-031-42293-5

  • eBook Packages: Computer Science (R0)
