Abstract
Purpose This paper addresses the virtual workshop scenario, discussing the creation of a feasible eye movement recognition model and providing new ideas for eye movement recognition and eye movement interaction in the virtual workshop. Method A user's eye movement data are obtained in real time through an eye tracker, and a blink behavior recognition model and a gaze/saccade behavior recognition model are created based on a convolutional neural network (CNN) and a bidirectional long short-term memory network (Bi-LSTM). A data set was created by collecting eye movement data from three subjects for comparative experiments to verify the performance of the proposed model. Results The experimental results showed that the recognition accuracy, Kappa coefficient, F1 score, and running time of the proposed model had certain advantages over other models. Conclusion The proposed hybrid eye movement recognition model for virtual workshops has high reliability and effectiveness, laying a foundation for future research on building eye movement interaction systems.
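The CNN + Bi-LSTM hybrid described in the Method can be sketched as follows. This is a minimal illustration, not the authors' exact architecture: the layer sizes, the three input features (assumed here to be gaze x/y coordinates and pupil diameter), and the three output classes (standing in for blink, gaze, and saccade) are all assumptions for the sake of a runnable example. A 1-D convolution first extracts local temporal patterns from the raw eye-tracker signal, and a bidirectional LSTM then models longer-range dependencies before classification.

```python
# Hypothetical sketch of a CNN + Bi-LSTM classifier for fixed-length
# eye-movement sequences; layer sizes and feature/class counts are assumed.
import torch
import torch.nn as nn

class CnnBiLstm(nn.Module):
    def __init__(self, n_features=3, n_classes=3,
                 conv_channels=16, lstm_hidden=32):
        super().__init__()
        # 1-D convolution over time extracts local patterns from the signal
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, conv_channels, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # Bi-LSTM captures longer-range temporal context in both directions
        self.lstm = nn.LSTM(conv_channels, lstm_hidden,
                            batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * lstm_hidden, n_classes)

    def forward(self, x):                    # x: (batch, time, features)
        x = self.conv(x.transpose(1, 2))     # -> (batch, channels, time/2)
        x, _ = self.lstm(x.transpose(1, 2))  # -> (batch, time/2, 2*hidden)
        return self.fc(x[:, -1, :])          # classify from the final step

model = CnnBiLstm()
logits = model(torch.randn(4, 60, 3))  # 4 samples, 60 time steps, 3 features
print(logits.shape)                    # torch.Size([4, 3])
```

In practice the blink model and the gaze/saccade model described in the paper could each be an instance of such a network trained on its own labeled windows of eye-tracker data.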
Acknowledgements
The work is supported by Shanghai Municipal Commission of Economy and Informatization of China (No. 2018-GYHLW-02009).
Compliance with Ethical Standards
The study was approved by the Logistics Department for Civilian Ethics Committee of Shanghai University.
All subjects who participated in the experiment were provided with and signed an informed consent form.
All relevant ethical safeguards have been met with regard to subject protection.
Copyright information
© 2020 The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Dong, M., Gao, Z., Liu, L. (2020). Hybrid Model of Eye Movement Behavior Recognition for Virtual Workshop. In: Long, S., Dhillon, B.S. (eds) Man-Machine-Environment System Engineering. MMESE 2020. Lecture Notes in Electrical Engineering, vol 645. Springer, Singapore. https://doi.org/10.1007/978-981-15-6978-4_27
DOI: https://doi.org/10.1007/978-981-15-6978-4_27
Publisher Name: Springer, Singapore
Print ISBN: 978-981-15-6977-7
Online ISBN: 978-981-15-6978-4
eBook Packages: Computer Science (R0)