
Hybrid Model of Eye Movement Behavior Recognition for Virtual Workshop

  • Conference paper
Man-Machine-Environment System Engineering (MMESE 2020)

Part of the book series: Lecture Notes in Electrical Engineering (LNEE, volume 645)

Abstract

Purpose: This paper addresses the virtual workshop scenario, discusses the creation of a feasible eye movement recognition model, and provides new ideas for eye movement recognition and eye movement interaction in the virtual workshop.
Method: The user's eye movement data are obtained in real time through an eye tracker, and a blink behavior recognition model and a gaze/saccade behavior recognition model are built on a convolutional neural network (CNN) and a bidirectional long short-term memory network (Bi-LSTM). A data set was created by collecting eye movement data from three subjects, and comparative experiments were conducted to verify the performance of the proposed model.
Results: The experimental results showed that the proposed model had certain advantages over other models in recognition accuracy, Kappa coefficient, F1 score, and running time.
Conclusion: The proposed hybrid eye movement recognition model for virtual workshops is highly reliable and effective, laying a foundation for future research on building eye movement interaction systems.
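
The method summary above (a CNN front end feeding a Bi-LSTM over windows of eye-tracker samples) can be illustrated with a minimal sketch. Everything below is an assumption for illustration rather than the authors' configuration: the three input features (gaze x, gaze y, pupil diameter), the 120-sample window, the layer sizes, and the three behavior classes (blink, gaze/fixation, saccade); PyTorch is used only as convenient notation.

# Minimal sketch of a CNN + Bi-LSTM eye movement behavior classifier,
# loosely following the hybrid architecture described in the abstract.
# Layer sizes, input features, and class labels are illustrative assumptions.
import torch
import torch.nn as nn

class EyeMovementCNNBiLSTM(nn.Module):
    def __init__(self, in_features=3, num_classes=3, hidden_size=64):
        super().__init__()
        # 1D convolutions extract local temporal features from the raw
        # eye-tracker signal (batch x features x time).
        self.cnn = nn.Sequential(
            nn.Conv1d(in_features, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # A bidirectional LSTM models longer-range temporal context
        # over the CNN feature sequence in both directions.
        self.bilstm = nn.LSTM(input_size=64, hidden_size=hidden_size,
                              batch_first=True, bidirectional=True)
        # Map the final bidirectional state to behavior classes
        # (e.g. blink / gaze / saccade).
        self.classifier = nn.Linear(2 * hidden_size, num_classes)

    def forward(self, x):
        # x: (batch, time, features) -> (batch, features, time) for Conv1d
        feats = self.cnn(x.transpose(1, 2))
        # Back to (batch, time', channels) for the LSTM
        out, _ = self.bilstm(feats.transpose(1, 2))
        # Classify the whole window from the last time step's output
        return self.classifier(out[:, -1, :])

if __name__ == "__main__":
    # A batch of 8 windows, 120 eye-tracker samples each, 3 features per
    # sample (gaze x, gaze y, pupil diameter) -- stand-in data only.
    model = EyeMovementCNNBiLSTM()
    logits = model(torch.randn(8, 120, 3))
    print(logits.shape)  # torch.Size([8, 3])

In practice such a window classifier would be fed from the eye tracker's real-time stream; the random tensor above only stands in for a batch of preprocessed windows.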

Acknowledgements

This work was supported by the Shanghai Municipal Commission of Economy and Informatization of China (No. 2018-GYHLW-02009).

Compliance with Ethical Standards

The study was approved by the Logistics Department for Civilian Ethics Committee of Shanghai University.

All subjects who participated in the experiment were provided with and signed an informed consent form.

All relevant ethical safeguards have been met with regard to subject protection.

Author information

Corresponding author

Correspondence to Zenggui Gao.

Copyright information

© 2020 The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Dong, M., Gao, Z., Liu, L. (2020). Hybrid Model of Eye Movement Behavior Recognition for Virtual Workshop. In: Long, S., Dhillon, B.S. (eds) Man-Machine-Environment System Engineering. MMESE 2020. Lecture Notes in Electrical Engineering, vol 645. Springer, Singapore. https://doi.org/10.1007/978-981-15-6978-4_27

  • DOI: https://doi.org/10.1007/978-981-15-6978-4_27

  • Published:

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-15-6977-7

  • Online ISBN: 978-981-15-6978-4

  • eBook Packages: Computer Science, Computer Science (R0)
