
Assessing hands-free interactions for VR using eye gaze and electromyography

Abstract

With the increasing popularity of virtual reality (VR) technologies, growing effort has gone into developing new input methods. While physical controllers are widely used, more novel techniques, such as eye tracking, are now commercially available. In our work, we investigate the use of physiological signals as input to enhance VR experiences. We present a system that uses eye-gaze tracking together with electromyography (EMG) on the user's forearm to make selection tasks in virtual spaces more efficient. In a study with 16 participants, we compared five input techniques in a Fitts' law task: using gaze tracking for cursor movement in combination with forearm contractions for selection was superior to using an HTC Vive controller, an Xbox gamepad, dwell time, and eye-gaze dwell time. To explore application scenarios and collect qualitative feedback, we further developed and evaluated a game based on our input technique. Our findings inform the design of applications that use eye-gaze tracking and forearm muscle movements for effective user input in VR.
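The selection mechanic described above can be summarized in a short sketch: the eye tracker supplies a continuous cursor position, a thresholded forearm-EMG envelope supplies the discrete "click", and target difficulty in the Fitts' law task is captured by the index of difficulty ID = log2(D/W + 1), where D is the distance to the target and W its width. The Python sketch below is illustrative only; the names and values (GazeEmgSelector, EMG_THRESHOLD, the 0.3 s debounce) are our assumptions, not the authors' implementation.

    import math

    EMG_THRESHOLD = 0.6  # normalized RMS of the EMG envelope; assumed calibration value
    DEBOUNCE_S = 0.3     # refractory period so one contraction yields one selection

    def fitts_id(distance, width):
        """Shannon formulation of the Fitts' law index of difficulty, in bits."""
        return math.log2(distance / width + 1.0)

    class GazeEmgSelector:
        """Gaze moves the cursor; a forearm EMG contraction confirms the selection."""

        def __init__(self):
            self.last_trigger = -DEBOUNCE_S

        def update(self, gaze_xy, emg_rms, t):
            """Return the selected point when a contraction fires, else None.

            gaze_xy: current gaze position on the UI plane (the cursor follows it)
            emg_rms: normalized RMS of the forearm EMG signal, in [0, 1]
            t: current time in seconds
            """
            if emg_rms >= EMG_THRESHOLD and t - self.last_trigger >= DEBOUNCE_S:
                self.last_trigger = t
                return gaze_xy  # the contraction selects whatever the user is looking at
            return None

In an ISO 9241-9-style analysis, throughput is then the index of difficulty divided by the measured movement time, averaged over trials, which puts all five techniques on a common scale.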

Acknowledgements

This work was supported by JSPS KAKENHI Grant Number 18H03278.

Author information

Correspondence to Yun Suen Pai.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (mp4 37248 KB)

Cite this article

Pai, Y.S., Dingler, T. & Kunze, K. Assessing hands-free interactions for VR using eye gaze and electromyography. Virtual Reality 23, 119–131 (2019). https://doi.org/10.1007/s10055-018-0371-2

Keywords

  • Virtual reality
  • Physiological sensing
  • Eye gaze
  • Electromyography