Development of a user evaluation system in virtual reality based on eye-tracking technology

Published in Multimedia Tools and Applications (2023)

Abstract

This study proposes a game-engine-based user evaluation system in which a user's areas or objects of interest are measured with an eye-tracking device. Eye-tracking technology, which is widely used in fields such as human–computer interaction, user experience research, and marketing research, measures where a person directs their gaze and thus what they attend to. However, building a real-world test environment and physical models for user evaluation requires considerable time and money, so user evaluation studies using virtual reality (VR) are being actively conducted. The VR user evaluation system proposed in this study was developed with the Unity game engine and an HTC VIVE Pro Eye, a head-mounted display with a built-in eye tracker. An eye-tracking algorithm suited to the VR environment was developed, and object-based and surface-based eye-tracking and visualization techniques were applied. Moreover, the system supports component-based programming so that an investigator can easily set up a test environment.
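As a rough illustration of the object-based eye-tracking step described above, the following Unity C# sketch intersects a gaze ray with scene colliders and accumulates per-object dwell time. It is not the authors' implementation: the class name GazeDwellTracker, the gaze source GetGazeRay(), and the maxGazeDistance parameter are hypothetical placeholders, and a real system would obtain the gaze direction from the headset vendor's eye-tracking SDK rather than the camera's forward vector.

// Minimal sketch (not the paper's implementation): object-based gaze dwell
// tracking in Unity. Assumes an eye-tracking SDK supplies a world-space gaze
// ray each frame; that source is abstracted behind GetGazeRay().
using System.Collections.Generic;
using UnityEngine;

public class GazeDwellTracker : MonoBehaviour
{
    // Maximum distance for the gaze raycast, in scene units (assumed value).
    public float maxGazeDistance = 20f;

    // Accumulated dwell time, in seconds, per gazed-at object.
    private readonly Dictionary<GameObject, float> dwellTimes =
        new Dictionary<GameObject, float>();

    void Update()
    {
        Ray gazeRay = GetGazeRay();

        // Object-based eye tracking: intersect the gaze ray with scene
        // colliders and attribute this frame's duration to the hit object.
        if (Physics.Raycast(gazeRay, out RaycastHit hit, maxGazeDistance))
        {
            GameObject target = hit.collider.gameObject;
            dwellTimes.TryGetValue(target, out float t);
            dwellTimes[target] = t + Time.deltaTime;
        }
    }

    // Placeholder gaze source: falls back to the head (camera) forward
    // direction. A real system would combine the HMD pose with the gaze
    // direction reported by the eye tracker's SDK.
    private Ray GetGazeRay()
    {
        Transform cam = Camera.main.transform;
        return new Ray(cam.position, cam.forward);
    }

    // Expose accumulated dwell times so an evaluation script can log or
    // visualize them (e.g., as a per-object heat value).
    public IReadOnlyDictionary<GameObject, float> DwellTimes => dwellTimes;
}

A surface-based variant could record the hit point (or texture coordinate) on the gazed-at surface instead of the object identity, which is the kind of data a surface-level gaze heatmap visualization would aggregate.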



Acknowledgments

This research was supported by the Institute of Information and Communications Technology Planning and Evaluation (IITP) grant funded by the Korean government (MSIT) (No. 2021-0-00986, Development of Interaction Technology to Maximize Realization of Virtual Reality Contents using Multimodal Sensory Interface) and by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (NRF-2020R1I1A3051739).

Author information

Corresponding author

Correspondence to SangHun Nam.

Ethics declarations

Conflict of interest

The authors declare no conflict of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Nam, S., Choi, J. Development of a user evaluation system in virtual reality based on eye-tracking technology. Multimed Tools Appl 82, 21117–21130 (2023). https://doi.org/10.1007/s11042-023-14583-y

