
Evaluation of Webcam-Based Eye Tracking for a Job Interview Training Platform: Preliminary Results

  • Conference paper
  • First Online:
Artificial Intelligence in HCI (HCII 2022)

Abstract

In job interviews, eye gaze toward the interviewer is an important non-verbal behavior and is considered an indicator of a candidate's hirability. Several virtual job interview training platforms include eye trackers to measure gaze and provide feedback on performance. Though useful, these dedicated eye-tracking devices are often expensive and not widely accessible. In this article, we review several camera-based eye tracking methods and implement a webcam-based eye tracking algorithm to assess its suitability for integration into virtual job interview simulation platforms. We further use the gaze predictions to detect interview-relevant regions of interest (ROIs). Our study with 12 participants, 7 with eyeglasses and 5 without, shows that eyeglasses have no significant effect on mean calibration error. The ROI detection results, however, reveal a limitation: participants must maintain the same head position and distance from the camera across tasks after calibration. Overall, webcam-based eye tracking shows potential for integration into virtual job interview training environments.
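The ROI detection step described in the abstract can be sketched as a hit test of per-frame gaze predictions against labeled screen regions. The `ROI` class, the region coordinates, and the `classify_gaze` helper below are illustrative assumptions for exposition, not the authors' implementation:

```python
from dataclasses import dataclass


@dataclass
class ROI:
    """A labeled rectangular region in normalized screen coordinates (0-1)."""
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, gx: float, gy: float) -> bool:
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h


def classify_gaze(points, rois):
    """Count gaze samples falling in each ROI; unmatched samples go to 'off_roi'."""
    counts = {r.name: 0 for r in rois}
    counts["off_roi"] = 0
    for gx, gy in points:
        for r in rois:
            if r.contains(gx, gy):
                counts[r.name] += 1
                break
        else:
            counts["off_roi"] += 1
    return counts


# Hypothetical example: an "interviewer face" region in the upper-middle of the
# screen, and two gaze samples (one on the face, one in a corner).
rois = [ROI("interviewer_face", x=0.4, y=0.1, w=0.2, h=0.3)]
print(classify_gaze([(0.5, 0.2), (0.95, 0.95)], rois))
```

A fraction such as `counts["interviewer_face"] / len(points)` could then serve as a simple per-task eye-contact score, provided the calibration-dependent head position noted in the results is held constant.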



Author information

Corresponding author: Correspondence to Deeksha Adiani.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Adiani, D. et al. (2022). Evaluation of Webcam-Based Eye Tracking for a Job Interview Training Platform: Preliminary Results. In: Degen, H., Ntoa, S. (eds) Artificial Intelligence in HCI. HCII 2022. Lecture Notes in Computer Science, vol. 13336. Springer, Cham. https://doi.org/10.1007/978-3-031-05643-7_22


  • DOI: https://doi.org/10.1007/978-3-031-05643-7_22

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-05642-0

  • Online ISBN: 978-3-031-05643-7

  • eBook Packages: Computer Science; Computer Science (R0)
