Detecting Mental Fatigue from Eye-Tracking Data Gathered While Watching Video

  • Yasunori Yamada
  • Masatomo Kobayashi
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10259)

Abstract

Monitoring mental fatigue is of increasing importance for improving cognitive performance and health outcomes. Previous models using eye-tracking data allow inference of fatigue during cognitive tasks, such as driving, but they require the individual to be engaged in a specific cognitive task. A model capable of estimating fatigue from eye-tracking data in natural-viewing situations, when an individual is not performing a cognitive task, would have many potential applications. Here, we collected eye-tracking data from 18 adults as they watched video clips (simulating the situation of watching TV programs) before and after performing cognitive tasks. Using these data, we built a fatigue-detection model that includes novel feature sets and an automated feature selection method. With eye-tracking data from an individual watching only 30 seconds' worth of video, our model could determine whether that person was fatigued with 91.0% accuracy in 10-fold cross-validation (chance level: 50%). Through a comparison with a model incorporating the feature sets used in previous studies, we showed that our model improved detection accuracy by up to 13.9% (from 77.1% to 91.0%).
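The abstract describes a binary classifier trained on eye-movement features, an automated feature selection step, and evaluation by 10-fold cross-validation. The following is a minimal sketch of that kind of pipeline, not the authors' code: it assumes an SVM classifier with recursive feature elimination standing in for the paper's feature selection method, and the feature matrix and labels below are placeholders, not the paper's data.

```python
# Hypothetical sketch of an SVM + recursive-feature-elimination pipeline
# evaluated with 10-fold cross-validation (assumed setup, not the paper's code).
import numpy as np
from sklearn.svm import SVC
from sklearn.feature_selection import RFECV
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# X: one row per 30-second viewing segment, columns are eye-movement features
# (e.g., blink rate, saccade amplitude, fixation duration, saliency-based features).
# y: 1 = recorded after the fatiguing cognitive tasks, 0 = recorded before.
rng = np.random.default_rng(0)
X = rng.normal(size=(180, 40))      # placeholder feature matrix
y = rng.integers(0, 2, size=180)    # placeholder fatigue labels

pipeline = Pipeline([
    ("scale", StandardScaler()),
    # RFECV repeatedly drops the least informative features,
    # ranked by the linear SVM's weights.
    ("select", RFECV(SVC(kernel="linear", C=1.0), step=1, cv=5)),
    ("clf", SVC(kernel="linear", C=1.0)),
])

scores = cross_val_score(
    pipeline, X, y,
    cv=StratifiedKFold(n_splits=10, shuffle=True, random_state=0),
)
print(f"10-fold CV accuracy: {scores.mean():.3f} (chance = 0.5)")
```

With real features, the selection step inside the pipeline runs on each training fold only, so the reported cross-validation accuracy is not inflated by selecting features on the test data.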

Keywords

Mental fatigue · Cognitive fatigue · Feature selection · Natural viewing · Free viewing · Visual attention model

Notes

Acknowledgments

This research was partially supported by the Japan Science and Technology Agency (JST) under the Strategic Promotion of Innovative Research and Development Program.

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. IBM Research - Tokyo, Tokyo, Japan