Real Time Assessment of Cognitive State: Research and Implementation Challenges

  • Michael C. Trumbo (email author)
  • Mikaela L. Armenta
  • Michael J. Haass
  • Karin M. Butler
  • Aaron P. Jones
  • Charles S. H. Robinson
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9744)


Inferring an individual's cognitive state in real time during task performance allows corrective measures to be implemented before an error occurs. Current technology supports real-time cognitive state assessment based on objective physiological data obtained through techniques such as neuroimaging and eye tracking. Although early results indicate that classifiers distinguishing between cognitive states in real time can be constructed effectively in some settings, deploying these classifiers in real-world settings poses a number of challenges. Cognitive states of interest must be sufficiently distinct to allow continuous discrimination in the operational environment, using technology that is both currently available and practical to implement.
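As an illustration of the kind of real-time pipeline the abstract describes, the sketch below classifies a sliding window of eye-tracking features with a pre-trained linear model. This is a minimal sketch only: the feature names, window size, weights, and state labels are hypothetical assumptions for illustration, not taken from this paper.

```python
# Illustrative sketch: streaming binary classification of cognitive state
# from eye-tracking features. Weights, features, and labels are hypothetical.
from collections import deque

# Hypothetical pre-trained linear model:
# score = w0 * mean_fixation_ms + w1 * mean_pupil_mm + bias
WEIGHTS = (0.002, 0.5, -2.5)

def classify_window(fixations_ms, pupils_mm):
    """Return 'overload' if the linear score exceeds 0, else 'nominal'."""
    mean_fix = sum(fixations_ms) / len(fixations_ms)
    mean_pup = sum(pupils_mm) / len(pupils_mm)
    score = WEIGHTS[0] * mean_fix + WEIGHTS[1] * mean_pup + WEIGHTS[2]
    return "overload" if score > 0 else "nominal"

# Sliding window over a simulated sample stream (most recent 60 samples).
fix_win, pup_win = deque(maxlen=60), deque(maxlen=60)
for fix_ms, pupil_mm in [(250, 3.1), (900, 4.5), (1100, 4.8)]:
    fix_win.append(fix_ms)
    pup_win.append(pupil_mm)
    state = classify_window(fix_win, pup_win)
```

In a deployed system, the classification step would trigger the corrective measure (e.g., adaptive automation or feedback) before an error occurs, which is the point the abstract makes about acting on state estimates in real time.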


Keywords: Cognitive state · Real time · Eye tracking · Attention



We wish to acknowledge James D. Morrow of Sandia National Laboratories, Albuquerque, New Mexico, for creating the software used in our study to display the visual stimuli and record subject responses.

Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration under contract DE-AC04-94AL85000.



Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Michael C. Trumbo¹ (email author)
  • Mikaela L. Armenta¹
  • Michael J. Haass¹
  • Karin M. Butler¹
  • Aaron P. Jones²
  • Charles S. H. Robinson²

  1. Sandia National Laboratories, Albuquerque, USA
  2. University of New Mexico, Albuquerque, USA
