Near-Instantaneous Classification of Perceptual States from Cortical Surface Recordings
Human visual processing is so complex that, despite decades of focused research, many basic questions remain unanswered. Although the inferotemporal cortex is known to be a key region in object recognition, its physiologic role is not fully understood, nor do we have a full set of tools with which to explore the question. Here we show that electrical potentials recorded from the surface of the human brain contain enough information to decode a subject's perceptual state accurately and with fine temporal precision. Electrocorticographic (ECoG) arrays were placed over the inferotemporal cortical areas of seven subjects. Pictures of faces and houses were presented briefly while each subject performed a simple visual task. Two well-known types of brain signal, the event-averaged broadband power and the event-averaged raw potential, could be used independently or together to classify the presented image. When applied to continuously recorded brain activity, our decoding technique accurately predicted whether each stimulus was a face, a house, or neither, with a timing error of roughly 20 ms. These results provide a roadmap for improved brain-computer interfacing tools to help neurosurgeons, research scientists, engineers, and, ultimately, patients.
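The classification idea described above can be illustrated with a toy template-matching sketch. This is not the authors' pipeline: the signal shapes, noise levels, window length, and correlation-based classifier here are all invented for illustration, standing in for the event-averaged feature templates the abstract refers to.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each trial is a short post-stimulus window from one
# ECoG channel, and each stimulus class (face / house) has a characteristic
# event-averaged time course. All shapes below are synthetic.
n_time = 100  # samples per trial window (assumed)
t = np.linspace(0, 1, n_time, endpoint=False)

erp_face = np.sin(2 * np.pi * 3 * t)          # toy "face" template
erp_house = np.sin(2 * np.pi * 3 * t + 1.0)   # toy "house" template (shifted)

def make_trials(template, n_trials, noise=0.5):
    """Generate noisy single trials around a class template."""
    return template + noise * rng.standard_normal((n_trials, template.size))

# Templates are estimated as event averages over training trials,
# mirroring the event-averaged features named in the abstract.
tmpl_face = make_trials(erp_face, 20).mean(axis=0)
tmpl_house = make_trials(erp_house, 20).mean(axis=0)

def classify(trial):
    """Assign the label whose template correlates best with the trial."""
    r_face = np.corrcoef(trial, tmpl_face)[0, 1]
    r_house = np.corrcoef(trial, tmpl_house)[0, 1]
    return "face" if r_face > r_house else "house"

# Evaluate on held-out synthetic trials.
test_face = make_trials(erp_face, 50)
test_house = make_trials(erp_house, 50)
preds = [classify(tr) for tr in test_face] + [classify(tr) for tr in test_house]
truth = ["face"] * 50 + ["house"] * 50
accuracy = float(np.mean([p == y for p, y in zip(preds, truth)]))
print(f"accuracy: {accuracy:.2f}")
```

In the continuous-decoding setting the abstract mentions, the same template correlation would be evaluated in a sliding window over the ongoing recording, with a "neither" label assigned whenever no template matches well.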
Keywords: Human vision · Electrocorticography · Broadband power · Event-related potential · Fusiform cortex
We are grateful to the patients and staff at Harborview Hospital in Seattle, and for helpful discussions with Kalanit Grill-Spector and Bharathi Jagadeesh. This work was supported by the National Aeronautics and Space Administration Graduate Student Research Program (KJM), the NIH (R01-NS065186 (KJM, JGO, RPNR), T32-EY20485 (DH), R01-EB006356 (GS), R01-EB00856 (GS), and P41-EB018783 (GS)), the NSF (EEC-1028725 (RPNR)), the US Army Research Office (W911NF-07-1-0415 (GS), W911NF-08-1-0216 (GS), and W911NF-14-1-0440 (GS)), and Fondazione Neurone (GS).