
Cognitive, Affective, & Behavioral Neuroscience, Volume 19, Issue 1, pp 123–137

Spatial attention affects the early processing of neutral versus fearful faces when they are task-irrelevant: a classifier study of the EEG C1 component

  • David Acunzo
  • Graham MacKenzie
  • Mark C. W. van Rossum

Abstract

EEG studies suggest that the emotional content of visual stimuli is processed rapidly. In particular, the C1 component, which occurs up to 100 ms after stimulus onset and likely reflects activity in primary visual cortex (V1), has been reported to be sensitive to emotional faces. However, difficulties replicating these results have been reported. We hypothesized that the nature of the task and the attentional condition are key to reconciling the conflicting findings. We report three experiments measuring EEG activity in the C1 time range elicited by peripherally presented neutral and fearful faces under various attentional conditions: the faces were either spatially attended or unattended, and were either task-relevant or not. Using traditional event-related potential analysis, we found that the early activity changed depending on facial expression, attentional condition, and task. In addition, we trained classifiers to discriminate the different conditions from the EEG signals. Although the classifiers were not able to discriminate between facial expressions in any condition, they uncovered differences between spatially attended and unattended faces, but only when these were task-irrelevant. Moreover, this effect was present only for neutral faces. Our study provides further indication that attention and task are key parameters when measuring early differences between emotional and neutral visual stimuli.
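
For readers unfamiliar with the classifier (MVPA) approach mentioned above, the sketch below illustrates time-resolved decoding of two experimental conditions from epoched EEG data. It is a minimal illustration only, not the authors' actual pipeline: the data are synthetic, and the array shapes, classifier (logistic regression), and cross-validation settings are assumptions chosen for clarity.

```python
# Minimal sketch (not the authors' pipeline): time-resolved decoding of two
# conditions (e.g., spatially attended vs. unattended faces) from epoched EEG,
# using logistic regression with stratified cross-validation at each time point.
# Data shapes and epoch parameters below are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for epoched EEG: trials x channels x time samples.
n_trials, n_channels, n_times = 200, 64, 128           # e.g., 500-ms epochs at 256 Hz
X = rng.standard_normal((n_trials, n_channels, n_times))
y = rng.integers(0, 2, n_trials)                        # condition labels (0/1)

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)

# Decode separately at each time sample; chance accuracy is 0.5.
accuracy = np.empty(n_times)
for t in range(n_times):
    accuracy[t] = cross_val_score(clf, X[:, :, t], y, cv=cv,
                                  scoring="accuracy").mean()

print("Peak decoding accuracy: %.2f" % accuracy.max())
```

In such an analysis, above-chance accuracy at a given latency (here assessed against permutation or chance baselines in practice) indicates that the EEG signal at that time point carries information discriminating the two conditions.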

Keywords

Attention · C1 component · Classifiers · MVPA · EEG · Facial expression

Notes

Acknowledgements

DA was supported by the Doctoral Training Centre in Neuroinformatics at the University of Edinburgh (UK), which is co-funded by EPSRC/MRC grant EP/F500386/1 and BBSRC grant BB/F529254/1.

Supplementary material

ESM 1: 13415_2018_650_MOESM1_ESM.docx (DOCX, 30 kb)

Copyright information

© Psychonomic Society, Inc. 2018

Authors and Affiliations

  • David Acunzo (1)
  • Graham MacKenzie (2)
  • Mark C. W. van Rossum (3)

  1. CIMeC, University of Trento, Rovereto, Italy
  2. Psychology, Faculty of Natural Sciences, University of Stirling, Stirling, UK
  3. School of Psychology and School of Mathematical Sciences, University of Nottingham, Nottingham, UK
