A Novel Oddball Paradigm for Affective BCIs Using Emotional Faces as Stimuli

  • Qibin Zhao
  • Akinari Onishi
  • Yu Zhang
  • Jianting Cao
  • Liqing Zhang
  • Andrzej Cichocki
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7062)

Abstract

Studies of P300-based brain-computer interfaces (BCIs) have demonstrated that visual attention to an oddball event can enhance the event-related potential (ERP) time-locked to that event. However, it remained unclear whether the more sophisticated face-evoked potentials can also be modulated by related mental tasks. This study examined ERP responses to objects, faces, and emotional faces while subjects performed visual attention, face recognition, and categorization of emotional facial expressions, respectively, in an oddball paradigm. The results revealed significant differences between target and non-target ERPs for each paradigm. Furthermore, significant differences among the three mental tasks were observed for the vertex-positive potential (VPP) (p < 0.01) and the late positive potential (LPP) / P3b (p < 0.05) at centro-parietal regions, and for the N250 (p < 0.003) at occipito-temporal regions. The high classification performance for single-trial emotional face-related ERPs demonstrates that facial emotion processing can be used as a novel oddball paradigm for affective BCIs.
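
The paper itself provides no code; the following is a minimal illustrative sketch, not the authors' pipeline, of how single-trial target versus non-target ERP classification of the kind reported above is commonly set up (downsampled epoch amplitudes fed to a shrinkage-regularized LDA). All array shapes, variable names, and parameters (epochs, y, the assumed 0-600 ms window at 250 Hz) are assumptions for illustration; the random data stand in for real epoched EEG.

    # Minimal illustrative sketch (Python) -- not the authors' method.
    # Assumes epoched EEG as a NumPy array `epochs` of shape
    # (n_trials, n_channels, n_samples) and binary labels `y`
    # (1 = target emotional-face stimulus, 0 = non-target).
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n_trials, n_channels, n_samples = 200, 16, 150    # e.g. 0-600 ms at 250 Hz (assumed)
    epochs = rng.standard_normal((n_trials, n_channels, n_samples))  # placeholder for real data
    y = rng.integers(0, 2, size=n_trials)             # placeholder labels

    def erp_features(epochs, decim=10):
        # Downsample each epoch in time and flatten channels x time into one
        # feature vector, a common feature representation for ERP classification.
        return epochs[:, :, ::decim].reshape(len(epochs), -1)

    X = erp_features(epochs)

    # Shrinkage-regularized LDA is a standard single-trial ERP classifier.
    clf = make_pipeline(StandardScaler(),
                        LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto"))
    scores = cross_val_score(clf, X, y, cv=5)
    print("mean cross-validated accuracy: %.2f" % scores.mean())

In practice the amplitude features would be drawn from the post-stimulus window covering the VPP, N250, and LPP/P3b components discussed above, rather than from random placeholder data.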

Keywords

Brain-Computer Interface (BCI) · Event-Related Potential (ERP) · P300

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Qibin Zhao (1)
  • Akinari Onishi (1)
  • Yu Zhang (1, 2)
  • Jianting Cao (3)
  • Liqing Zhang (4)
  • Andrzej Cichocki (1)

  1. Laboratory for Advanced Brain Signal Processing, Brain Science Institute, RIKEN, Saitama, Japan
  2. School of Control Science and Engineering, East China University of Science and Technology, Shanghai, China
  3. Saitama Institute of Technology, Saitama, Japan
  4. MOE-Microsoft Laboratory for Intelligent Computing and Intelligent Systems, Department of Computer Science and Engineering, Shanghai Jiao Tong University, Shanghai, China