Exploiting Prior Neurophysiological Knowledge to Improve Brain Computer Interface Performance
Most EEG/MEG-based brain-computer interfaces (BCIs) employ machine learning techniques to discriminate between recorded data belonging to different classes. Usually, no neurophysiological knowledge is used within the classification algorithms. Here, a method is proposed that incorporates prior knowledge about the locations of the sources of imagined left- and right-hand movement by projecting EEG/MEG data onto a subspace defined by modeled sources at the corresponding locations in somatosensory areas.
Three different source models are investigated. First, a single radial dipole on each side is used, assuming that both location and orientation are known; the two sides thus define a 2-dimensional subspace. Second, three dipoles at each location span a 6-dimensional subspace, assuming known locations but uncertain orientations. Third, the sources are modeled as multipoles up to quadrupolar order, resulting in a 16-dimensional subspace. The multipole expansion systematically corrects for inaccuracies in both the location and the exact shape of the source.
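The projection step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the forward-field topographies (leadfield columns) of the modeled sources, which would normally come from a volume-conductor model evaluated at the somatosensory source locations, are replaced here by random placeholder columns, and the channel and sample counts are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_channels = 32   # number of EEG sensors (assumed)
n_samples = 500   # samples per trial (assumed)

# Columns of A span the source subspace: 2 for fixed radial dipoles,
# 6 for free-orientation dipoles, 16 for the multipole expansion.
subspace_dim = 6
A = rng.standard_normal((n_channels, subspace_dim))  # placeholder leadfields

# Orthogonal projector onto the column space of A: P = A (A^T A)^{-1} A^T.
P = A @ np.linalg.pinv(A)

X = rng.standard_normal((n_channels, n_samples))  # one trial of raw data
X_proj = P @ X  # data restricted to the modeled source subspace
```

Projecting every trial this way discards all signal components orthogonal to the modeled topographies before any feature extraction takes place.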
After projection onto the respective topographies, feature extraction is performed on the reduced data by Common Spatial Filter (CSP) analysis. Finally, Linear Discriminant Analysis (LDA) is applied for classification. The projection reduces the dimensionality of the signal, focusing on those parts that are generated or suppressed in the motor cortex during imagined hand movement. Since EEG/MEG data are strongly affected by various types of artifacts that hamper classification, the procedure removes parts of the signal and thereby reduces artifacts. For EEG data it is shown that a projection with respect to source locations prior to CSP analysis improves BCI performance when the sources are modeled as multipoles.
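The CSP-plus-LDA pipeline can be sketched as below. All data here are synthetic placeholders with an artificial variance difference between the two classes; the trial dimensions, filter count, and class structure are assumptions for illustration, not the paper's recordings or exact algorithm.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)
n_channels, n_samples, n_trials = 16, 200, 40

# Synthetic two-class trials with different per-channel variance profiles.
scale0 = np.linspace(0.5, 1.5, n_channels)[:, None]
scale1 = np.linspace(1.5, 0.5, n_channels)[:, None]
trials = np.array([rng.standard_normal((n_channels, n_samples)) *
                   (scale0 if i % 2 == 0 else scale1) for i in range(n_trials)])
labels = np.array([i % 2 for i in range(n_trials)])

def avg_cov(X):
    """Trace-normalized average covariance over a set of trials."""
    return np.mean([x @ x.T / np.trace(x @ x.T) for x in X], axis=0)

C0, C1 = avg_cov(trials[labels == 0]), avg_cov(trials[labels == 1])

# CSP: generalized eigendecomposition of (C0, C0 + C1); the extreme
# eigenvectors maximize the variance ratio between the two classes.
eigvals, eigvecs = eigh(C0, C0 + C1)
n_filters = 2
W = np.hstack([eigvecs[:, :n_filters], eigvecs[:, -n_filters:]])

# Log-variance features of the spatially filtered trials.
feats = np.array([np.log(np.var(W.T @ x, axis=1)) for x in trials])

# LDA: project onto w = Sw^{-1} (m1 - m0), threshold at the midpoint.
m0, m1 = feats[labels == 0].mean(0), feats[labels == 1].mean(0)
Sw = np.cov(feats[labels == 0].T) + np.cov(feats[labels == 1].T)
w = np.linalg.solve(Sw, m1 - m0)
b = -w @ (m0 + m1) / 2
pred = (feats @ w + b > 0).astype(int)
train_acc = (pred == labels).mean()
```

In the proposed method, the trials fed into this pipeline would first have been projected onto the source subspace, so the CSP filters are learned only on the retained signal components.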
Keywords: BCI, source modeling, multipoles, ERD