In this study we developed a Graphical User Interface (GUI) to control a Brain-Computer Interface (BCI) by means of Event-Related Potentials (ERP) and/or Motor Imagery (MI). It allows users to select actions to operate an upper-limb neuroprosthesis. Action selection is divided into two steps, choice and confirmation, each of which can be controlled using either ERP or MI. We also present results of experiments with 12 participants who used this GUI, showing that high performance is achieved with all possible combinations of paradigms. The GUI mode in which both the selection and confirmation steps use the ERP paradigm achieves the highest accuracy.
Keywords
- Graphical User Interface
- Motor Imagery
- Event-Related Potentials
- Action Selection
- Functional Electrical Stimulation