A hybrid brain-computer interface (hBCI) has recently been proposed to address the limitations of existing single-modal brain-computer interfaces (BCIs) in terms of accuracy and information transfer rate (ITR) by combining more than one modality. The hBCI system also shows promise for patients, because a human-centered smart robot control system may allow multiple tasks to be performed with high efficiency. In this paper, we present a hybrid multicontrol system that simultaneously uses electroencephalography (EEG) and electrooculography (EOG) signals. After the preprocessing phase, we used a common spatial pattern (CSP) algorithm to extract EEG and EOG features from motor imagery and eye movements. A support vector machine (SVM) was then used to solve the multiclass problem and complete flight operations through asynchronous hBCI control of a four-axis quadcopter (i.e., takeoff, forward, backward, rightward, leftward, and landing). Online decoding results showed that average accuracies of 97.14%, 95.23%, 98.09%, and 96.66%, and average ITRs of 45.80, 43.99, 46.78, and 45.34 bits/min, were achieved in the control of the quadcopter. These online experimental results suggest that the proposed hybrid system is well suited to multidirection control tasks and can increase the multitasking capability and control dimensionality of a BCI.
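The CSP-plus-SVM pipeline described above can be sketched roughly as follows. This is a minimal illustrative example on synthetic two-class, multichannel trials; the channel count, trial dimensions, filter pairs, and SVM kernel are assumptions for demonstration, not the authors' actual configuration.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def csp_filters(trials_a, trials_b, n_pairs=2):
    """Compute CSP spatial filters from two classes of trials,
    each of shape (n_trials, n_channels, n_samples)."""
    def mean_cov(trials):
        covs = [t @ t.T / np.trace(t @ t.T) for t in trials]
        return np.mean(covs, axis=0)
    Ca, Cb = mean_cov(trials_a), mean_cov(trials_b)
    # Generalized eigenproblem: Ca w = lambda (Ca + Cb) w
    vals, vecs = eigh(Ca, Ca + Cb)
    order = np.argsort(vals)
    # Keep filters from both ends of the eigenvalue spectrum
    picks = np.concatenate([order[:n_pairs], order[-n_pairs:]])
    return vecs[:, picks].T  # (2 * n_pairs, n_channels)

def csp_features(W, trials):
    """Log-normalized variance of the spatially filtered signals."""
    feats = []
    for t in trials:
        var = (W @ t).var(axis=1)
        feats.append(np.log(var / var.sum()))
    return np.array(feats)

# Synthetic 8-channel data: the classes differ in which channel
# carries elevated variance (a stand-in for real EEG/EOG trials).
def make_trials(n, boost_ch):
    X = rng.standard_normal((n, 8, 200))
    X[:, boost_ch, :] *= 3.0
    return X

Xa, Xb = make_trials(40, boost_ch=0), make_trials(40, boost_ch=1)
W = csp_filters(Xa, Xb)
X = np.vstack([csp_features(W, Xa), csp_features(W, Xb)])
y = np.array([0] * 40 + [1] * 40)
clf = SVC(kernel="rbf").fit(X, y)
print(clf.score(X, y))  # training accuracy on the synthetic data
```

In the paper's setting this binary CSP step would be extended to a multiclass scheme over the motor-imagery and eye-movement commands; the sketch above only illustrates the core feature-extraction and classification idea.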
- Common spatial pattern (CSP)
- Hierarchical support vector machine (hSVM)
- Hybrid brain computer interface (hBCI)