Online classification algorithm for eye-movement-based communication systems using two temporal EEG sensors

Abdelkader Nasreddine Belkacem, Duk Shin, Hiroyuki Kambara, Natsue Yoshimura, Yasuharu Koike

Research output: Contribution to journal › Article › peer-review

31 Citations (Scopus)

Abstract

Real-time classification of eye movements offers an effective mode of human-machine interaction, and many eye-based interfaces have been presented in the literature. However, such systems often require sensors attached around the eyes, which can be obtrusive and uncomfortable. Here, we used two electroencephalography (EEG) sensors positioned over the temporal areas to perform real-time classification of eye blinks and five classes of eye-movement direction. We applied a continuous wavelet transform for online detection and then extracted discriminable time-series features. Using linear classification, we obtained an average accuracy of 85.2% and a sensitivity of 77.6% across all classes. The results showed that the proposed algorithm detects and classifies eye movements efficiently, with high accuracy and low latency for single trials. This work demonstrates the promise of portable eye-movement-based communication systems built on the sensor positions, feature extraction, and classification methods used here.
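The detection stage summarized in the abstract (a continuous wavelet transform applied online to temporal-channel EEG, followed by thresholding to catch blink/eye-movement transients) can be sketched as below. This is a minimal illustration, not the paper's actual pipeline: the sampling rate, Ricker wavelet, scales, and threshold rule are all assumptions chosen for the example.

```python
import numpy as np

def ricker(points, a):
    # Ricker ("Mexican hat") wavelet at scale a, a common CWT kernel
    t = np.arange(points) - (points - 1) / 2.0
    A = 2 / (np.sqrt(3 * a) * np.pi ** 0.25)
    return A * (1 - (t / a) ** 2) * np.exp(-(t ** 2) / (2 * a ** 2))

def cwt(signal, widths):
    # Continuous wavelet transform via convolution at each scale
    out = np.empty((len(widths), len(signal)))
    for i, w in enumerate(widths):
        wavelet = ricker(min(10 * int(w), len(signal)), w)
        out[i] = np.convolve(signal, wavelet, mode="same")
    return out

# Synthetic single-channel "EEG" trace (fs = 256 Hz, assumed):
# baseline noise plus a blink-like transient centered at t = 1.0 s
fs = 256
rng = np.random.default_rng(0)
t = np.arange(2 * fs) / fs
sig = 5 * rng.standard_normal(t.size)
sig += 80 * np.exp(-((t - 1.0) ** 2) / (2 * 0.05 ** 2))  # blink artifact

coeffs = cwt(sig, widths=[8, 16, 32])  # scales are illustrative
power = np.abs(coeffs).max(axis=0)     # max wavelet power across scales

# Online-style detection: flag samples whose wavelet power exceeds a
# threshold derived from the trace's own statistics (assumed rule)
thresh = power.mean() + 3 * power.std()
onsets = np.flatnonzero(power > thresh)
print(onsets.min() / fs, onsets.max() / fs)  # detected event window (s)
```

In a full system, the samples inside each detected window would feed the feature-extraction and linear-classification stages to label the event as a blink or one of the movement directions.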

Original language: English
Pages (from-to): 40-47
Number of pages: 8
Journal: Biomedical Signal Processing and Control
Volume: 16
Publication status: Published - Feb 2015
Externally published: Yes

Keywords

  • Brain-computer interface (BCI)
  • EOG-related applications
  • Electrooculography (EOG)
  • Eye movements
  • Electroencephalogram (EEG)
  • Online classification
  • Wearable sensors

ASJC Scopus subject areas

  • Signal Processing
  • Health Informatics
