Using a finite state machine and a hybrid of EEG signals and EOG artifacts for asynchronous wheelchair navigation

https://doi.org/10.1016/j.eswa.2014.10.052

Highlights

  • Asynchronous wheelchair navigation system using EEG signals and EOG artifacts.

  • Horizontal gazes are observed at C3 and C4, and eyelid positions at O2.

  • The system is modeled as a finite state machine with three states for each mode.

  • The wheelchair moves forward and backward in a total of six different directions.

  • Actual navigation tasks are performed, registering an average accuracy of 97.5%.

Abstract

In this study, an asynchronous wheelchair navigation system using a hybrid of EEG signals and EOG artifacts embedded in EEG signals is demonstrated. The EEG signals are recorded at three locations on the scalp, in the occipital and motor cortex regions. First, an EEG signal related to eyelid position is analyzed to determine whether the eyes are closed or open. If the eyes are closed, no wheelchair movement is allowed. If the eyes are open, EOG traces (artifacts) from the two other EEG signals are examined to infer the gaze direction. A sliding window is utilized to position important cues in the trace signals at the center of the window for effective classification. The variance of the EEG signal is thresholded to determine the eyelid position. Then, features extracted from the EOG traces are used as inputs to a pair of minimum distance classifiers whose outputs reveal the gaze shift performed by the eyes. The wheelchair navigation system is designed to move forward and backward in a total of six different directions. However, only three distinct gaze directions are available as commands to move the wheelchair. To overcome this deficiency, the system is modeled as a finite state machine with three modes, each containing three states. The system is equipped with a proximity sensor to avoid collisions with obstacles, and a stop command is also available as a safety measure. In a real-time experiment involving 20 participants, the system performed well, registering a high accuracy of 97.88% with an average computational time of less than 1 s. The system was also tested by five participants in a navigation experiment in which each participant successfully completed all tasks without collision while showing improvement in maneuvering ability over successive attempts.
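As a rough sketch of the two-stage decision described above, eyelid state can be read from window variance and gaze direction from a nearest-class-mean (minimum distance) rule. The feature choice, threshold handling and function names below are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def eyelid_state(o2_window, var_threshold):
    """Eyelid state from an O2 alpha-band window: closed eyes raise
    occipital alpha power, so the window variance exceeds a
    subject-specific threshold obtained from offline calibration.
    (Assumed form; the paper only states variance thresholding.)"""
    return "closed" if np.var(o2_window) > var_threshold else "open"

class MinimumDistanceClassifier:
    """Nearest-class-mean classifier: assign the label whose mean
    feature vector lies closest (Euclidean distance) to the input."""

    def fit(self, X, y):
        X, y = np.asarray(X, dtype=float), np.asarray(y)
        self.means_ = {c: X[y == c].mean(axis=0) for c in np.unique(y)}
        return self

    def predict(self, x):
        x = np.asarray(x, dtype=float)
        return min(self.means_,
                   key=lambda c: np.linalg.norm(x - self.means_[c]))
```

The abstract describes a pair of such classifiers applied to the EOG traces from the two motor cortex channels; the single-classifier form above is the simplest version of that rule.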

Introduction

Assisting the physically disabled and aging community in performing daily activities is challenging, as their needs vary depending on their physical limitations. To date, various forms of human machine interface (HMI) systems have been developed to help the physically challenged regain some mobility and improve the quality of their lives. A typical HMI system takes a signal or a combination of signals as input and transforms it into a command to be executed by an actuator. These signals come in different forms, such as sip and puff (Mougharbel, El-Hajj, Ghamlouch, & Monacelli, 2013), eye tracking control (Nguyen & Jo, 2012), tongue control (Kim et al., 2013), head gesture (Halawani, ur Réhman, Li, & Anani, 2012), electromyogram (EMG) (Xu, Zhang, Luo, & Chen, 2013), electrooculogram (EOG) (Ianez, Ubeda, Azorin, & Perez-Vidal, 2012) and electroencephalogram (EEG) (Perdikis et al., 2014).

A brain-actuated wheelchair is an example of an HMI that can be used for assistive purposes. This paper presents an approach to navigating an asynchronous wheelchair using a finite state machine, an EEG signal from the alpha band (8–13 Hz) and EOG traces extracted from EEG delta band (<3 Hz) signals. The alpha signal is recorded at the occipital region, while the delta signals are obtained from the motor cortex area. From these signals, the eyelid position (closed or open) and the gaze direction (straight, right or left) can be determined by a classifier. Depending on the eyelid position and gaze direction, a command is issued to move the wheelchair. The wheelchair navigation system is modeled as a finite state machine so that it can move forward or backward in three directions using only three gaze directions, as sketched below. The user can also instruct the wheelchair to stop when needed.
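The mapping from three gaze commands to six movements can be made concrete with a small state machine. The paper's exact transition table is not reproduced in this snippet, so the arrangement below (a stop mode that selects a forward or backward mode, within which gazes steer) is a hypothetical reading for illustration; `WheelchairFSM`, `Gaze` and the mode names are assumptions.

```python
from enum import Enum

class Gaze(Enum):
    LEFT = "left"
    STRAIGHT = "straight"
    RIGHT = "right"

class WheelchairFSM:
    """Illustrative three-mode machine. Within a movement mode the three
    gaze commands select among three directions, yielding the six
    forward/backward movements; the layout is a guess, not the paper's."""

    def __init__(self):
        self.mode = "stop"  # start idle

    def step(self, gaze: Gaze) -> str:
        if self.mode == "stop":
            # In stop mode, a gaze selects the next movement mode.
            self.mode = {Gaze.LEFT: "backward",
                         Gaze.STRAIGHT: "stop",
                         Gaze.RIGHT: "forward"}[gaze]
            return f"entered {self.mode} mode"
        # In a movement mode, a gaze steers the chair:
        # 2 movement modes x 3 directions = 6 movement commands.
        direction = {Gaze.LEFT: "turn left",
                     Gaze.STRAIGHT: "go straight",
                     Gaze.RIGHT: "turn right"}[gaze]
        return f"{self.mode}: {direction}"

    def stop(self):
        # Closed eyes (or the stop command) return the chair to stop mode.
        self.mode = "stop"
```

For example, `fsm.step(Gaze.RIGHT)` from the stop mode enters the forward mode, after which `fsm.step(Gaze.LEFT)` issues a forward-left movement.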

The remainder of this paper is organized as follows. Section 2 describes related work that uses EEG and EOG signals to control an automated wheelchair. In Section 3, the methodology of the proposed system is presented, with details of the data acquisition, sliding window, feature classification, finite state machine model and hardware implementation. Section 4 describes the experiments performed and the results achieved. Finally, in Section 5, conclusions are drawn and future work is outlined.

Section snippets

Related work

EEG and EOG signals have been studied by researchers and medical practitioners for various purposes. One application of EEG and EOG signals is as inputs to a wheelchair navigation system. In early wheelchair navigation systems using these signals, the wheelchair could only be steered in the forward direction (Cao et al., 2014, Choi and Jo, 2013, Choi and Cichocki, 2008, Diez et al., 2013, Galan et al., 2008, Hai et al., 2013, Iturrate et al., 2009, Kaufmann et al., 2014, Khare et …

Data acquisition and signal properties

The signals are acquired using a g.Mobilab from Guger Technologies at a sampling rate of 256 Hz. Gold electrodes are placed at C3, C4 and O2, with Cz as reference and FPz as ground. The electrode arrangement follows the International 10–20 montage system and the impedance is kept under 10 kΩ. The signals are recorded and analyzed using LabVIEW from National Instruments. The EOG traces are then extracted from the EEG signals using Butterworth filters, as sketched below. For the O2 signal, the EEG signal related to the …
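A minimal sketch of the band separation described above, using SciPy's Butterworth design. The filter order and the use of zero-phase `filtfilt` are assumptions; the snippet states only that Butterworth filters were used with the 8–13 Hz alpha band and the <3 Hz delta band.

```python
from scipy.signal import butter, filtfilt

FS = 256  # sampling rate reported for the g.Mobilab recordings

def alpha_band(eeg, fs=FS, order=4):
    """Band-pass 8-13 Hz to isolate occipital alpha (O2 channel)."""
    b, a = butter(order, [8, 13], btype="bandpass", fs=fs)
    return filtfilt(b, a, eeg)

def eog_trace(eeg, fs=FS, order=4):
    """Low-pass below 3 Hz to recover the EOG artifact embedded in the
    motor cortex channels (C3/C4)."""
    b, a = butter(order, 3, btype="lowpass", fs=fs)
    return filtfilt(b, a, eeg)
```

Zero-phase filtering avoids shifting the EOG deflections in time, which matters when a sliding window must center them; a causal filter would be needed in a strictly real-time variant.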

Experimental procedure

A total of 20 healthy participants aged 23 to 27 years were recruited for the experiment. The participants were briefed on the purpose and nature of the study before signing a consent form for their participation in the experimental session.

The study is conducted in three separate sessions, namely the training, online and navigation sessions. The first session involves collecting offline data to determine the threshold values needed for sliding window adjustment and …
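One simple way to derive such a threshold from offline training data is to separate the variance distributions of eyes-open and eyes-closed windows; the midpoint rule below is an assumed stand-in for the paper's calibration procedure, which the snippet does not detail.

```python
import numpy as np

def calibrate_variance_threshold(open_windows, closed_windows):
    """Pick a variance threshold between the eyes-open and eyes-closed
    training windows. Midpoint of the two distributions' extremes is one
    plausible choice, assuming closed-eye alpha variance is the larger."""
    v_open = np.array([np.var(w) for w in open_windows])
    v_closed = np.array([np.var(w) for w in closed_windows])
    return (v_open.max() + v_closed.min()) / 2.0
```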

Evaluation of online session

Table 4 shows the confusion matrix of the results during the online session. Overall, 783 out of 800 instructions were executed correctly, which translates to a 97.88% execution rate. The go straight, stop and null act instructions are perfectly executed. Seven turn left and ten turn right commands are wrongly executed as go straight, mainly because the operator fails to initiate a response within two seconds after the command is issued. This occurs when …
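The headline figure follows directly from the reported counts; the sketch below reproduces the arithmetic and shows a generic per-class accuracy computation for a confusion matrix like Table 4, whose contents are not reproduced in this snippet.

```python
import numpy as np

# 783 of 800 online instructions executed correctly.
print(f"{783 / 800 * 100:.3f}%")  # 97.875%, reported as 97.88%

def per_class_accuracy(cm):
    """Per-class accuracy of a confusion matrix whose rows are issued
    commands and whose columns are executed commands."""
    cm = np.asarray(cm, dtype=float)
    return np.diag(cm) / cm.sum(axis=1)
```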

Conclusions

In this study, a hybrid BCI for asynchronous wheelchair navigation using an EEG signal and EOG traces embedded in EEG signals is described. A sliding window is used to capture important cues in the signals for a high classification rate. The variance of the EEG signal is thresholded to determine whether the eyes are closed or open. Then, features extracted from the EOG traces are used as inputs to a pair of minimum distance classifiers whose outputs reveal the gaze shift …
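As an illustration of the sliding-window idea mentioned here and in the abstract, a window can be repositioned so a detected cue sits at its center before feature extraction. The centering criterion below (peak absolute deflection) is an assumption; the snippet does not give the paper's rule.

```python
import numpy as np

def center_cue(signal, window_len):
    """Place the dominant cue (largest absolute deflection) at the
    center of a fixed-length window, clamped to the signal bounds."""
    peak = int(np.argmax(np.abs(signal)))
    start = min(max(peak - window_len // 2, 0),
                max(len(signal) - window_len, 0))
    return signal[start:start + window_len]
```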

Acknowledgements

This work was supported by a High Impact Research Grant (www.hir.um.edu.my), Reference No. UM.C/HIR/MOHE/ENG/16, from the Ministry of Higher Education (MOHE), Malaysia.

References (38)

  • K. Choi et al. Control of a wheelchair by motor imagery in real time.

  • B. Choi et al. A low-cost EEG system-based hybrid brain-computer interface for humanoid robot navigation and recognition. PLoS One (2013).

  • N.T. Hai et al. Mean threshold and ARNN algorithms for identification of eye commands in an EEG-controlled wheelchair. Engineering (2013).

  • A. Halawani et al. Active vision for controlling an electric wheelchair. Intelligent Service Robotics (2012).

  • D.D. Huang et al. Electroencephalography (EEG)-based brain-computer interface (BCI): A 2-D virtual wheelchair control based on event-related desynchronization/synchronization and state control. IEEE Transactions on Neural Systems and Rehabilitation Engineering (2012).

  • I. Iturrate et al. A noninvasive brain-actuated wheelchair based on a P300 neurophysiological protocol and automated navigation. IEEE Transactions on Robotics (2009).

  • T. Kaufmann et al. Toward brain-computer interface based wheelchair control utilizing tactually-evoked event-related potentials. Journal of NeuroEngineering and Rehabilitation (2014).

  • V. Khare et al. Brain computer interface based real time control of wheelchair using electroencephalogram. International Journal of Soft Computing (2011).

  • J. Kim et al. The tongue enables computer and wheelchair control for people with spinal cord injury. Science Translational Medicine (2013).