An auditory brain–computer interface (BCI)

https://doi.org/10.1016/j.jneumeth.2007.02.009

Abstract

Brain–computer interfaces (BCIs) translate brain activity into signals controlling external devices. BCIs based on visual stimuli can maintain communication in severely paralyzed patients, but only if intact vision is available. Debilitating neurological disorders, however, may lead to loss of vision. The current study explores the feasibility of an auditory BCI. Sixteen healthy volunteers participated in three training sessions, each consisting of 30 runs of 2–3 min, in which they learned to increase or decrease the amplitude of sensorimotor rhythms (SMR) of the EEG. Half of the participants were presented with visual and half with auditory feedback. Mood and motivation were assessed prior to each session.

Although overall BCI performance in the visual feedback group was superior to that in the auditory feedback group, there was no difference in performance at the end of the third session. Participants in the auditory feedback group learned more slowly, but four out of eight reached an accuracy of over 70% correct in the last session, comparable to the visual feedback group. Decreasing performance of some participants in the visual feedback group was related to mood and motivation. We conclude that, with sufficient training time, an auditory BCI may be as efficient as a visual BCI. Mood and motivation play a role in learning to use a BCI.

Introduction

Patients with severe motor paralysis need muscle-independent communication technologies, as communication is of utmost importance for the quality of life of such patients (Bach, 1993). This is especially true for patients with neurological or muscular diseases, such as amyotrophic lateral sclerosis (ALS), which lead to progressive motor impairment. The most frightening prospect for these patients is the so-called locked-in state (LIS), in which only residual voluntary muscular control remains, rendering communication difficult or even impossible. In the complete locked-in state (CLIS), all voluntary muscular control is lost. Brain–computer interfaces (BCIs) are systems that translate brain activity into signals controlling external devices (Birbaumer, 2006). BCIs can maintain communication in severely paralyzed patients (Birbaumer et al., 1999, Kübler et al., 2001a, Kübler et al., 2001b).

Many CLIS patients have compromised vision and may not be able to use a visually based BCI. The auditory system, in contrast, is usually uncompromised in these patients. Thus, BCIs based on other sensory modalities need to be explored.

The feasibility of an auditory BCI has been investigated in a few studies using different EEG input signals, e.g., the P300 evoked potential and slow cortical potentials (SCPs), which are briefly reviewed here. Sellers and Donchin tested healthy volunteers and patients with ALS with a four-choice P300 BCI. Participants were presented with the words “yes”, “no”, “pass”, and “end”, either visually, auditorily, or both (Sellers and Donchin, 2006). The participants’ task was to focus their attention on either “yes” or “no”. The authors showed that a target probability of 25% was low enough to reliably elicit a P300, and that this response remained stable over a period of 12 sessions in healthy volunteers as well as in ALS patients.
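The 25% target probability in such a four-choice design follows directly from presenting the four words equally often in a randomized order, so that any single attended word becomes a rare target. A minimal sketch (a hypothetical reconstruction for illustration, not the authors' actual stimulus software; the trial count and seed are assumptions):

```python
import random

WORDS = ("yes", "no", "pass", "end")

def oddball_sequence(n_trials=200, words=WORDS, seed=0):
    """Build a randomized stimulus sequence in which each word is
    presented equally often, so any single word serves as a
    25%-probability target for the attending participant."""
    assert n_trials % len(words) == 0, "trials must divide evenly among words"
    rng = random.Random(seed)
    seq = list(words) * (n_trials // len(words))
    rng.shuffle(seq)
    return seq

seq = oddball_sequence()
# With 4 equiprobable words, the attended word appears on 1/4 of trials.
target_prob = seq.count("yes") / len(seq)
```

Because the sequence is a shuffled multiset rather than an independent random draw per trial, the 25% probability holds exactly within every run, not just in expectation.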

Hill et al. (2005) attempted to classify P300 evoked responses to two simultaneously presented auditory stimulus streams, each of which constituted an auditory oddball paradigm. To choose one of two possible targets (a binary decision), the participant had to focus on one of the streams. When attention was focused on the target stimuli (e.g., by counting them), EEG responses to target and standard stimuli could be classified. Although there was variation between participants, classification results suggested that a user could direct conscious attention, and thereby modulate the event-related potentials evoked by auditory stimuli, reliably enough in single trials to provide a useful basis for an auditory BCI.

Slow cortical potentials constitute another component of the EEG which can be used as a BCI input signal (Birbaumer et al., 1999). Hinterberger et al. compared learning to regulate SCP amplitude by means of visual feedback, auditory feedback, or a combination of the two modalities (Hinterberger et al., 2004). Three groups of 18 participants each underwent three sessions of SCP training. More than 70% correct responses were achieved by six participants with visual feedback, by five participants with auditory feedback, and by only two participants with combined feedback. The average accuracy in the last session was 67% in the visual condition, 59% in the auditory condition, and 57% in the combined condition. Thus, in this study, visual feedback was superior to auditory and combined feedback, but BCI control could also be achieved with auditory feedback.

An auditory BCI which uses the sensorimotor rhythm (SMR) as an input signal has not yet been tested. Most adults display SMR in the EEG recorded over sensorimotor cortices (Niedermeyer and Lopes da Silva, 2005). The major component of the SMR is often called the mu rhythm and lies in the alpha band (8–12 Hz), usually accompanied by changes in synchronization in the beta band (13–25 Hz). SMR amplitude increases or decreases in relation to motor movement. More importantly for BCI research, SMR amplitude also changes as a function of motor imagery. A recent study with visual feedback of SMR amplitude in four paralyzed ALS patients showed successful SMR regulation in all patients (Kübler et al., 2005). Performance was over 70% in all patients (Kübler et al., 2005), and thus sufficient for communication (Perelmouter and Birbaumer, 2000). This makes the SMR-based BCI a good candidate for patients. However, whether the SMR-based BCI is feasible when auditory feedback of SMR amplitude is provided remained an open question.
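The mu (8–12 Hz) and beta (13–25 Hz) bands described above are typically quantified as band-limited spectral power. A minimal sketch of such a band-power estimate, assuming NumPy, a plain FFT periodogram, and synthetic data (the study's actual signal-processing pipeline is not detailed in this excerpt):

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Mean power of a 1-D signal in the [lo, hi] Hz band, via a plain
    FFT periodogram (a simple stand-in for more refined PSD estimators)."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
    band = (freqs >= lo) & (freqs <= hi)
    return psd[band].mean()

# Synthetic single-channel "EEG": a 10 Hz mu-like oscillation plus noise.
fs = 250  # Hz; an assumed, typical EEG sampling rate
t = np.arange(0, 4, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)

mu_power = band_power(eeg, fs, 8, 12)     # mu band (8-12 Hz)
beta_power = band_power(eeg, fs, 13, 25)  # beta band (13-25 Hz)
```

For the synthetic 10 Hz oscillation, mu-band power dominates beta-band power, which is the kind of contrast an SMR-based BCI exploits when the user modulates the rhythm by motor imagery.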

In the present study, we compared BCI performance with auditory versus visual feedback of SMR amplitude in healthy volunteers. We aimed to answer two questions: first, can healthy participants achieve the same level of performance with auditory feedback as with visual feedback? Second, are there differences in learning as a function of feedback modality? Additionally, we were interested in whether psychological variables, namely mood and motivation, would influence performance.

Section snippets

Participants

Participants were 16 students from the University of Tübingen (6 men, 10 women) with no history of neurological or psychiatric disorder, who were divided into two groups. Participants were paid €8/h and all were naïve with regard to BCI training. The average age of the participants was comparable in both groups (auditory feedback group: 26.75 ± 4.13 years; visual feedback group: 24.13 ± 2.64 years). Participants gave informed consent for the study, which had been reviewed and approved by the Medical Ethical

Group data

Performance for both groups is presented in Fig. 3. The 9 × 2 repeated measures ANOVA revealed a main effect of feedback modality (F(8,1) = 8.326; p < .05) and a significant interaction between block and modality (F(8,1) = 3.757; p < .001), but no main effect of block. With an average performance of 74.1% (S.D. = 13.45), the visual feedback group performed better than the auditory feedback group (55.96%, S.D. = 10.26).

Simple pairwise comparisons of blocks between the groups revealed that in the first block the

Discussion

To summarize, over three training sessions, visual feedback of SMR amplitude led to better average BCI performance than auditory feedback. This was due to high performance at the beginning of training in the visual feedback group. After three training sessions, performance was the same in both groups. Thus, auditory feedback required more training, but led to approximately the same level of performance at the end of training. We conclude that a two-choice BCI based on auditory feedback is as

Acknowledgements

The authors would like to thank all the participants of the study for their time and effort, Boris Kleber for creating the auditory stimuli, and Ramon Brendel for support during the pilot study for auditory feedback. Funded by the Deutsche Forschungsgemeinschaft (DFG/SFB 500/TB5) and the National Institutes of Health (NIH).

References

  • A. Kübler et al.

    Brain–computer communication: self-regulation of slow cortical potentials for verbal communication

    Arch. Phys. Med. Rehabil.

    (2001)
  • E.W. Sellers et al.

    A P300-based brain–computer interface: initial tests by ALS patients

    Clin. Neurophysiol.

    (2006)
  • M. Averbeck et al.

    Skalen zur Erfassung der Lebensqualität (SEL)—manual

    (1997)
  • J.R. Bach

    Amyotrophic lateral sclerosis–communication status and survival with ventilatory support

    Am. J. Phys. Med. Rehabil.

    (1993)
  • N. Birbaumer et al.

    A spelling device for the paralysed

    Nature

    (1999)
  • N. Birbaumer

    Breaking the silence: brain–computer interfaces (BCI) for communication and motor control

    Psychophysiology

    (2006)
  • N.J. Hill et al.

    An auditory paradigm for brain–computer interfaces

    Advances in neural information processing systems

    (2005)
  • T. Hinterberger et al.

    A multimodal brain-based feedback and communication system

    Exp. Brain Res.

    (2004)
  • A. Kübler et al.

    Brain–computer communication: unlocking the locked in

    Psychol. Bull.

    (2001)