"Our emotional brain makes us more than a ‘mere computer processing information’, because it allows us to deal with and to take into account the value of this information, enabling us to have feeling or a sense for the beautiful." Masao Ito (2002)
Brain–computer interfaces (BCIs) are systems that use brain signals (electric, magnetic, or metabolic) to control external devices such as computers, switches, wheelchairs, or neuroprostheses. While BCI research has long aimed to create new communication channels for disabled or elderly persons using their brain signals, recent efforts have focused on applications in rehabilitation, multimedia communication, and relaxation (such as immersive virtual-reality control). BCI systems use various methods to extract the user's intentions from his/her brain electrical activity. In our lab, we want to improve the performance of current BCI systems by focusing on how and when EEG brain signals can trigger consistent computer commands. We will use that insight to develop BCI technologies that operate in real time and respond to multiple commands. Such advances would significantly augment human-computer interaction (HCI).
(Lower) Typical topographic patterns during imagined hand and foot movements allow us to design an optimal electrode layout. For a three-command BCI variant, we achieved classification rates between 84% and 92%.
Our BCI system comprises an EEG data acquisition module, a signal preprocessing stage with online blind source separation (BSS) to reduce artifacts and improve the signal-to-noise ratio, a feature extraction module, and an ensemble of adaptive classifiers (Fig. 1).
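The flow through these four stages can be sketched in a few lines of numpy. This is a minimal illustration, not our actual implementation: the BSS stage is replaced here by simple common-average re-referencing, the features are per-channel log-variances, and the "ensemble" is a single nearest-prototype classifier; all function names and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def acquire(n_channels=8, n_samples=512):
    """Stand-in for the EEG acquisition module: one epoch of multichannel data."""
    return rng.standard_normal((n_channels, n_samples))

def preprocess(x):
    """Stand-in for BSS-based artifact reduction: common-average re-referencing,
    which removes activity shared identically by all channels."""
    return x - x.mean(axis=0, keepdims=True)

def extract_features(x):
    """Log-variance per channel, a common feature for oscillatory EEG."""
    return np.log(x.var(axis=1))

def classify(features, prototypes):
    """Nearest-prototype classifier: index of the closest class mean."""
    dists = [np.linalg.norm(features - p) for p in prototypes]
    return int(np.argmin(dists))

epoch = preprocess(acquire())
feats = extract_features(epoch)
prototypes = [np.zeros(8), np.ones(8)]   # illustrative class prototypes
command = classify(feats, prototypes)
print(command)
```

In a real system each stage would run online on streaming epochs, and the classifier ensemble would be adapted as new labeled trials arrive.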
We are currently investigating and testing several promising EEG paradigms, including: (i) measuring the brain activity over the primary motor cortex that results from imagined limb and tongue movements; (ii) detecting the presence of periodic EEG waveforms, called steady-state visual evoked potentials (SSVEPs), elicited by flashing light sources (e.g., LEDs or phase-reversing checkerboards); and (iii) identifying characteristic event-related potentials (ERPs) in the EEG that follow an event noticed by the user (or his/her intention), e.g., the P300 peak waveform after a flash of the character the user focused attention on.
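For paradigm (iii), the classic way to make an ERP such as the P300 visible is to average time-locked epochs across many trials, since the noise averages out while the stimulus-locked response does not. The sketch below, on synthetic data with assumed parameters (250 Hz sampling, a Gaussian-shaped positive peak at 300 ms), shows that the trial average recovers the waveform far better than any single trial:

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 250                        # sampling rate (Hz), assumed
t = np.arange(0, 0.8, 1 / fs)   # 800 ms epoch after each flash

# Synthetic P300: a positive peak ~300 ms post-stimulus, buried in noise.
p300 = 5.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))   # microvolts
trials = p300 + 20.0 * rng.standard_normal((100, t.size))  # 100 noisy trials

single_trial_err = np.abs(trials[0] - p300).max()
average_err = np.abs(trials.mean(axis=0) - p300).max()
print(average_err < single_trial_err)  # averaging recovers the ERP
```

Averaging N trials reduces the noise standard deviation by a factor of sqrt(N), which is why P300 spellers traditionally repeat each stimulus several times before deciding.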
One approach we take exploits temporal/spatial changes and/or spectral characteristics of sensorimotor rhythm (SMR) oscillations: the mu rhythm (8-12 Hz) and the beta rhythm (18-25 Hz). These oscillations typically decrease during movement or movement preparation (event-related desynchronization, ERD) and increase after movement and during relaxation (event-related synchronization, ERS). ERD and ERS thus indicate which EEG features are most informative for classifying motor imagery.
Taking a different approach, we designed a smart multiple-choice table in the form of an array of small checkerboard images flickering at different frequencies. When a BCI user focuses his/her attention on, or gazes at, a specific flickering image or symbol, a corresponding periodic component (SSVEP) can be detected in the EEG signals, although it is buried in large noise and is difficult to extract. By applying BSS and a bank of sub-band filters, we showed that it is possible to decompose the SSVEP waveforms into a series of cortical components. The SSVEP paradigm remains one of the most promising and reliable communication modes for a fast BCI system that could discriminate a large number of unique commands or symbols, allowing autonomous navigation of a computer screen cursor (or realizing a virtual joystick).
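The simplest SSVEP decoder compares spectral power at each candidate flicker frequency and picks the strongest; more robust methods (sub-band filtering, BSS, canonical correlation) refine this idea. A numpy sketch on synthetic data, with an assumed set of four flicker frequencies and a small 10 Hz SSVEP buried in much larger noise:

```python
import numpy as np

rng = np.random.default_rng(3)
fs = 250
t = np.arange(0, 3.0, 1 / fs)
flicker_freqs = [6.0, 7.5, 10.0, 12.0]   # Hz, one per checkerboard (assumed)

# Synthetic EEG while the user gazes at the 10 Hz stimulus: a weak SSVEP
# buried in much larger background noise.
eeg = 0.8 * np.sin(2 * np.pi * 10.0 * t) + 3.0 * rng.standard_normal(t.size)

freqs = np.fft.rfftfreq(eeg.size, 1 / fs)
psd = np.abs(np.fft.rfft(eeg)) ** 2

def power_at(f, bw=0.3):
    """Peak spectral power within +/- bw Hz of candidate frequency f."""
    mask = np.abs(freqs - f) <= bw
    return psd[mask].max()

scores = [power_at(f) for f in flicker_freqs]
detected = flicker_freqs[int(np.argmax(scores))]
print(detected)  # 10.0
```

Because the decision reduces to an argmax over frequencies, the number of commands scales with how many flicker frequencies can be reliably separated in the available recording window.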
Although the above paradigms are quite well known, our experimental designs are quite original. Moreover, we have developed and implemented novel, efficient, real-time signal preprocessing tools and feature extraction algorithms. For example, we have applied adaptive spatio-temporal filtering and a new nonnegative matrix factorization (NMF) technique to extract features (Fig. 2), and a novel ICA and sub-band filtering approach to reduce noise (Fig. 3). Ongoing research in our lab aims to optimize mu-rhythm and SSVEP detection algorithms for a high-performance, real-time, multi-command BCI.
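To give a feel for NMF-based feature extraction, the sketch below implements the standard Lee-Seung multiplicative updates (not our lab's algorithm) and factorizes an illustrative nonnegative "channels x frequency bins" matrix into a few additive parts; the rows of H can then serve as spectral basis features.

```python
import numpy as np

rng = np.random.default_rng(4)

def nmf(V, rank, n_iter=500, eps=1e-9):
    """Basic NMF via Lee-Seung multiplicative updates: V ~= W @ H,
    with all factors nonnegative (Frobenius-norm objective)."""
    n, m = V.shape
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Illustrative spectral feature matrix (channels x frequency bins), built
# from two nonnegative parts so a rank-2 NMF can reconstruct it well.
parts = rng.random((2, 30))
V = rng.random((8, 2)) @ parts
W, H = nmf(V, rank=2)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(round(err, 4))
```

Unlike ICA, NMF constrains both the basis and the coefficients to be nonnegative, which suits power-spectral EEG features and tends to yield parts-based, interpretable components.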
Another promising extension of BCI is to incorporate neurofeedback to train subjects to modulate EEG brain patterns and parameters, such as ERPs, ERD, SMR, the P300, or slow cortical potentials (SCPs), to meet a specific criterion or to learn self-regulation skills. The subject then changes his/her EEG patterns in response to the feedback. Such integration of neurofeedback into BCI (Fig. 4) is an emerging technology for rehabilitation, but we believe it is also a new paradigm in neuroscience that might reveal previously unknown brain activities associated with behavior or self-regulated mental states. The possibility of automated context awareness as a new interface goes far beyond standard BCI with simple feedback control. We hope to develop the next level of BCI system using neurofeedback for selective cognitive phenomena. To do so, we need to rely increasingly on findings from other disciplines, especially neuroscience, information technology, biomedical engineering, machine learning, and clinical rehabilitation.