A Brain-Computer Interface (BCI) is an emerging research field that has attracted considerable attention as a new way of communication between humans and computers using brain signals. However, the performance of multi-class BCI systems is still not high enough. This research targets the feature-extraction phase of multi-class BCI systems based on motor imagery. By analyzing the properties of covariance matrices and the nature of brain signals through experiments, the research proposes two new feature-extraction methods for BCI systems.

The first method is called Approximation-based Common Principal Components (ACPC) analysis. It aims at finding a common subspace from the original subspaces that contain the class information. Compared with the current state-of-the-art methods based on Common Spatial Patterns (CSP), this method deals with multi-class problems directly instead of converting them into many two-class problems. The second method is based on Aggregate Models. Its main idea stems from the highly challenging problem of large inter-subject and inter-session variability in BCI experiments. Exploiting these characteristics, this method can be used not only in Subject-Dependent Multi-Class BCI systems but also in Subject-Independent ones. A combination of the proposed methods leads to a new method called Segmented Spatial Filters (SSF), which not only improves the spatial resolution of brain signals but also efficiently deals with inter-subject and inter-session variability in multi-class BCI systems.

Experiments were conducted on Dataset 2a of BCI Competition IV, a well-known dataset for multi-class BCI systems. Experimental results show that the proposed ACPC and Aggregate Model methods are superior to current state-of-the-art feature-extraction methods based on CSP.
The latter method can also be applied to Subject-Independent Multi-Class BCI systems in a natural way, with better accuracy than other related methods.
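For context on the baseline the abstract compares against: standard two-class CSP finds spatial filters by jointly diagonalizing the class covariance matrices, and log-variance of the filtered signals serves as the feature vector. The sketch below shows this textbook CSP pipeline only, not the proposed ACPC, Aggregate Model, or SSF methods, which the abstract does not detail; the function and variable names are illustrative.

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(C1, C2, n_filters=4):
    """Two-class CSP: solve C1 w = lambda (C1 + C2) w and keep the
    eigenvectors at both ends of the spectrum (most discriminative)."""
    vals, vecs = eigh(C1, C1 + C2)          # generalized eigendecomposition
    order = np.argsort(vals)                # eigenvalues in ascending order
    idx = np.concatenate([order[:n_filters // 2], order[-(n_filters // 2):]])
    return vecs[:, idx]                     # shape: (channels, n_filters)

def log_var_features(X, W):
    """Project a trial X (channels x samples) through the filters W and
    return normalized log-variance features, as is conventional for CSP."""
    Z = W.T @ X
    v = np.var(Z, axis=1)
    return np.log(v / v.sum())
```

Multi-class extensions of this scheme typically decompose the problem into pairwise or one-vs-rest two-class runs, which is exactly the indirection the proposed ACPC method avoids by operating on all class covariance matrices at once.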
Date of Award: 1 Jan 2014