Non-invasive brain-computer interface to extract six motor
imagery features for three-dimensional control
Abstract
Individuals with severe motor and speech disabilities lack the ability to communicate and move. These abilities can be given back to such individuals using brain-computer interface (BCI) technology. When neurons within the brain fire, small traces of electrical activity can be measured noninvasively using electroencephalography (EEG). By applying signal processing techniques, features related to certain brain signals can then be extracted and classified for use as control commands in a BCI system. Imagined movements cause changes in the alpha (8-13 Hz) and beta (14-30 Hz) bands of EEG-recorded brain signals. This study investigates new combinations of imagined movements to test the effectiveness of creating more control commands for a BCI system. Six imagined movements were investigated for use as controls in three dimensions. The accuracies of linear discriminant analysis (LDA) and neural network (NN) classifiers at categorizing the intent of six test subjects were evaluated to confirm that these new imagined movements can serve as practical control commands for a BCI. Results showed that combinations of hand and foot imagined movements can be used as additional user intentions. High activity over the C1 and C2 regions of the brain shows that these imagined movements can be distinguished from those typically used in BCI testing. The comparative analysis of the LDA and NN classifiers showed average accuracies of 71.85% and 78.87%, respectively, for correctly classifying the imagined movement across all subjects after 45 trials per imagined movement. These results indicate that imagined-movement combinations not previously used in BCI research show promise as control commands for a BCI system.
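The pipeline the abstract describes, extracting alpha- and beta-band power features from EEG and classifying them with LDA, can be illustrated with a minimal sketch. Everything below is assumed for illustration: the 250 Hz sampling rate, the synthetic two-class signals (class 0 with a stronger 10 Hz alpha rhythm, mimicking event-related desynchronization), and the hand-rolled two-class LDA; this is not the thesis's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
FS = 250  # assumed sampling rate in Hz

def band_power(signal, fs, lo, hi):
    """Mean spectral power of `signal` within [lo, hi] Hz via the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def make_trial(alpha_amp):
    """Synthetic 1 s EEG-like trial: a 10 Hz rhythm of given amplitude plus noise."""
    t = np.arange(FS) / FS
    return alpha_amp * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, FS)

# Two hypothetical imagined-movement classes that differ in alpha amplitude.
X, y = [], []
for label, amp in [(0, 2.0), (1, 0.5)]:
    for _ in range(45):  # 45 trials per class, matching the study's trial count
        s = make_trial(amp)
        X.append([band_power(s, FS, 8, 13),   # alpha-band feature
                  band_power(s, FS, 14, 30)]) # beta-band feature
        y.append(label)
X, y = np.array(X), np.array(y)

# Minimal two-class LDA: class means with a shared covariance matrix.
mu0, mu1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
cov = np.cov(X.T)
w = np.linalg.solve(cov, mu1 - mu0)  # discriminant direction Σ⁻¹(μ1 − μ0)
b = -0.5 * (mu0 + mu1) @ w           # threshold at the midpoint of the means
pred = (X @ w + b > 0).astype(int)
acc = (pred == y).mean()
print(f"training accuracy: {acc:.2f}")
```

On these well-separated synthetic classes the linear discriminant recovers the class labels almost perfectly; real EEG features overlap far more, which is why the study reports accuracies in the 70-80% range.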