Extracting motor imagery features to control two robotic hands

Reust, Adam
Desai, Jaydip M.
Gomez, Louis

Reust, Adam; Desai, Jaydip M.; Gomez, Louis. 2019. Extracting motor imagery features to control two robotic hands. 2018 IEEE International Symposium on Signal Processing and Information Technology (ISSPIT 2018), 14 February 2019, Article number 8642627, pp. 118-122.


Brain-Machine Interface (BMI) technology has the potential to restore physical movement of the arm or leg by acquiring electroencephalogram (EEG) signals from the human brain, detecting changes associated with human arm or leg movements, and generating control signals for assistive devices in real time. This project was designed to understand motor imagery tasks associated with human hand movement during visual stimulation, record EEG signals for actual and imagined tasks, and train artificial neural network algorithms using three different methods: Scaled Conjugate Gradient, Levenberg-Marquardt, and Bayesian Regularization. Hjorth parameters were calculated prior to training the neural network algorithm in order to distinguish four classes: rest, right hand, left hand, and both hands. The experiment used a 16-channel wired EEG system from g.tec to acquire real-time signals from the human scalp in Simulink at a sampling rate of 512 samples/second. Eight human subjects between the ages of 18 and 52 were recruited to perform both studies associated with human hand movements. Motor imagery signals from channels C3, FCz, and C4 were used for a feedforward pattern recognition neural network algorithm. Sixteen features were calculated during EEG signal recording, achieving 95 percent overall accuracy in detecting the four classes. A successful BMI model was developed to control two robotic hands in real time using the Arduino-Simulink library with the trained artificial neural networks.
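The Hjorth parameters mentioned in the abstract (Activity, Mobility, and Complexity) are standard time-domain EEG descriptors computed from a signal's variance and the variances of its first and second derivatives. The paper does not publish its implementation; the sketch below is a minimal illustration of the standard definitions, using a synthetic sinusoid in place of real EEG at the paper's 512 samples/second rate.

```python
import numpy as np

def hjorth_parameters(x):
    """Compute the Hjorth Activity, Mobility, and Complexity of a 1-D signal.

    Activity   = var(x)
    Mobility   = sqrt(var(x') / var(x))
    Complexity = Mobility(x') / Mobility(x)
    """
    dx = np.diff(x)    # first derivative (sample differences)
    ddx = np.diff(dx)  # second derivative
    activity = np.var(x)
    mobility = np.sqrt(np.var(dx) / activity)
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
    return activity, mobility, complexity

# Illustrative stand-in for one second of EEG at the paper's sampling rate.
fs = 512                           # samples/second, as in the study
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 10 * t)     # hypothetical 10 Hz test signal, not real EEG
a, m, c = hjorth_parameters(x)
```

For a pure sinusoid the Complexity is approximately 1 and the Mobility approximates the normalized angular frequency, which makes this a convenient sanity check before applying the function to recorded channels such as C3, FCz, and C4.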

Click on the DOI link to access the article (may not be free).