Show simple item record

dc.contributor.author: Reust, Adam
dc.contributor.author: Desai, Jaydip M.
dc.contributor.author: Gomez, Louis
dc.date.accessioned: 2019-04-21T04:08:45Z
dc.date.available: 2019-04-21T04:08:45Z
dc.date.issued: 2019-02-14
dc.identifier.citation: Reust, Adam; Desai, Jaydip M.; Gomez, Louis. 2019. Extracting motor imagery features to control two robotic hands. 2018 IEEE International Symposium on Signal Processing and Information Technology, ISSPIT 2018, 14 February 2019, Article number 8642627, pp. 118-122 [en_US]
dc.identifier.isbn: 978-153867568-7
dc.identifier.uri: https://doi.org/10.1109/ISSPIT.2018.8642627
dc.identifier.uri: http://hdl.handle.net/10057/16025
dc.description: Click on the DOI link to access the article (may not be free). [en_US]
dc.description.abstract: Brain-Machine Interface (BMI) technology has the potential to restore physical movement of the arm or leg by acquiring electroencephalogram (EEG) signals from the human brain, detecting changes associated with human arm or leg movements, and generating control signals for assistive devices in real time. This project was designed to understand motor imagery tasks associated with human hand movement during visual stimulation, record EEG signals for actual and imagery tasks, and train artificial neural network algorithms using three different methods: Scaled Conjugate Gradient, Levenberg-Marquardt, and Bayesian Regularization. Hjorth parameters were calculated prior to training the neural network algorithms in order to extract features for four classes: rest, right hand, left hand, and both hands. The experiment used a 16-channel wired EEG system from g.tec to acquire real-time signals from the human scalp in Simulink at a sampling rate of 512 samples/second. Eight human subjects between the ages of 18 and 52 were recruited to perform both studies associated with human hand movements. Motor imagery signals from C3, FCz, and C4 were used for the feedforward pattern-recognition neural network algorithm. Sixteen features were calculated during EEG signal recording to achieve 95 percent overall accuracy in detecting the four classes. A successful BMI model was developed to control two robotic hands in real time with the trained artificial neural networks, using the Arduino-Simulink library. [en_US]
dc.language.iso: en_US [en_US]
dc.publisher: IEEE [en_US]
dc.relation.ispartofseries: 2018 IEEE International Symposium on Signal Processing and Information Technology, ISSPIT 2018
dc.subject: Brain-Machine Interface [en_US]
dc.subject: Electroencephalography [en_US]
dc.subject: Motor Imagery [en_US]
dc.subject: Neural Networks [en_US]
dc.subject: Simulink [en_US]
dc.title: Extracting motor imagery features to control two robotic hands [en_US]
dc.type: Conference paper [en_US]
dc.rights.holder: © 2018 IEEE [en_US]
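
The Hjorth parameters mentioned in the abstract (activity, mobility, and complexity) are standard time-domain descriptors for EEG signals. The following is a minimal illustrative sketch of how they can be computed for a single channel; it is not the authors' implementation, and the sine-wave example signal and 512 samples/second rate are used only to mirror the sampling rate stated in the abstract.

```python
import numpy as np

def hjorth_parameters(x):
    """Compute the three Hjorth parameters of a 1-D signal.

    Activity   = variance of the signal
    Mobility   = sqrt(var(x') / var(x))
    Complexity = mobility(x') / mobility(x)
    """
    dx = np.diff(x)    # first derivative (sample-to-sample difference)
    ddx = np.diff(dx)  # second derivative
    activity = np.var(x)
    mobility = np.sqrt(np.var(dx) / np.var(x))
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
    return activity, mobility, complexity

# Hypothetical example: one second of a 10 Hz sine at 512 samples/second.
# A pure sinusoid has complexity close to 1; broadband noise scores higher.
t = np.linspace(0, 1, 512, endpoint=False)
sine = np.sin(2 * np.pi * 10 * t)
activity, mobility, complexity = hjorth_parameters(sine)
```

In a multi-class setup like the one described, these three values would be computed per channel (e.g. C3, FCz, C4) and concatenated into the feature vector fed to the classifier.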


Files in this item


There are no files associated with this item.

