
dc.contributor.advisor: Lakshmikanth, Geethalakshmi S.
dc.contributor.author: Telakapalli, Abhignan
dc.date.accessioned: 2015-08-10T14:57:54Z
dc.date.available: 2015-08-10T14:57:54Z
dc.date.issued: 2015-04-24
dc.identifier.citation: Telakapalli, Abhignan. Automatic Talk Back and Exoskeleton System Using Brain Neural Computer Interface (BNCI) for ALS Patients. In Proceedings: 11th Annual Symposium on Graduate Research and Scholarly Projects. Wichita, KS: Wichita State University, p. 27
dc.identifier.uri: http://hdl.handle.net/10057/11400
dc.description: Presented to the 11th Annual Symposium on Graduate Research and Scholarly Projects (GRASP) held at the Heskett Center, Wichita State University, April 24, 2015.
dc.description: Research completed at the Department of Electrical Engineering and Computer Science, College of Engineering
dc.description.abstract: In this project we implement an effective model for ALS patients to enhance their mobility and expand their possibilities for interaction, such as talking, sending email, and playing video games. The model also caters to their everyday needs, enabling simple operation without controls such as joysticks, buttons, and switches. We design a talk-back system and an exoskeleton interfaced to the human brain through a Brain Neural Computer Interface (BNCI). The system converts EEG signals generated by the brain into electronic commands that are processed by the computer. A logic unit then differentiates between control signals and communication signals; accordingly, its output is directed either to the exoskeleton or to the text-to-speech converter. Subjects are initially trained to ensure maximum effectiveness of the system: first, signals are collected while they concentrate on various objects; second, they are given a simple task to accomplish, such as thinking of a word or moving a finger on the right hand. Once effectiveness is ensured, the system is deployed to the patient. The control unit governs the whole system: after deployment, it compares incoming signals with predefined logic and automatically controls the leg and limb fixtures. The system uses non-invasive EEG electrodes placed on the scalp to detect and receive brain signals. In addition to the EEG electrodes, force-detecting sensors are placed on the body to maintain stability; for example, if the volunteer leans forward, the sensor detects the lean and moves the exoskeleton forward (much like a Segway). Alongside the system, we also design virtual gaming software in which the volunteer learns to move and control the actuators by practicing in a gaming environment.
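The abstract's logic unit, which labels a decoded EEG signal as either a control signal (exoskeleton) or a communication signal (text-to-speech) and routes it accordingly, can be sketched as below. This is an illustrative sketch only, not the authors' implementation: the feature names (`motor_band_power`, `speech_band_power`), the comparison rule, and the handler functions are all hypothetical stand-ins.

```python
# Hypothetical sketch of the logic unit described in the abstract:
# classify a decoded EEG feature vector, then route it to the matching
# output subsystem. Feature names and the decision rule are assumptions.

def classify_signal(features):
    """Label a decoded EEG feature dict as 'control' or 'communication'.

    Assumption for illustration: a stronger motor-band response than
    speech-band response indicates a movement (control) intent.
    """
    if features.get("motor_band_power", 0.0) > features.get("speech_band_power", 0.0):
        return "control"
    return "communication"


def route_command(features, exoskeleton, tts):
    """Direct the logic unit's output to the exoskeleton or the TTS converter."""
    if classify_signal(features) == "control":
        return exoskeleton(features)
    return tts(features)


# Usage with stand-in output handlers:
exo_handler = lambda f: "exoskeleton: step forward"
tts_handler = lambda f: "tts: speak selected word"
print(route_command({"motor_band_power": 0.8, "speech_band_power": 0.2},
                    exo_handler, tts_handler))
```

In a real system the classifier would be trained on the per-subject calibration data the abstract describes (concentrating on objects, imagining a word, moving a finger) rather than hard-coded.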
dc.description.sponsorship: Graduate School, Academic Affairs, University Libraries
dc.language.iso: en_US
dc.publisher: Wichita State University. Graduate School
dc.relation.ispartofseries: GRASP
dc.relation.ispartofseries: v.11
dc.title: Automatic talk back and exoskeleton system using brain neural computer interface (BNCI) for ALS patients
dc.type: Abstract
dc.rights.holder: Wichita State University

