Automatic talk back and exoskeleton system using brain neural computer interface (BNCI) for ALS patients

Authors
Telakapalli, Abhignan
Advisors
Lakshmikanth, Geethalakshmi S.
Issue Date
2015-04-24
Citation
Telakapalli, Abhignan. Automatic Talk Back and Exoskeleton System Using Brain Neural Computer Interface (BNCI) for ALS Patients. In Proceedings: 11th Annual Symposium on Graduate Research and Scholarly Projects. Wichita, KS: Wichita State University, p. 27.
Abstract

In this project, we implement a model for ALS patients that enhances their mobility and expands their possibilities for interaction, such as talking, sending an email, and playing video games. The model also caters to their everyday requirements, enabling simple interaction without controls such as joysticks, buttons, and switches. We design a talk-back system and an exoskeleton interfaced to the human brain using a Brain Neural Computer Interface (BNCI). The system converts the EEG signals generated by the brain into electronic commands that are processed by a computer. A logic unit then differentiates between a control signal and a communication signal, and its output is directed either to the exoskeleton or to the text-to-speech converter. Subjects are initially trained to ensure maximum effectiveness of the system: first, signals are collected while they concentrate on various objects; second, they are given a simple task to accomplish, such as thinking about a word or moving a finger on the right hand. Once effectiveness is ensured, the system is deployed to the patient. After deployment, a control unit compares incoming signals against predefined logic and automatically controls the leg and limb fixtures. The system uses non-invasive EEG electrodes placed on the scalp to detect and receive brain signals. In addition to the EEG electrodes, force-detecting sensors are placed on the body to maintain stability: for example, if the volunteer leans forward, the sensors detect the lean and the exoskeleton moves forward (much like a Segway). We also design virtual gaming software in which the volunteer learns to move and control the actuators by practicing in a gaming environment.
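The routing step described above (a logic unit that classifies a decoded EEG command as control or communication and forwards it to the exoskeleton or the text-to-speech converter) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the class names, the confidence threshold, and the command labels are assumptions made for the example.

```python
# Hypothetical sketch of the logic unit: it inspects a decoded EEG command
# and routes it either to the exoskeleton controller or to the text-to-speech
# converter. All names and thresholds here are illustrative assumptions.
from dataclasses import dataclass
from enum import Enum, auto


class SignalKind(Enum):
    CONTROL = auto()        # e.g. imagined limb movement -> exoskeleton
    COMMUNICATION = auto()  # e.g. imagined word -> text-to-speech


@dataclass
class DecodedCommand:
    kind: SignalKind
    payload: str       # movement name or word to speak
    confidence: float  # classifier confidence in [0, 1]


def route(cmd: DecodedCommand, threshold: float = 0.7) -> str:
    """Direct a decoded command to the appropriate subsystem.

    Low-confidence commands are dropped rather than acted on, since a
    spurious exoskeleton movement could be unsafe for the patient.
    """
    if cmd.confidence < threshold:
        return "ignored"
    if cmd.kind is SignalKind.CONTROL:
        return f"exoskeleton: {cmd.payload}"
    return f"speech: {cmd.payload}"
```

For example, `route(DecodedCommand(SignalKind.CONTROL, "step_forward", 0.9))` yields `"exoskeleton: step_forward"`, while the same command with confidence 0.2 is ignored. The confidence gate reflects a common safety pattern in BCI systems; whether the original project used one is not stated in the abstract.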

Description
Presented to the 11th Annual Symposium on Graduate Research and Scholarly Projects (GRASP) held at the Heskett Center, Wichita State University, April 24, 2015.
Research completed at Department of Electrical Engineering and Computer Science, College of Engineering
Publisher
Wichita State University. Graduate School
Series
GRASP
v.11