Automatic talk back and exoskeleton system using brain neural computer interface for ALS patients
We propose an effective model for ALS patients that enhances their mobility and expands their possibilities for interaction, such as talking and sending email. The model also caters to their everyday requirements, enabling simple interaction without controls such as joysticks, buttons, and switches. We design a talk-back system and an exoskeleton interfaced to the human brain through a Brain Neural Computer Interface (BNCI). The system converts EEG signals generated by the brain into electronic commands that are processed by a computer. A logic unit then differentiates between control and communication signals, and its output is directed either to the exoskeleton or to a text-to-speech converter. Subjects are initially trained to ensure maximum effectiveness of the system: during the training phase, EEG signals are collected while they perform common tasks, such as concentrating on various objects or moving the right hand. Once training is complete and maximum effectiveness is ensured, the system is deployed to the patient. After deployment, the control unit compares incoming signals against predefined logic and automatically controls the leg and limb fixtures. The system uses non-invasive EEG electrodes placed on the scalp to detect brain signals. In addition to the EEG electrodes, force-detecting sensors are placed on the body to maintain stability (much like a Segway). A virtual-game-based training environment, in which subjects learn to control the system within a gaming setting, can also be developed.
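The dispatch step described above — a logic unit deciding whether a decoded EEG command drives the exoskeleton or the text-to-speech converter — could be sketched as follows. This is a minimal illustration only: the class names, the `route` function, and the string payloads are hypothetical assumptions for clarity, not the poster's actual implementation.

```python
# Hypothetical sketch of the logic unit's routing stage.
# A decoded EEG command is dispatched either to the exoskeleton
# controller or to the text-to-speech (TTS) subsystem.
from dataclasses import dataclass
from typing import Literal


@dataclass
class DecodedCommand:
    # "control" -> movement intent; "communication" -> speech/text intent
    kind: Literal["control", "communication"]
    payload: str  # e.g. "step_forward", or a word to be spoken


def route(cmd: DecodedCommand) -> str:
    """Direct a decoded command to the appropriate subsystem."""
    if cmd.kind == "control":
        # Control signals actuate the leg/limb fixtures.
        return f"exoskeleton <- {cmd.payload}"
    # Communication signals go to the text-to-speech converter.
    return f"tts <- {cmd.payload}"


# Example usage:
print(route(DecodedCommand("control", "step_forward")))
print(route(DecodedCommand("communication", "hello")))
```

In a real deployment the `kind` field would come from a classifier trained during the training phase, and the payloads would be motor or speech commands rather than strings.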
Poster project completed at Wichita State University, Department of Electrical Engineering and Computer Science. BioKansas Winner. Presented at the 12th Annual Capitol Graduate Research Summit, Topeka, KS, February 12, 2015.