
    Automatic talk back and exoskeleton system using brain neural computer interface (BNCI) for ALS patients

    Date
    2015-04-24
    Author
    Telakapalli, Abhignan
    Advisor
    Lakshmikanth, Geethalakshmi S.
    Citation
    Telakapalli, Abhignan. Automatic Talk Back and Exoskeleton System Using Brain Neural Computer Interface (BNCI) for ALS Patients. In Proceedings: 11th Annual Symposium on Graduate Research and Scholarly Projects. Wichita, KS: Wichita State University, p. 27.
    Abstract
    In this project we implement a model for ALS patients that enhances their mobility and expands their possibilities for interaction, such as talking, sending an email, and playing video games. The model also caters to their everyday needs, enabling simple interaction without controls such as joysticks, buttons, and switches. We design a talk-back system and an exoskeleton interfaced to the human brain through a Brain Neural Computer Interface (BNCI). The system converts EEG signals generated by the brain into electronic commands that are processed by the computer. A logic unit then differentiates between a control signal and a communication signal; accordingly, its output is directed either to the exoskeleton or to the text-to-speech converter. Subjects are initially trained to ensure maximum effectiveness of the system: first, signals are collected while they concentrate on various objects; second, they are given a simple task to accomplish, such as thinking about a word or moving a right-hand finger. Once effectiveness is verified, the system is deployed to the patient. After deployment, the control unit, which governs the whole system, compares incoming signals with predefined logic and controls the leg and limb fixtures automatically. The system uses non-invasive EEG electrodes placed on the scalp to detect and receive brain signals. In addition, force-detecting sensors are placed on the body to maintain stability; for example, if the volunteer leans forward, the sensor detects this and the exoskeleton moves forward (much like a Segway). We also design virtual gaming software in which the volunteer learns to move and control the actuators by practicing in a gaming environment.
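    The routing step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation (no code is published with the abstract); the class, function, and threshold names are hypothetical, and the classifier score is assumed to come from the subject-specific training phase the abstract describes.

    ```python
    from dataclasses import dataclass

    @dataclass
    class EEGSignal:
        """A windowed EEG feature vector with a classifier score.

        `control_score` is a hypothetical output of the subject-specific
        training phase: higher values indicate a control (movement) intent,
        lower values a communication (speech) intent.
        """
        features: list[float]
        control_score: float

    # Assumed decision boundary learned during subject training.
    CONTROL_THRESHOLD = 0.5

    def route_signal(signal: EEGSignal) -> str:
        """Logic-unit sketch: direct the signal to the exoskeleton or to
        the text-to-speech converter, as described in the abstract."""
        if signal.control_score >= CONTROL_THRESHOLD:
            return "exoskeleton"      # drive the leg/limb fixtures
        return "text_to_speech"       # speak the decoded word

    # Example: a high control score routes to the exoskeleton.
    print(route_signal(EEGSignal(features=[0.1, 0.4], control_score=0.8)))
    ```

    In a real BNCI pipeline the score would be produced by a trained classifier over band-power or ERP features rather than supplied directly; the sketch only captures the two-way dispatch between the two output subsystems.
    
    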
    Description
    Presented to the 11th Annual Symposium on Graduate Research and Scholarly Projects (GRASP) held at the Heskett Center, Wichita State University, April 24, 2015.

    Research completed at Department of Electrical Engineering and Computer Science, College of Engineering
    URI
    http://hdl.handle.net/10057/11400
    Collections
    • Proceedings 2015: 11th Annual Symposium on Graduate Research and Scholarly Projects
