
    A multipurpose robotic glove designed to teach sign language through guided manual motions

    View/Open
    Conference paper (285.9 KB)
    Date
    2021-03
    Author
    Joshi, Soham
    Malik, Raaghav
    Citation
    Joshi, S., Malik, R. (2021). A multipurpose robotic glove designed to teach sign language through guided manual motions. Proceedings of the 2021 IEMS Conference, 27, 84-91.
    Abstract
    Projections show that the number of deafblind individuals is increasing quickly: the numbers of both deaf and blind individuals are expected to double within the next 50 years. Support for deafblind individuals remains limited, as they have little sensory access to the world. Most existing solutions are inadequate because they are not portable, not readily accessible, or very expensive. The goal of our project is to create a low-cost, accessible solution for those with these disabilities: a portable device that uses the sense of touch to teach sign language, in place of conventional sign language learning methods that rely on sight, creating a more immersive and personalized experience. The implementation of this solution is two-pronged. First, the physical portion, closely modeled on human hand anatomy, is equipped with servo motors that control the pulling and release of cords threaded through rings on the fingers, moving the user's hand into various sign language positions. Second, we assess the feasibility of an AI algorithm that takes in data from an external camera to efficiently add new signs to the glove, track the user's sign language patterns, and rate the user's accuracy on various signs. Through these mechanisms the user is engaged in learning new signs and expanding their sign language vocabulary, all without the sense of sight. We were successful in creating a working prototype, analyzed through hand-tracking mechanisms, achieving 91-92% accuracy for the American Sign Language alphabet.
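    The abstract's accuracy-rating idea can be sketched in miniature. The snippet below is a hypothetical illustration, not the paper's actual algorithm: it assumes each sign is represented as a vector of per-finger flexion angles (a simplification of the camera-based hand tracking the authors describe), compares a measured hand pose against stored reference signs, and reports the best match with a percentage score. The reference angles, tolerance, and function names are all invented for illustration.

    ```python
    # Hypothetical reference data: per-finger flexion angles (degrees) for two
    # ASL letters, ordered thumb -> pinky. Real targets would come from the
    # paper's hand-anatomy model and camera-based hand tracking.
    REFERENCE_SIGNS = {
        "A": [170, 40, 40, 40, 40],    # thumb extended, fingers curled
        "B": [60, 175, 175, 175, 175], # thumb tucked, fingers extended
    }

    def sign_accuracy(measured, target, tolerance=90.0):
        """Score how closely measured joint angles match a target sign.

        Each finger contributes a score that falls off linearly with its
        angular error, clipped at `tolerance` degrees; the result is the
        mean over all fingers, expressed as a percentage.
        """
        scores = []
        for m, t in zip(measured, target):
            error = min(abs(m - t), tolerance)
            scores.append(1.0 - error / tolerance)
        return 100.0 * sum(scores) / len(scores)

    def rate_sign(measured):
        """Return the best-matching sign label and its accuracy score."""
        best = max(REFERENCE_SIGNS,
                   key=lambda s: sign_accuracy(measured, REFERENCE_SIGNS[s]))
        return best, sign_accuracy(measured, REFERENCE_SIGNS[best])

    # A pose close to the reference for "A", with a few degrees of noise
    # on each finger, as a camera-based tracker might report.
    label, score = rate_sign([168, 42, 38, 45, 41])
    ```

    A real system would derive the measured angles from tracked hand landmarks and cover the full alphabet, but the same match-and-score structure would let the glove rate a learner's attempt and decide whether to guide the hand further.
    
    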
    Description
    Published in SOAR: Shocker Open Access Repository by Wichita State University Libraries Technical Services, May 2022.

    The IEMS'21 conference committee: Wichita State University, College of Engineering (Sponsor); Gamal Weheba (Conference Chair); Hesham Mahgoub (Program Chair); Dalia Mahgoub (Technical Director); Ed Sawan (Publications Editor)
    URI
    https://soar.wichita.edu/handle/10057/24730
    Collections
    • IEMS 2021
