Show simple item record

dc.contributor.author  Joshi, Soham
dc.contributor.author  Malik, Raaghav
dc.date.accessioned  2022-12-05T21:15:07Z
dc.date.available  2022-12-05T21:15:07Z
dc.date.issued  2021-03
dc.identifier.citation  Joshi, S., & Malik, R. (2021). A multipurpose robotic glove designed to teach sign language through guided manual motions. Proceedings of the 2021 IEMS Conference, 27, 84-91.
dc.identifier.issn  2690-3210 (print)
dc.identifier.issn  2690-3229 (online)
dc.identifier.uri  https://soar.wichita.edu/handle/10057/24730
dc.description  Published in SOAR: Shocker Open Access Repository by Wichita State University Libraries Technical Services, May 2022.
dc.description  The IEMS'21 conference committee: Wichita State University, College of Engineering (Sponsor); Gamal Weheba (Conference Chair); Hesham Mahgoub (Program Chair); Dalia Mahgoub (Technical Director); Ed Sawan (Publications Editor)
dc.description.abstract  Projections show that the number of deaf and deafblind individuals is growing quickly, with both the deaf and blind populations expected to double within the next 50 years. Support for deafblind individuals remains limited, as they have little sensory access to the world, and most existing solutions are inadequate: they are not portable, not readily accessible, or very expensive. The goal of our project is to create a low-cost, accessible solution for those with these disabilities: a portable device that uses the sense of touch to teach sign language, rather than conventional sign language learning methods that rely on sight, creating a more immersive and personalized experience. The implementation of this solution is two-pronged. First, the physical portion of the solution, closely modeled on human hand anatomy, is equipped with servo motors that control the pulling and release of a cord threaded through rings on the fingers, moving the user's hand into various sign language positions. Second, we determine the feasibility of an AI algorithm that takes in data from an external camera to efficiently add new signs to the glove, track the user's sign language patterns, and rate the user's accuracy with various signs. Through these mechanisms the user is engaged in learning new signs and expanding their sign language vocabulary, all without their sense of sight. We were successful in creating a working prototype, analyzed through hand-tracking mechanisms, that achieved 91-92% accuracy for the American Sign Language alphabet.
dc.format.extent  8 pages
dc.language.iso  en_US
dc.publisher  Industry, Engineering & Conference Management Systems Conference
dc.relation.ispartofseries  Proceedings of the 2021 IEMS Conference
dc.relation.ispartofseries  v.27
dc.subject  Multipurpose robotic glove - Design
dc.subject  Sign language - Teaching
dc.subject  Deafblindness
dc.subject  Biomedical engineering
dc.title  A multipurpose robotic glove designed to teach sign language through guided manual motions
dc.type  Conference paper
dc.rights.holder  International Conference on Industry, Engineering, and Management Systems


This item appears in the following Collection(s)

  • IEMS 2021
    The 2021 International Conference on Industry, Engineering, and Management Systems