A multipurpose robotic glove designed to teach sign language through guided manual motions

Authors
Joshi, Soham
Malik, Raaghav
Issue Date
2021-03
Type
Conference paper
Keywords
Multipurpose robotic glove - Design; Sign language - Teaching; Deafblindness; Biomedical engineering
Citation
Joshi, S., Malik, R. (2021). A multipurpose robotic glove designed to teach sign language through guided manual motions. Proceedings of the 2021 IEMS Conference, 27, 84-91. https://doi.org/10.62704/10057/24730
Abstract

Projections show that the number of deaf and deafblind individuals is rising quickly: the populations of both deaf and blind individuals are expected to double within the next 50 years. At the same time, support for deafblind individuals remains limited, as they have little sensory access to the world. Most existing solutions are inadequate because they are not portable, not readily accessible, or very expensive. The goal of our project is to create a low-cost, accessible solution for those with these disabilities: a portable device that uses the sense of touch to teach sign language, in place of conventional sign language learning methods that rely on sight. This creates a more immersive and personalized experience. The implementation of this solution is two-pronged. First, the physical portion of the solution, closely modeled on human hand anatomy, is equipped with servo motors that control the pulling and release of a cord threaded through rings on the fingers, moving the user's hand into various sign language positions. Second, we determine the feasibility of an AI algorithm that takes in data from an external camera to efficiently add new signs to the glove, and that tracks the user's signing patterns and rates the user's accuracy on various signs. Through these mechanisms the user is engaged in learning new signs and expanding their sign language vocabulary, all without the sense of sight. We succeeded in creating a working prototype, analyzed through hand-tracking mechanisms, and achieved 91-92% accuracy on the American Sign Language alphabet.
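The accuracy-rating mechanism described above could, in principle, compare camera-derived hand landmarks against stored templates for each sign. The following is only an illustrative sketch of that idea, not the authors' implementation: it assumes landmarks (e.g. 21 (x, y) points from an off-the-shelf hand-tracking model) have already been extracted from a frame, and the template coordinates are hypothetical placeholders.

```python
import math

def normalize(landmarks):
    """Translate so the wrist (first point) is the origin and scale by the
    largest distance from the wrist, making the comparison invariant to the
    hand's position and size in the frame."""
    x0, y0 = landmarks[0]
    shifted = [(x - x0, y - y0) for x, y in landmarks]
    scale = max(math.hypot(x, y) for x, y in shifted) or 1.0
    return [(x / scale, y / scale) for x, y in shifted]

def similarity(observed, template):
    """Score in [0, 1]; 1.0 means the normalized landmarks coincide."""
    a, b = normalize(observed), normalize(template)
    err = sum(math.hypot(ax - bx, ay - by)
              for (ax, ay), (bx, by) in zip(a, b)) / len(a)
    return max(0.0, 1.0 - err)

def rate_sign(observed, templates):
    """Return the best-matching sign label and its similarity score."""
    return max(((label, similarity(observed, t))
                for label, t in templates.items()),
               key=lambda pair: pair[1])

# Hypothetical three-point templates, for illustration only.
templates = {"A": [(0, 0), (1, 0), (1, 1)],
             "B": [(0, 0), (0, 1), (-1, 1)]}
label, score = rate_sign([(0, 0), (2, 0), (2, 2)], templates)
```

A real system would use many more landmarks per sign and could average similarity scores over time to rate a learner's progress on each letter.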

Description
Published in SOAR: Shocker Open Access Repository by Wichita State University Libraries Technical Services, May 2022.
The IEMS'21 conference committee: Wichita State University, College of Engineering (Sponsor); Gamal Weheba (Conference Chair); Hesham Mahgoub (Program Chair); Dalia Mahgoub (Technical Director); Ed Sawan (Publications Editor)
Publisher
Industry, Engineering & Conference Management Systems Conference
Series
Proceedings of the 2021 IEMS Conference
v.27
ISSN
2690-3210 (print)
2690-3229 (online)