Hand gesture-based artificial neural network trained hybrid human–machine interface system to navigate a powered wheelchair
Desai, Jaydip M.
Stroh, A., & Desai, J. (2021). Hand gesture-based artificial neural network trained hybrid human–machine interface system to navigate a powered wheelchair. Journal of Bionic Engineering. doi:10.1007/s42235-021-00074-z
Individuals with cerebral palsy and muscular dystrophy often lack fine motor control of their fingers, which makes it difficult to control traditional powered wheelchairs using a joystick. Studies have shown the use of surface electromyography to steer powered wheelchairs or automobiles, either through simulations or gaming controllers. However, these studies largely fail to address real-world issues such as user safety, real-time control, and the efficiency of the controller mechanism. The purpose of this study was to design, evaluate, and implement a hybrid human–machine interface system for a powered wheelchair that can detect human intent through artificial neural network-trained hand gesture recognition and navigate the wheelchair without colliding with objects along the path. Scaled Conjugate Gradient (SCG), Bayesian Regularization (BR), and Levenberg–Marquardt (LM) supervised artificial neural networks were trained in offline testing on eight participants without disability, followed by online testing using the classifier with the highest accuracy. The Bayesian Regularization architecture showed the highest accuracy, 98.4%, across all participants and hidden-layer configurations. All participants successfully completed the path in an average of 5 min and 50 s, touching an average of 22.1% of the obstacles. The proposed hybrid system can be implemented to assist people with neuromuscular disabilities in the near future.
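The offline step described above (train several network configurations on gesture features, then keep the one with the highest held-out accuracy) can be sketched as follows. This is a minimal, hypothetical illustration: the synthetic feature vectors, the network sizes, and the plain gradient-descent trainer are all stand-ins chosen for self-containment, not the paper's actual sEMG data or its SCG/BR/LM training algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n_per_class=100, n_features=8, n_classes=4):
    """Synthetic stand-in for hand-gesture feature vectors (one class per gesture)."""
    X, y = [], []
    for c in range(n_classes):
        center = rng.normal(scale=3.0, size=n_features)
        X.append(center + rng.normal(scale=1.0, size=(n_per_class, n_features)))
        y.append(np.full(n_per_class, c))
    return np.vstack(X), np.concatenate(y)

def train_mlp(X, y, hidden, epochs=300, lr=0.1):
    """One-hidden-layer classifier trained by plain gradient descent
    (a simple stand-in for the SCG/BR/LM trainers used in the study)."""
    n, d = X.shape
    k = y.max() + 1
    W1 = rng.normal(scale=0.5, size=(d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, k)); b2 = np.zeros(k)
    Y = np.eye(k)[y]                        # one-hot targets
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)            # hidden layer
        Z = H @ W2 + b2
        P = np.exp(Z - Z.max(axis=1, keepdims=True))
        P /= P.sum(axis=1, keepdims=True)   # softmax probabilities
        G = (P - Y) / n                     # cross-entropy gradient w.r.t. Z
        dW2 = H.T @ G; db2 = G.sum(0)
        GH = (G @ W2.T) * (1 - H**2)        # backprop through tanh
        dW1 = X.T @ GH; db1 = GH.sum(0)
        W2 -= lr * dW2; b2 -= lr * db2
        W1 -= lr * dW1; b1 -= lr * db1
    return W1, b1, W2, b2

def accuracy(params, X, y):
    W1, b1, W2, b2 = params
    pred = (np.tanh(X @ W1 + b1) @ W2 + b2).argmax(axis=1)
    return (pred == y).mean()

# Train/test split, then compare hidden-layer sizes and keep the best.
X, y = make_data()
idx = rng.permutation(len(X))
split = int(0.8 * len(X))
tr, te = idx[:split], idx[split:]

best = max(
    ((h, accuracy(train_mlp(X[tr], y[tr], h), X[te], y[te])) for h in (5, 10, 20)),
    key=lambda t: t[1],
)
print(f"best hidden size: {best[0]}, test accuracy: {best[1]:.3f}")
```

In the study itself, this comparison was carried out across SCG, BR, and LM architectures as well as hidden-layer sizes, with the winning classifier (BR, 98.4% accuracy) then used for online wheelchair control.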