Touch detection in augmented Omni-surface for human-robot teaming

Authors
Yan, Fujian
Chavez, Edgar
Yihun, Yimesker S.
He, Hongsheng
Issue Date
2022-12
Type
Article
Keywords
Touching tracking , Human-robot teaming , Sensor fusion
Citation
Yan, F., Chavez, E., Yihun, Y.S., He, H. (2022). Touch detection in augmented Omni-surface for human-robot teaming. Journal of Management & Engineering Integration, 15(2), 92-99. https://doi.org/10.62704/10057/24828
Abstract

This paper proposes an architecture that augments an arbitrary surface into an interactive touch interface. The architecture counts the touching fingertips of human operators by detecting and recognizing fingertips with a convolutional neural network (CNN). The CNN takes as input images acquired by an RGB-D sensor. The aligned depth information from the same sensor is used to build a model of the surface, which determines whether each fingertip is touching the surface. Unlike traditional plane modeling, which applies only to flat surfaces, the proposed system also works on curved surfaces. Gestures are defined based on which fingers are detected touching the surface, and robot feedback is projected onto the working surface with an interactive projector. Compared with conventional programming interfaces, direct touch is more natural for human operators. Experiments indicate that the proposed system can substantially reduce operator training time.
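The core touch decision described above, comparing a detected fingertip's depth against a fitted surface model, can be sketched as follows. This is an illustrative simplification, not the paper's implementation: it fits a flat plane by least squares (the paper's method also handles curved surfaces), and the function names, the plane parameterization, and the 1 cm threshold are assumptions for the sketch.

```python
import numpy as np

def fit_plane(points):
    """Least-squares fit of z = a*x + b*y + c to Nx3 surface points
    sampled from the aligned RGB-D depth map.

    Assumes the surface is not vertical in the camera frame
    (an illustrative simplification). Returns (a, b, c).
    """
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    return coeffs

def is_touching(fingertip, plane, threshold=0.01):
    """Classify a fingertip (x, y, z) as touching if its distance to
    the fitted plane is below `threshold` (metres, assumed value)."""
    a, b, c = plane
    # Point-to-plane distance for the plane a*x + b*y - z + c = 0.
    dist = abs(a * fingertip[0] + b * fingertip[1] - fingertip[2] + c)
    dist /= np.sqrt(a ** 2 + b ** 2 + 1.0)
    return dist < threshold
```

In use, the surface points would come from the depth map of the empty work surface, and each CNN-detected fingertip would be deprojected to a 3D point before the distance test; the number of fingertips passing the test drives the gesture definitions.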

Description
Published in SOAR: Shocker Open Access Repository by Wichita State University Libraries Technical Services, November 2022.
Publisher
Association for Industry, Engineering and Management Systems (AIEMS)
Series
Journal of Management & Engineering Integration
v.15 no.2
ISSN
1939-7984