Touch detection in augmented Omni-surface for human-robot teaming

Issue Date
2022-12
Authors
Yan, Fujian
Chavez, Edgar
Yihun, Yimesker S.
He, Hongsheng
Citation

Yan, F., Chavez, E., Yihun, Y.S., He, H. (2022). Touch detection in augmented Omni-surface for human-robot teaming. Journal of Management & Engineering Integration, 15(2), 92-99.

Abstract

This paper proposes an architecture that augments arbitrary surfaces into an interactive touch interface. The architecture counts the touching fingertips of a human operator by detecting and recognizing fingertips with a convolutional neural network (CNN). The inputs to the CNN model are images acquired by an RGB-D sensor. The aligned depth information from the RGB-D sensor generates a plane model, which determines whether each fingertip is touching the surface. Unlike traditional plane-modeling methods that apply only to flat surfaces, the proposed system also works on curved surfaces. Gestures are defined from the fingers detected touching the surface, and the robots' feedback is projected onto the working surface by an interactive projector. Compared to conventional programming interfaces, direct touch is far more natural for human operators. Experiments indicate the proposed system can substantially reduce operator training time.
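The touch decision described in the abstract (fit a surface model from aligned depth data, then threshold each fingertip's distance to that model) can be sketched for the flat-surface case as below. This is a minimal illustration, not the paper's implementation; the function names, the least-squares plane fit, and the 1 cm touch threshold are all assumptions for the example.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit to an (N, 3) array of 3D surface points.

    Returns a unit normal n and offset d such that n . p + d ~ 0 for
    points p on the surface. (Illustrative stand-in for the paper's
    surface model, which also handles curved surfaces.)
    """
    centroid = points.mean(axis=0)
    # SVD of the centered points: the right-singular vector with the
    # smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    d = -normal.dot(centroid)
    return normal, d

def is_touching(fingertip, normal, d, threshold=0.01):
    """Declare a touch when the fingertip's perpendicular distance to
    the fitted plane is below the threshold (assumed here: 1 cm)."""
    distance = abs(normal.dot(fingertip) + d)
    return distance < threshold
```

In the full pipeline, the fingertip positions would come from the CNN detections back-projected through the RGB-D sensor's aligned depth, and each detected fingertip would be tested against the surface model to count touching fingers.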

Description
Published in SOAR: Shocker Open Access Repository by Wichita State University Libraries Technical Services, November 2022.