Omnisurface: Common reality for intuitive human-robot collaboration

Authors
Zaman, Akhlak Uz
Li, Hui
Yan, Fujian
Zhang, Yinlong
He, Hongsheng
Issue Date
2025
Type
Book chapter
Keywords
Human-robot teaming, Common reality
Citation
Zaman, A. U., Li, H., Yan, F., Zhang, Y., & He, H. (2025). Omnisurface: Common reality for intuitive human-robot collaboration. In Social robotics. Springer, Singapore.
Abstract

Effective communication and information projection are essential for human-robot teaming. Projecting images onto nonplanar surfaces with a conventional projector is challenging because of inherent distortion: variations in depth across the surface of the teaming workspace deform the projected image, so information and symbols lose their original shape and create confusion during human-robot teaming. In this paper, we present an approach for distortion-free projection in the teaming workspace. A pre-warped image is constructed from the surface geometry so that, when displayed by the projector, it accurately replicates the original image. Beyond the technical achievement, this research highlights the social acceptance of improved spatial augmented reality in human-robot teams: it fosters better teamwork, trust, and efficiency by enabling more intuitive and reliable interactions.
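The pre-warping idea in the abstract can be illustrated with a short sketch. This is not the chapter's implementation; it assumes a simple pinhole model for both the projector and a virtual viewer, a per-pixel depth map of the surface as seen from the projector, and nearest-neighbour resampling. All names (`prewarp`, `K_proj`, `K_view`, `R`, `t`) are hypothetical:

```python
import numpy as np

def prewarp(source, depth, K_proj, K_view, R, t):
    """Sketch of geometry-based pre-warping (assumed pinhole models).

    For each projector pixel, back-project through the depth map to the
    3D surface point it illuminates, reproject that point into a virtual
    viewer camera, and sample the source image there. When the projector
    then displays the result, the viewer sees an undistorted image.
    """
    h, w = depth.shape
    # Back-project projector pixels to 3D surface points (projector frame).
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).astype(float)
    rays = pix @ np.linalg.inv(K_proj).T       # unit-depth rays
    pts = rays * depth[..., None]              # scale rays by measured depth
    # Transform the surface points into the viewer frame and project them.
    pts_view = pts @ R.T + t
    proj = pts_view @ K_view.T
    xs = proj[..., 0] / proj[..., 2]
    ys = proj[..., 1] / proj[..., 2]
    # Nearest-neighbour lookup into the source image.
    xi = np.clip(np.round(xs).astype(int), 0, source.shape[1] - 1)
    yi = np.clip(np.round(ys).astype(int), 0, source.shape[0] - 1)
    return source[yi, xi]
```

As a sanity check, with identical intrinsics, an identity pose, and a constant depth map (i.e., a flat surface facing the projector), the pre-warped image reduces to the source image, since no correction is needed.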

Description
Chapter 34 of Social Robotics
Publisher
Springer Singapore
Book Title
Social robotics