Remote path planning of UR5 robot via video demonstration

Authors
Acharya, Aaditya
Advisors
Yihun, Yimesker
Issue Date
2024-04-26
Citation
Acharya, A. 2024. Remote path planning of UR5 robot via video demonstration. In Proceedings: 20th Annual Symposium on Graduate Research and Scholarly Projects. Wichita, KS: Wichita State University.
Abstract

Using pre-edited, pre-recorded videos to instruct robotic manipulators offers several benefits, including improved precision and simplified replication. Such videos allow robots to handle complex situations that are difficult to reproduce in real-time production settings due to logistical challenges, and by focusing on the crucial aspects of a task, extraneous elements are eliminated. This approach also removes the need for expensive high-precision cameras mounted on the robot, since learning is based on pre-recorded content. The project will apply computer vision techniques to process these videos, focusing on tasks such as object recognition, motion mimicry, and interest-point identification. Machine learning and the theoretical principles of Learning from Demonstration (LfD) will be used to analyze video frames and extract a relevant dataset. The project will develop a system that generates robot motion trajectories using learning-based motion planning methods. This approach not only makes robot training more accessible to industry workers with limited robotics expertise but also ensures accurate task execution in industrial settings. By addressing challenges such as the need for extensive interaction with the environment and for high-quality datasets of optimal motion paths, the project offers a practical solution to the limitations of traditional training. It also explores deep learning techniques to convert video frames into a digital representation, enabling robots to replicate learned trajectories. The project's effectiveness will be evaluated using performance metrics such as accuracy, learning speed, and success rate, compared against conventional camera-based teaching methods. Finally, it considers a hybrid approach that combines pre-recorded video learning with real-time camera-based teaching to enhance the overall efficiency and effectiveness of robotic training in industrial applications.
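To illustrate the final step of the pipeline the abstract describes (converting tracked interest points from video frames into a trajectory the robot can replicate), here is a minimal sketch. It assumes the interest points have already been tracked across frames and are available as pixel coordinates; the pixel-to-metre scale, the workspace origin offset, and the function name `pixels_to_waypoints` are all hypothetical stand-ins for what would come from camera calibration in the actual system.

```python
import numpy as np

# Hypothetical camera-to-workspace calibration: in the real system these
# values would come from a calibrated camera model, not constants.
PIXELS_TO_METRES = 0.001                   # assumed scale (m per pixel)
WORKSPACE_ORIGIN = np.array([0.3, -0.2])   # assumed UR5 base-frame offset (m)

def pixels_to_waypoints(pixel_track, n_waypoints=5):
    """Convert a tracked interest-point path (N x 2 pixel coordinates)
    into n_waypoints evenly spaced (x, y) targets in the robot workspace."""
    track = np.asarray(pixel_track, dtype=float)
    # Cumulative arc length along the pixel path.
    seg = np.linalg.norm(np.diff(track, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    # Resample the path at equal arc-length intervals so the robot
    # receives uniformly spaced waypoints regardless of frame rate.
    targets = np.linspace(0.0, s[-1], n_waypoints)
    resampled = np.stack(
        [np.interp(targets, s, track[:, i]) for i in range(2)], axis=1
    )
    # Map pixel coordinates into metres in the robot base frame.
    return WORKSPACE_ORIGIN + PIXELS_TO_METRES * resampled

# Example: a straight pixel track sampled unevenly across three frames.
track = [(0, 0), (100, 0), (400, 0)]
waypoints = pixels_to_waypoints(track, n_waypoints=3)
# waypoints are evenly spaced from (0.3, -0.2) to (0.7, -0.2)
```

Arc-length resampling is used here because interest points tracked from video arrive at the camera's frame rate, so raw samples cluster where the demonstrated motion is slow; resampling yields waypoints spaced by distance rather than by time.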

Description
Presented to the 20th Annual Symposium on Graduate Research and Scholarly Projects (GRASP) held at the Rhatigan Student Center, Wichita State University, April 26, 2024.
Research completed in the Department of Mechanical Engineering, College of Engineering.
Publisher
Wichita State University
Series
GRASP
v. 20