Internet of things based cyber-physical system framework for real-time operations

Issue Date
2020-02-26
Authors
Maru, Vatsal K.
Advisor
Krishnan, Krishna K.
Nannapaneni, Saideep
Abstract

Automation on the production floor can improve production efficiency and human safety in hazardous tasks, particularly in Kansas, where aircraft manufacturing requires handling large aerospace structures. To strengthen the competitive edge of the aircraft manufacturing sector, incorporating intelligence into robotic systems can improve their effectiveness on the production floor. The objective of this research is therefore to create an intelligent control system that performs operations based on object detection using machine vision. A Deep Learning (DL) technique was used to train a model to identify different types of objects and to trigger the corresponding control actions. Within the Object Detection and Control Algorithm (ODCA), we use a variant of the Convolutional Neural Network (CNN) known as Faster R-CNN (the "R" stands for region proposals) to improve the efficiency of the object detection process. The Faster R-CNN model correctly identified different types of objects, which enabled a Universal Robots UR5 robotic arm to perform the corresponding control actions. We demonstrate the proposed intelligent cyber-physical system framework on pick-and-place operations, as these are among the most widely performed operations on a production floor. Compared with sensor-based robotic systems, implementing intelligent robots in this way can substantially reduce manufacturing and assembly costs.
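
The detect-then-act loop described in the abstract can be sketched roughly as follows. This is a minimal illustration only, not the authors' implementation: it assumes a torchvision Faster R-CNN detector as a stand-in for the trained model, and the class-to-action mapping, score threshold, and send_pick_and_place helper for the UR5 are hypothetical placeholders.

# Minimal sketch of an ODCA-style detect-then-act cycle (assumed structure).
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# Pretrained Faster R-CNN (COCO classes) as a stand-in for the trained model.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

# Hypothetical mapping from detected class id to a robot action name.
ACTION_FOR_LABEL = {44: "pick_bottle", 47: "pick_cup"}  # illustrative only

def detect(image_path, score_threshold=0.8):
    """Run Faster R-CNN and return (label, box) pairs above the threshold."""
    image = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        output = model([image])[0]
    keep = output["scores"] >= score_threshold
    return list(zip(output["labels"][keep].tolist(),
                    output["boxes"][keep].tolist()))

def send_pick_and_place(action, box):
    """Placeholder for the UR5 command (e.g., a motion to a pose derived
    from the bounding box); the real robot interface is not shown here."""
    print(f"UR5 action: {action}, target box: {box}")

def odca_step(image_path):
    """One detect-then-act cycle: choose a control action per detected object."""
    for label, box in detect(image_path):
        action = ACTION_FOR_LABEL.get(label)
        if action is not None:
            send_pick_and_place(action, box)

if __name__ == "__main__":
    odca_step("frame.jpg")  # hypothetical camera frame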

Description
Poster project completed at the Wichita State University Department of Industrial, Systems, and Manufacturing Engineering. Presented at the 17th Annual Capitol Graduate Research Summit, Topeka, KS, February 26, 2020.