Holographic augmented reality visualization interface for exploration

Authors
Bui, Bill
Hutton, Abbie
Adekalu, Oluwasayo
Hubener, Valerie
Zavala, P.
Hinshaw, Ramil
Karim, Radeef Ashhab Bin
Advisors
Schoonover, Maggie
Smith, Kristyn
Patterson, Jeremy A.
Issue Date
2022-04-29
Type
Abstract
Citation
Bui, B.; Hutton, A.; Adekalu, O.; Hubener, V.; Zavala, P.; Hinshaw, R.; Karim, R. A. B. 2022. Holographic augmented reality visualization interface for exploration -- In Proceedings: 18th Annual Symposium on Graduate Research and Scholarly Projects. Wichita, KS: Wichita State University
Abstract

Based on Wichita's wheat-harvesting nickname, the Harvesters, the Wichita State University NASA SUITS (Spacesuit User Interface Technologies for Students) team Harvestars proposed the integrated system H.A.R.V.I.E. (Holographic Augmented Reality Visualization Interface for Exploration) to prepare for the next Artemis moon landing. This design solution will assist astronauts with the elevated demands of the lunar surface through navigation, terrain sensing, and an optimal display of suit status elements. Considering environmental constraints, the system architecture promotes efficient cross-modal communication between the mission control center, other astronauts, and the user interface. Hands-free modality options, such as gaze and speech recognition, are utilized. To promote spatial learning, waypoint markers are displayed both in an allocentric world-view map and in an egocentric first-person viewpoint. Spatial mapping, using depth sensing and 3D modeling, will read changes in displacement and elevation and calculate the user's height to categorize hazardous objects. For pathfinding, our navigational system will create a directional arrow using the A* algorithm combined with spatial anchors. In case of emergency, distress beacons with color-coded warning messages are displayed on navigational maps and displays. Throughout the design process, we conducted heuristic evaluations and Streamlined Cognitive Walkthroughs on a low-fidelity prototype. We then implemented H.A.R.V.I.E. on the HoloLens 2 and utilized the Rapid Iterative Testing & Evaluation method for human-in-the-loop testing. Our interface serves as a novel approach to enhancing how astronauts navigate on missions using augmented reality. Final in-person testing will be conducted at NASA's Johnson Space Center.
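The abstract does not give implementation details for the A*-based pathfinding, so the following is a minimal, hypothetical sketch (in Python rather than the team's HoloLens 2 toolchain) of A* search over a 2D occupancy grid: cells flagged as hazardous by spatial mapping are treated as blocked, and the returned waypoints would drive the directional arrow. The function name astar, the grid representation, and the Manhattan heuristic are illustrative assumptions, not the team's actual code.

```python
# Hypothetical A* pathfinding sketch: waypoints across an occupancy grid
# where 1 marks a hazard categorized by spatial mapping and 0 is traversable.
import heapq

def astar(grid, start, goal):
    """Return a list of (row, col) waypoints from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    heuristic = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])  # Manhattan distance
    open_set = [(heuristic(start, goal), 0, start, None)]  # (f, g, cell, parent)
    came_from, g_score = {}, {start: 0}
    while open_set:
        _, g, current, parent = heapq.heappop(open_set)
        if current in came_from:          # already expanded with a better cost
            continue
        came_from[current] = parent
        if current == goal:               # walk parents back to reconstruct the path
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        r, c = current
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_score.get((nr, nc), float("inf")):
                    g_score[(nr, nc)] = ng
                    heapq.heappush(
                        open_set,
                        (ng + heuristic((nr, nc), goal), ng, (nr, nc), current),
                    )
    return None

# Example: route around a single hazard cell in the middle of a 3x3 grid.
grid = [[0, 0, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 2)))
```

In a deployed system the start and goal cells would come from spatial anchors placed in the mapped environment, but that integration is outside the scope of this sketch.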

Description
3rd place award winner in the poster presentations at the 18th Annual Symposium on Graduate Research and Scholarly Projects (GRASP) held at the Rhatigan Student Center, Wichita State University, April 29, 2022
Research completed in the Department of Psychology, Fairmount College of Liberal Arts and Sciences; College of Innovation and Design; School of Computing, College of Engineering; Department of Aerospace Engineering, College of Engineering
Publisher
Wichita State University
Series
GRASP
v. 18