Daniel Stouch, Charles River Analytics; Umanga Balasuriya, Charles River Analytics; Rob Hyland, Charles River Analytics; Les Bird, Charles River Analytics; Michael Jenkins, Charles River Analytics; Caroline Kingsley, Charles River Analytics
Keywords: augmented reality (AR), space domain awareness (SDA), human-machine interaction (HMI), space education, space policy, virtual reality, data fusion, machine learning (ML), space situational awareness (SSA), space traffic management (STM)
Abstract:
To help educators, students, space professionals, scientists, and policymakers better understand the complex interactions of orbital mechanics, we developed the means to visualize and interact with the space domain in three dimensions (3D) using augmented reality (AR). AR is a form of virtual environment (VE) in which the human interacts naturally, in real time, with both true reality and a synthetic overlaid reality model.
This model of reality supports two-way interaction: users get information from the model through ordinary human senses such as sight, sound, and touch, and control the model using natural human actions such as gestures, voice, and head movements. We will demonstrate live interaction with multiple users in a collaborative virtual environment in real time using Magic Leap One headsets, showing the existing capabilities and future possibilities that will help users build an intuitive understanding of complex astrodynamics.
The AR capability enables users to observe and interact directly with the space enterprise through 3D spatiotemporal visualization. Users can literally walk around the globe and through satellite orbits to see their shapes. Users can see and interact with two-line element (TLE) based resident space objects (RSOs) and their orbital paths, including the ability to rewind and fast-forward projected RSO positions along their orbits. They can toggle between Earth-centered, Earth-fixed (ECEF) and Earth-centered inertial (ECI) coordinate frames, and dynamically edit TLEs in the VE to understand the effect of parameter changes on orbital trajectories. User-configurable content positioning, orientation, and scaling provides the means to reposition the globe and selected orbits to better understand their properties.
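To illustrate the underlying computation, the following is a minimal sketch of TLE propagation and the ECI/ECEF toggle using the open-source sgp4 Python library. The TLE, epoch, and low-precision GMST formula are illustrative assumptions for this sketch, not the application's actual implementation, which runs inside the AR engine.

```python
# Minimal sketch (not the application's implementation): propagate a TLE
# with the open-source sgp4 library and express the resulting position in
# both an inertial frame (TEME, an ECI realization) and a rotating
# Earth-fixed frame, mirroring the app's ECI <-> ECEF toggle.
import math
from sgp4.api import Satrec, jday

# Illustrative ISS TLE (from the sgp4 documentation); any TLE works here.
L1 = "1 25544U 98067A   19343.69339541  .00001764  00000-0  40306-4 0  9999"
L2 = "2 25544  51.6439 211.2001 0007417  17.6667  85.6398 15.50103472202482"

sat = Satrec.twoline2rv(L1, L2)
jd, fr = jday(2019, 12, 9, 12, 0, 0.0)       # epoch to visualize

err, r_teme, v_teme = sat.sgp4(jd, fr)       # km and km/s in TEME
assert err == 0, f"SGP4 error code {err}"

# Low-precision Greenwich Mean Sidereal Time (ignores nutation and polar
# motion; adequate for visualization, not for precision conjunction work).
days = (jd - 2451545.0) + fr
gmst = math.radians((280.46061837 + 360.98564736629 * days) % 360.0)

# Rotating the inertial position about the z-axis by GMST yields the
# Earth-fixed position; scrubbing time re-evaluates sgp4 and gmst.
c, s = math.cos(gmst), math.sin(gmst)
x, y, z = r_teme
r_ecef = (c * x + s * y, -s * x + c * y, z)

print("ECI (TEME) position [km]:", r_teme)
print("ECEF position [km]:      ", r_ecef)
```

Rewinding or fast-forwarding an orbit amounts to re-evaluating the propagator at a different epoch; editing a TLE parameter amounts to rebuilding the Satrec and re-rendering the resulting trajectory.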
Networked multiuser synchronized viewing supports co-located or distributed operators viewing the same content, such as custom-built scenes for instructor/student interactions in education. User-driven scene annotations include free drawing in 3D space and custom 3D shape laydowns, such as sensor fields of view projected onto the Earth and 3D CAD models of satellites that users can zoom in on to inspect their structure and makeup. This real-time 3D telestration supports the ability to annotate RSOs, orbital trajectories, unusual conditions, and scenarios of interest.
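For the sensor field-of-view laydowns, the footprint geometry reduces to standard spherical-Earth relations. The sketch below, with an illustrative function name and values not taken from the paper, shows how the ground footprint radius of a nadir-pointing conical sensor can be computed before projecting it onto the globe.

```python
# Minimal sketch: ground-footprint radius of a nadir-pointing conical
# sensor, the kind of field-of-view "laydown" projected onto the globe.
# Uses standard spherical-Earth geometry; names and values are illustrative.
import math

R_E = 6378.137  # Earth equatorial radius, km

def footprint_radius_km(altitude_km: float, half_angle_deg: float) -> float:
    """Great-circle radius of the footprint of a cone of the given
    half-angle about nadir, assuming a spherical Earth."""
    eta = math.radians(half_angle_deg)         # nadir (cone half) angle
    sin_rho = R_E / (R_E + altitude_km)        # angular radius of Earth
    if math.sin(eta) >= sin_rho:               # cone spills past the limb
        eta = math.asin(sin_rho)               # clamp to the horizon
    eps = math.acos(math.sin(eta) / sin_rho)   # elevation angle at the edge
    lam = math.pi / 2 - eta - eps              # Earth central angle
    return R_E * lam                           # arc length on the surface

# e.g., a 30-degree half-angle sensor at an ISS-like altitude (~420 km)
print(f"{footprint_radius_km(420.0, 30.0):.1f} km")
```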
To be effective, we don't just translate traditional content or design guidelines to XR; rather, we study the virtual work environment as a component of the overall system and use novel methods for early-stage prototyping and advanced prototype evaluation. We apply knowledge of the psychophysical limitations of AR head-mounted displays (HMDs) to inform device selection for the context of use and display requirements, determining fidelity recommendations for virtual environments based on empirical experiments.
The implementation approach uses cross-domain software development expertise informed by industry-leading empirical research, human factors engineering, and key partnerships with users, researchers, and technology vendors. The development approach leverages a portfolio of applied AR programs to develop tools and capabilities that enable more immersive, accessible, and intuitive AR, not just toy applications for entertainment value. This work is further enabled by a broader portfolio of multiplier technologies, such as computer vision, machine learning, and data fusion, to support analytics and scenario development. Computer vision and machine learning expertise, for example, enhance XR applications beyond commercial off-the-shelf offerings with capabilities such as scene registration, recognition, and localization; fiducial marker tracking for better object tracking in VR; and custom gesture and speech recognition libraries for improved HMI reliability.
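As one concrete illustration of fiducial-marker tracking, the sketch below uses OpenCV's ArUco module (requires the opencv-contrib-python package; the detector API shown is the OpenCV 4.7+ form). The camera index and marker dictionary are illustrative choices, not the project's actual configuration.

```python
# Minimal sketch: fiducial (ArUco) marker detection of the kind used to
# anchor and track virtual content against the real scene.
# Assumes opencv-contrib-python >= 4.7; camera index and dictionary are
# illustrative, not the project's configuration.
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)                     # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = detector.detectMarkers(gray)
    if ids is not None:
        # Draw detected marker outlines and IDs as a stand-in for
        # registering virtual content to the tracked marker pose.
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
    cv2.imshow("fiducial tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```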
Custom AR engineering tools reduce development times and increase capability effectiveness. These include flexible interaction libraries for more ecologically valid HMI experiences in virtual environments, advanced haptic development and interface customization libraries, integrated hooks for distributed AR and live training, synchronous and asynchronous networking libraries to enable co-located and distributed virtual environment collaboration in AR, and massively scalable modeling and simulation capabilities to back persistent virtual environments.
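The basic synchronization pattern behind multiuser viewing can be illustrated with a small relay: each client sends scene-state updates (annotation strokes, globe pose, selected RSOs) to a server that fans them out to every other participant. The asyncio sketch below assumes a hypothetical newline-delimited message protocol; it stands in for, and is not, the project's networking libraries.

```python
# Minimal sketch: an asyncio relay that rebroadcasts scene-state messages
# to every other connected client, the basic pattern behind synchronized
# multiuser viewing. The one-message-per-line protocol is a hypothetical
# stand-in for the project's actual networking libraries.
import asyncio

clients = set()  # connected StreamWriter objects

async def handle(reader, writer):
    clients.add(writer)
    try:
        while line := await reader.readline():   # one message per line
            for peer in list(clients):
                if peer is not writer:
                    peer.write(line)              # fan out to other users
                    await peer.drain()
    finally:
        clients.discard(writer)
        writer.close()

async def main():
    server = await asyncio.start_server(handle, "0.0.0.0", 9000)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())
```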
Novel methods for iterative evaluation of AR at different stages of development are necessary because traditional evaluation methods must evolve with the design formalisms (e.g., evaluating AR within virtual reality). We also use integrated physiological and behavioral sensors to assess the ecological validity of augmented content.
The outcome of this paper and presentation is an understanding of how augmented reality will be used in the near future to help educators, students, space professionals, scientists, policymakers, and other space enthusiasts better understand the complex astrodynamics of resident space objects in support of learning and decision making.
Date of Conference: September 14-17, 2021
Track: Astrodynamics