Steven Morad, University of Arizona, SpaceTREx; Ravi Teja Nallapu, University of Arizona, SpaceTREx; Himangshu Kalita, University of Arizona, SpaceTREx; Byong Kwon, Arizona State University, SpaceTREx; Vishnu Reddy, University of Arizona; Roberto Furfaro, University of Arizona; Erik Asphaug, University of Arizona; Jekan Thangavelautham, University of Arizona, SpaceTREx
Keywords: Camera; meteor; observation; vision algorithms
Abstract:
The wide availability of Commercial Off-The-Shelf (COTS) electronics that can withstand Low Earth Orbit conditions has opened the way for wide deployment of CubeSats and small satellites. CubeSats, thanks to their low development and launch costs, offer new opportunities for rapidly demonstrating on-orbit surveillance capabilities. In our earlier work, we proposed the development of SWIMSat (Space-based Wide-angle Imaging of Meteors), a 3U CubeSat demonstrator designed to observe illuminated objects entering the Earth's atmosphere. The spacecraft would operate autonomously, using a smart camera with vision algorithms to detect, track, and report objects. Several CubeSats can track an object in a coordinated fashion to pinpoint its trajectory. An extension of this smart camera capability is to track unilluminated objects, utilizing capabilities we have been developing to track and navigate to Near Earth Asteroids (NEAs).

SWIMSat's smart camera system uses an onboard image-processing algorithm to detect objects entering the Earth's atmosphere. The object detection algorithm uses a multilayered approach to autonomously detect and track entering objects using a single camera. At its base, the approach relies on simple vision feature-detection methods to filter and identify events of interest. This is followed by estimating the dynamic state of the illuminated objects, including velocity, acceleration, and trajectory. Using the hyperspectral and thermal imagers, it may be possible to obtain first-order estimates of the composition of the entering object. Using these estimated statistics, our approach develops a physical simulation model of the observed system and predicts its entry trajectory.

Our approach is now extended to detecting unilluminated objects. The system maintains a dense star map of the night sky and performs round-the-clock observations. Standard optical flow algorithms are used to obtain trajectories of all moving objects in the camera field of view. Through a process of elimination, stars that are occluded by a transiting unilluminated object are identified, which allows the object to be detected and its trajectory obtained. Using multiple cameras observing the event from different points of view, it may then be possible to triangulate the position of the object in space and obtain its orbital trajectory.

In this work, the performance of our space object detection algorithm, coupled with a spacecraft guidance, navigation, and control system, is demonstrated by simulating both the physical world and the orbiting observers. These experiments are conducted in the laboratory using two mock spacecraft, each mounted on a separate robotic arm to facilitate 2-axis rotations. Each system is connected to a computer equivalent to a CubeSat computer. When the experiment starts, each camera system detects incoming meteors and unilluminated objects simulated on two different high-resolution TV screens. The detection algorithm, coupled with the spacecraft control system, then tracks the objects using the two cameras, which simulate two different observation locations, to predict the object trajectory in three dimensions. A thorough description of the detection algorithm, along with the tracking controller, is presented in this work, together with the results of the laboratory hardware-in-the-loop experiments. Our work suggests both a critical need for and the promise of such a tracking algorithm for implementing an autonomous, low-cost constellation to perform Space Situational Awareness (SSA).
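To make the occlusion-based detection step concrete, the following is a minimal sketch, assuming OpenCV and grayscale 8-bit frames: sparse optical flow yields the trajectories of all moving points, while each cataloged star position is checked for a bright blob; stars that disappear become candidate occlusions by an unilluminated object. The function name, thresholds, and window sizes are illustrative assumptions, not the flight implementation described in the paper.

```python
# Sketch of occlusion-based detection of unilluminated objects:
# optical flow for moving-object trajectories plus a star-catalog check.
import cv2
import numpy as np

def detect_occluded_stars(prev_frame, curr_frame, star_catalog_px, radius=3):
    """star_catalog_px: Nx2 array of expected star pixel positions (u, v)."""
    # Sparse Lucas-Kanade optical flow: trajectories of all moving features.
    prev_pts = cv2.goodFeaturesToTrack(prev_frame, maxCorners=500,
                                       qualityLevel=0.01, minDistance=5)
    moving = np.empty((0, 2), dtype=np.float32)
    if prev_pts is not None:
        curr_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_frame, curr_frame,
                                                       prev_pts, None)
        moving = curr_pts[status.ravel() == 1].reshape(-1, 2)

    # A cataloged star counts as visible if any above-threshold pixel lies
    # within `radius` pixels of its expected position; otherwise it is a
    # candidate occlusion by a transiting dark object.
    _, bright = cv2.threshold(curr_frame, 60, 255, cv2.THRESH_BINARY)
    h, w = curr_frame.shape[:2]
    occluded = []
    for (u, v) in star_catalog_px:
        x0, x1 = max(int(u) - radius, 0), min(int(u) + radius + 1, w)
        y0, y1 = max(int(v) - radius, 0), min(int(v) + radius + 1, h)
        if bright[y0:y1, x0:x1].sum() == 0:
            occluded.append((u, v))
    return np.asarray(occluded), moving
```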
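The two-observer position estimate can likewise be illustrated with standard linear triangulation: the same object detected at pixel coordinates in two cameras with known 3x4 projection matrices (intrinsics times each observer's pose) is located in 3-D. The matrices P1 and P2 and the OpenCV call are assumptions made for illustration; the coupled GNC estimator used in the experiments is not reproduced here.

```python
# Sketch of two-view triangulation of a detected object from two observers.
import cv2
import numpy as np

def triangulate_object(P1, P2, pixel1, pixel2):
    """P1, P2: 3x4 projection matrices; pixel1, pixel2: (u, v) detections."""
    pts1 = np.asarray(pixel1, dtype=np.float64).reshape(2, 1)
    pts2 = np.asarray(pixel2, dtype=np.float64).reshape(2, 1)
    X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)  # homogeneous 4x1 result
    return (X_h[:3] / X_h[3]).ravel()                # Cartesian 3-D position
```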
Date of Conference: September 11-14, 2018
Track: Optical Systems Instrumentation