Stefan Doucette, The MITRE Corporation; Nicole Lape, The MITRE Corporation; Thomas Swinicki, The MITRE Corporation; Kevin Dickey, The MITRE Corporation; Tim Welsh, The MITRE Corporation; Geoff McHarg, United States Air Force Academy; Gregory Cohen, Western Sydney University
Keywords: neuromorphic, sensor, vision, remote, space based, international space station, machine learning, artificial intelligence
Abstract:
Unlike traditional cameras that generate image frames at a fixed rate, neuromorphic sensors generate event streams based on changes in light intensity at each pixel. These sensors exhibit unusual characteristics, including asynchronous pixel operation, high dynamic range exceeding 120 dB, microsecond time resolution, and potentially lower data generation rates compared to traditional focal plane arrays. MITRE-funded research has focused on the development of novel algorithms for the processing and exploitation of the unique data produced by these detectors. A collaborative agreement between MITRE and the Space Physics and Atmospheric Research Center (SPARC) at the United States Air Force Academy (USAFA) provides access to a rare data source: the USAFA Falcon Neuro experiment, which provides neuromorphic imagery from cameras aboard the International Space Station (ISS).
The MITRE-developed algorithms described herein broadly aim to correlate neuromorphic events with categorizable semantic information, including sensor noise, geographic features, and in-orbit objects, with the goal of demonstrating the contribution of space-based event sensors and edge processing to the space domain.
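As background for the event-stream representation the abstract describes, the sketch below models the standard event-camera output: a pixel emits an event only when its log intensity changes by more than a contrast threshold, tagged with a microsecond timestamp and a polarity. The `Event` type, the threshold value, and the helper function are illustrative assumptions, not part of the Falcon Neuro pipeline.

```python
from dataclasses import dataclass

@dataclass
class Event:
    x: int          # pixel column
    y: int          # pixel row
    t_us: float     # timestamp in microseconds (event sensors resolve ~1 us)
    polarity: int   # +1 for a brightness increase, -1 for a decrease

def events_from_log_intensity(x, y, t_us, log_i_prev, log_i_new, threshold=0.2):
    """Illustrative per-pixel event model: emit one event for each
    contrast-threshold crossing of the log-intensity change."""
    delta = log_i_new - log_i_prev
    polarity = 1 if delta > 0 else -1
    n_crossings = int(abs(delta) // threshold)
    return [Event(x, y, t_us, polarity) for _ in range(n_crossings)]
```

Because only changing pixels produce output, a mostly static scene yields few events, which is the source of the lower data rates the abstract mentions relative to fixed-rate focal plane arrays.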
Date of Conference: September 27-30, 2022
Track: Machine Learning for SSA Applications