Quick Radar / Optical Super Observation for Deep Space

Josh Minneti, MITRE; Andy Laffely, MITRE; Nathan Stallworth, MITRE; Rich Ghrist, MITRE; Edward Fernadez, MITRE; Justin Wilmes, MITRE

Keywords: Space Situational Awareness, Radar, Observation, Data Fusion

Abstract:

Traditionally, the Space Domain Awareness architecture has relied on a handful of ‘exquisite’ sensors to collect information on deep space objects.  Radar systems can provide excellent range and good range-rate measurements out to GEO distances, but absolute positional errors can still be tens of kilometers due to the radar beam width at those ranges.  Conversely, optical systems produce highly accurate angle measurements, but, lacking range data, they require long tracks to produce orbit determinations.  In either case, using the data independently can lead to slow or inaccurate orbit determination for deep space objects.  With the advent of inexpensive optical devices, we can now dedicate one to each existing radar system.  By using the same mission planner for pointing and search for both devices and combining the data in (near) real time, we can reduce the absolute positional error of joint radar/optical “super” observations to a fraction of that of the radar system alone.  This paper outlines the algorithm needed to fuse radar and optical data in real time for co-located and geographically separated radar and optical assets.  We present sensor-based situational awareness improved to a fraction of the original positional error, backed by experimental results and detailed modeling and simulation.
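The core fusion idea can be illustrated with a short numerical sketch: the radar supplies the range that the optical sensor lacks, while the optical angles pin down the line-of-sight direction far more tightly than the radar beam can. The Python sketch below is illustrative only and is not the algorithm described in the paper; the sensor accuracies, the small-angle error model, and the omission of site offsets, light time, and frame transformations are all assumptions.

```python
import numpy as np

def fused_position_km(radar_range_km, ra_rad, dec_rad):
    """Form a Cartesian 'super observation' by pairing a radar range
    with an optical right-ascension/declination line of sight.
    Illustrative only: site offsets, light time, and frame
    transformations are ignored."""
    los = np.array([np.cos(dec_rad) * np.cos(ra_rad),
                    np.cos(dec_rad) * np.sin(ra_rad),
                    np.sin(dec_rad)])        # unit line-of-sight vector
    return radar_range_km * los              # topocentric position, km

def cross_range_error_km(range_km, angle_error_rad):
    """Small-angle approximation of the cross-range position error
    produced by a given angular measurement error at a given range."""
    return range_km * angle_error_rad

# Representative numbers at GEO range (assumed, not taken from the paper):
geo_range_km = 36_000.0
radar_angle_acc = np.deg2rad(0.05)          # ~0.05 deg radar angular accuracy
optical_angle_acc = np.deg2rad(2.0 / 3600)  # ~2 arcsec optical accuracy

print(cross_range_error_km(geo_range_km, radar_angle_acc))    # ~31 km
print(cross_range_error_km(geo_range_km, optical_angle_acc))  # ~0.35 km
```

With these assumed numbers, the radar-only cross-range error is in the tens of kilometers, consistent with the abstract, while substituting the optical angles for the radar beam direction reduces it to a few hundred meters.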

Date of Conference: September 15-18, 2020

Track: Optical Systems Instrumentation
