Yasir Latif, University of Adelaide; Peter Anastasiou, Inovor Technologies; Yonhon Ng, Australian National University; Zebb Prime, Inovor Technologies; Tien-Fu Lu, University of Adelaide; Matthew Tetlow, Inovor Technologies; Robert Mahony, Australian National University; Tat-Jun Chin, University of Adelaide
Keywords: neuromorphic sensing, event sensor, pointing accuracy, piezoelectric, star tracking
Abstract:
As satellites become smaller, their ability to maintain a stable pointing direction decreases as external forces acting on the satellite come into play. At the same time, reaction wheels used in the attitude determination and control system (ADCS) introduce high-frequency jitter which can disrupt pointing stability. For space domain awareness (SDA) tasks that track objects tens of thousands of kilometers away, the pointing accuracy offered by current nanosats, typically in the range of 10 to 100 arcseconds, is not sufficient. In this work, we develop a novel payload that uses a neuromorphic event sensor for high-frequency, highly accurate relative attitude estimation, paired in a closed loop with a piezoelectric stage that applies active attitude corrections to maintain a stable pointing direction. Event sensors are especially suited to space applications thanks to their low power consumption, sparse output, and high dynamic range. We use the event stream to estimate a reference background star field, against which subsequent event data is aligned to compute the instantaneous relative attitude at high frequency. The piezoelectric stage works in a closed control loop with the event sensor, applying attitude corrections based on the computed difference between the current and desired attitude. Simulation results show that our novel payload can achieve a pointing accuracy in the range of 1-5 arcseconds.
Event sensors are a novel neuromorphic sensing modality that mimics the human eye. Instead of measuring absolute brightness like conventional optical sensors, an event camera generates an event, a binary indication of whether intensity has increased or decreased at a pixel location in the sensor array, only when the intensity at that pixel changes. Since each pixel operates asynchronously and independently of the others, event sensors provide microsecond-accurate timestamps for each event. The event sensor therefore has an effective frame rate of thousands of frames per second, compared to a conventional optical sensor whose highest frame rates are typically in the low hundreds.
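For concreteness, a minimal sketch of how a single event can be represented is shown below; the field names and values are illustrative assumptions, not the interface of any particular event-camera SDK.

```python
from dataclasses import dataclass

@dataclass
class Event:
    """One neuromorphic event: a signed brightness change at a single pixel."""
    x: int          # pixel column
    y: int          # pixel row
    polarity: bool  # True if intensity increased, False if it decreased
    t_us: int       # asynchronous, microsecond-resolution timestamp

# The sensor emits a sparse stream of such events instead of full frames,
# so only pixels whose brightness changed produce data.
stream = [Event(x=412, y=308, polarity=True,  t_us=1_000_003),
          Event(x=413, y=308, polarity=False, t_us=1_000_017)]
```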
Neuromorphic event sensors offer an exciting new opportunity in the area of space sensing. Since events are generated only on illumination change, the event stream is sparse by definition, leading to efficient processing for downstream applications. Compared to conventional optical sensors, neuromorphic event sensors consume less power, and their high dynamic range offers a larger observation window due to a smaller sun-exclusion angle, all of which is well suited to space-based SDA applications.
Event sensors have already seen applications in the domain of SDA, where they have mainly been used for Resident Space Object (RSO) detection from ground-based observations. Our proposal is the first to use them for relative attitude estimation against background stars with the aim of improving pointing accuracy.
Our novel payload focuses on providing stable pointing for a single sensor independent of the ADCS, rather than aiming to correct the attitude of the whole spacecraft. The payload consists of an event sensor and an optical sensor lying in the focal plane of a telescope. The optical sensor is the main SDA sensor and requires the more stable pointing, while the event sensor provides estimates of the instantaneous attitude deviation. Both sensors are mounted on top of the piezoelectric motion stage, allowing both to be stabilized using the error computed from the neuromorphic event sensor. This dual-mounted setup allows the optical sensor to work uninhibited on the SDA task while the event sensor provides estimates of deviation from the desired pointing direction.
Using the event sensor, the relative attitude is computed by detecting and tracking the background star field in the event sensor's data stream at a frequency much higher than is possible with a conventional optical sensor. At the start of the exposure window, the event stream is used to construct a reference view of the background star field. These stars serve as a fixed reference against which subsequent event data is aligned to compute the relative attitude. This is akin to the Simultaneous Localization and Mapping (SLAM) problem studied extensively in robotics and autonomous vehicles. When there is significant motion, the reference view is expanded to include newly observed stars. The relative attitude computed this way is compared to the desired attitude, and the difference between the two drives the piezoelectric motion stage via a PID controller to reduce the pointing discrepancy over time.
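The following sketch illustrates one possible form of this closed loop under simplifying assumptions: events are binned into a short-window count image, stars are detected by thresholding, the deviation is approximated by the mean image-plane shift of matched stars, and a PID controller converts that error into a stage command. All function names, the plate scale, and the gains are hypothetical placeholders, not the authors' implementation.

```python
import numpy as np

def accumulate_events(events, shape=(480, 640)):
    """Bin events (x, y, t_us) from one short time window into a count image."""
    img = np.zeros(shape, dtype=np.float32)
    for x, y, _ in events:
        img[y, x] += 1.0
    return img

def detect_star_centroids(img, threshold=1.0):
    """Toy star detector: pixels whose event count exceeds a threshold."""
    ys, xs = np.nonzero(img > threshold)
    return np.stack([xs, ys], axis=1).astype(np.float32)

def relative_shift(reference, current):
    """Mean image-plane shift of current stars w.r.t. nearest reference stars,
    a small-motion proxy for the relative attitude deviation."""
    if len(current) == 0 or len(reference) == 0:
        return np.zeros(2)
    shifts = []
    for star in current:
        d = np.linalg.norm(reference - star, axis=1)
        shifts.append(star - reference[np.argmin(d)])
    return np.mean(shifts, axis=0)

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = np.zeros(2)
        self.prev_err = np.zeros(2)
    def step(self, err):
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

ARCSEC_PER_PIXEL = 2.0                          # assumed plate scale of the optics
pid = PID(kp=0.6, ki=0.1, kd=0.05, dt=1 / 50)   # assumed 50 Hz correction loop

# Reference star field built from the event stream at the start of the exposure.
reference_events = [(100, 120, 5), (100, 120, 9), (300, 200, 7), (300, 200, 11)]
reference = detect_star_centroids(accumulate_events(reference_events))

# A later slice of the event stream, shifted by jitter of roughly two pixels.
current_events = [(102, 121, 40_005), (102, 121, 40_009),
                  (302, 201, 40_007), (302, 201, 40_011)]
current = detect_star_centroids(accumulate_events(current_events))

error_arcsec = relative_shift(reference, current) * ARCSEC_PER_PIXEL
correction = pid.step(error_arcsec)             # command sent to the piezo stage
print(error_arcsec, correction)
```

In the actual payload the alignment is a full relative attitude estimate rather than a mean pixel shift, but the structure of the loop, estimate the deviation against a fixed star reference and feed the error to a PID-driven stage, is the same.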
The prototype payload has been tested in a dark room using an off-the-shelf edge compute device and a star field simulated from the Tycho2 catalog. Detailed experiments demonstrate that the instantaneous relative attitude estimation can run as fast as 50 Hz on the embedded device and 300 Hz on a desktop-level machine, which would not be possible with traditional optical sensors. In addition, the developed algorithms offer pointing accuracy in the 1-5 arcsecond range, as opposed to the 10-100 arcsecond range offered by commercial ADCS units, making long-range SDA tasks more effective.
Date of Conference: September 19-22, 2023
Track: Space-Based Assets