Rachel Oliver, Cornell University; Dmitry Savransky, Cornell University
Keywords: event-based sensors, neuromorphic electronics, modeling, parameterization, disaggregate data, solid-state circuits
Abstract:
Over the past three decades, biologically inspired vision sensors have developed from a theory into a commercial product. An event-based sensor's pixels operate asynchronously, recording data only when an individual pixel's photocurrent passes a defined threshold. The advantages over traditional charge-coupled device (CCD) frames include higher temporal resolution, improved dynamic range, and discretized output. Now that this technology is maturing, it presents opportunities to improve operational Space Domain Awareness (SDA) by extending optical observation periods, enabling optical analysis of low Earth orbit (LEO) satellites, and limiting data generation enough to consider space-based SDA networks. The first step towards those goals, and the objective of this research, is the demonstration of a physics-based end-to-end model of an event-based sensor that collects reflected visible-spectrum energy from a satellite on its focal plane. The model improves understanding of the design parameters needed to create event-based sensors uniquely suited to SDA, particularly the detection of objects with low visual magnitude. Additionally, the ability to produce consistent simulated output will assist in writing effective algorithms to interpret the sensor's disaggregated output data.
There are other available event-based sensor simulators. These simulators take computer-rendered or standard video as input and interpolate between frames to simulate higher temporal resolution. The simulators then apply noise and losses based on the characteristics of the modeled hardware. The output is a stream of events, each consisting of an x and y pixel location, an event polarity, and an event timestamp. Polarity is defined as +1 for a positive event and -1 for a negative event. Finally, two-dimensional mapping of the possible events at discrete frame timesteps produces a simulated event video. The model discussed here fundamentally differs from the existing simulators because its goal is not to provide a tool to convert video input into estimated event output. Instead, this model is dedicated to SDA applications. It provides a mathematical framework to convert a satellite's orbit into an event output without needing to reconstruct a video image from discrete frames, and it informs design decisions for event-based sensors assigned this mission set.
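For concreteness, the following Python sketch shows the event representation described above and how events can be accumulated into a two-dimensional frame at a discrete timestep. The names Event and events_to_frame are illustrative only and do not correspond to any particular simulator's API.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class Event:
    """One event record: pixel location, polarity (+1 or -1), and timestamp in seconds."""
    x: int
    y: int
    polarity: int
    t: float


def events_to_frame(events, shape, t_start, t_end):
    """Accumulate events with timestamps in [t_start, t_end) into a signed 2-D frame."""
    frame = np.zeros(shape, dtype=int)
    for ev in events:
        if t_start <= ev.t < t_end:
            frame[ev.y, ev.x] += ev.polarity
    return frame
```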
This new event-based sensor model starts with the simulation of an object of interest crossing the focal plane. Initially assuming an unresolved image, incident flux from a satellite is distributed via a point spread function onto the receiving pixels associated with the projection of the satellite's position onto the focal plane. Given orbital parameters, a camera-fixed frame definition, and the optical properties of a telescope, the satellite's dynamical model is propagated across the focal plane as a function of time at the temporal resolution of the sensor of interest, producing a list of pixels and their integrated flux over time.
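A minimal sketch of this step is shown below. It assumes a circular Gaussian point spread function and user-supplied functions flux_fn and centroid_fn that return the source flux and its focal-plane projection at each time; the PSF form, function names, and parameters are assumptions for illustration, not the model's actual implementation.

```python
import numpy as np


def psf_flux_on_grid(total_flux, cx, cy, shape, sigma=1.0):
    """Spread an unresolved source's flux over the pixel grid with a Gaussian
    point spread function centered at (cx, cy) in pixel coordinates.
    A circular Gaussian is an assumption; a diffraction-limited or measured
    PSF could be substituted."""
    ys, xs = np.indices(shape)
    psf = np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2.0 * sigma ** 2))
    psf /= psf.sum()                      # grid receives exactly total_flux
    return total_flux * psf


def integrate_track(flux_fn, centroid_fn, shape, times):
    """Accumulate flux on each pixel as the projected satellite centroid moves
    across the focal plane, sampled at the sensor's temporal resolution."""
    dt = np.diff(times, prepend=times[0])
    accumulated = np.zeros(shape)
    for t, step in zip(times, dt):
        cx, cy = centroid_fn(t)           # focal-plane projection at time t
        accumulated += psf_flux_on_grid(flux_fn(t), cx, cy, shape) * step
    return accumulated
```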
After determining the incident flux on individual pixels, the model conducts a series of transformations rooted in the underlying physics of the pixel circuitry to determine whether an event has occurred. The logical flow starts when a pixel accumulates enough flux to induce a photoelectric current, governed by the work function of the substrate material selected for the photodiodes. Transduction of the photoelectric and dark currents produces a voltage that grows logarithmically with incident energy. The change in this voltage relative to either a nominal voltage or the value memorized after the previous event is then amplified. When the amplified change passes a positive or negative threshold value, an event is triggered. While this pipeline applies to pixels experiencing enough flux to trigger on and off events, a more realistic approach also considers signal leaks and noise, capturing late or early event firings on irradiated pixels and event noise on unstimulated ones. Thus, this model includes the following phenomena that result in erroneous event firings: dark current manipulation of the event threshold, junction current leakage and parasitic photocurrent in the differencing amplifier reset switch circuit, Gaussian threshold variance between pixels, refractory period delays, shot noise, and hot pixels.
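A minimal sketch of the nominal per-pixel pipeline follows, assuming logarithmic transduction of the photo- plus dark current and fixed ON/OFF thresholds. The constants are placeholders, and the leakage, threshold variance, refractory, and noise effects listed above are deliberately omitted.

```python
import numpy as np


def pixel_events(photocurrent, times, i_dark=1e-12, theta_on=0.2, theta_off=0.2):
    """Sketch of one pixel's pipeline: log transduction of the photocurrent
    plus dark current, differencing against the voltage memorized at the
    last event, and thresholding into ON (+1) / OFF (-1) events.
    photocurrent and times are equal-length arrays; all constants are
    illustrative, not measured circuit values."""
    v = np.log(photocurrent + i_dark)     # logarithmic photoreceptor voltage
    v_mem = v[0]                          # voltage memorized after last reset
    events = []
    for vk, t in zip(v[1:], times[1:]):
        if vk - v_mem >= theta_on:        # positive change exceeds ON threshold
            events.append((t, +1))
            v_mem = vk
        elif v_mem - vk >= theta_off:     # negative change exceeds OFF threshold
            events.append((t, -1))
            v_mem = vk
    return events
```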
The new model's final step is to capture the performance associated with reading events from individual pixels. In completely asynchronous designs, the event-based sensor's arbiter is subject to read delays because it checks and records one pixel at a time. The arbiter systematically checks for an event condition in each pixel along a row before moving on to the next row. There are multiple possible losses from this process. First, it may introduce time delays between an event's occurrence and its recording. Modeling the arbiter shows whether the recording process produces error in the timestamp associated with an event. Additionally, the model can quantify information lost during the refractory period prior to the voltage reset. The final possible loss during arbitration stems from the maximum number of events the arbiter can record per second. When the focal plane is saturated by events, events beyond that limit are not recorded. This model provides insight into when the arbiter will proceed with recording events.
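The sketch below illustrates the row-scanning arbitration described above, assuming a fixed per-row service time and a simple readout budget standing in for the per-second bandwidth limit. The function name, dictionary layout, and parameter values are hypothetical.

```python
def arbitrate(pending, n_rows, n_cols, t_row=1e-6, max_events=1_000_000):
    """Illustrative row-scanning arbiter: sweep the rows in order, record any
    pending event with the time at which its row is serviced, and drop events
    once the readout budget is exhausted.
    pending: dict mapping (row, col) -> (event_time, polarity)."""
    recorded, t, budget = [], 0.0, max_events
    for row in range(n_rows):
        t += t_row                        # time spent servicing this row
        for col in range(n_cols):
            if (row, col) in pending and budget > 0:
                ev_time, pol = pending.pop((row, col))
                # recorded timestamp t differs from ev_time by the arbiter delay
                recorded.append((row, col, pol, ev_time, t))
                budget -= 1
    return recorded
```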
Date of Conference: September 14-17, 2021
Track: Optical Systems & Instrumentation