Brian McReynolds, Institute of Neuroinformatics; Rui Graca, Institute of Neuroinformatics; Rachel Oliver, Cornell University; Masashi Nishiguchi, Purdue University; Tobi Delbruck, Sensors Group, Institute of Neuroinformatics, UZH/ETH Zurich
Keywords: Event-Based Sensor, Dynamic Vision Sensor, Neuromorphic Camera
Abstract:
Neuromorphic dynamic vision sensors (DVS), often called event-based sensors (EBS), are a novel class of cameras that have recently shown potential to make a significant impact in the space domain awareness (SDA) community. Their biologically inspired design simultaneously achieves high temporal resolution, wide dynamic range, low power consumption, and sparse data output, making them an ideal fit for space applications. Although initial results for SDA are promising, these sensors typically exhibit elevated noise rates in dim conditions and have thus far failed to outperform conventional cameras in terms of limiting visual magnitude and sensitivity at high telescope scan rates. A hurdle for widespread adoption is the lack of general guidance regarding optimal camera biases (settings) for SDA. Prior studies either serve as proofs of concept or focus on algorithm development; to date, none have provided detailed guidance on biasing EBS to optimize signal-to-noise ratio (SNR) for SDA tasks. The goal of this paper is to narrow the knowledge gap between EBS pixel biasing and resulting performance in order to optimize their capabilities for SDA.
To accomplish this, we adopt a bottom-up approach, revisiting the pixel architecture to consider physics-based performance limitations. In an EBS, each pixel responds autonomously, generating events in response to local brightness changes within its field of view (FOV), and outputs a sparse representation of the visual scene in which each event is encoded by a pixel address (x,y), a microsecond-resolution timestamp (t), and a single-bit polarity value (p) indicating either an increase or decrease in brightness by a defined threshold. In most camera models, behavior is fine-tuned by adjusting roughly a half-dozen biases, including threshold levels (sensitivity), bandwidth (speed of the front-end photoreceptor), and refractory period (dead time between events in a given pixel). These parameters make EBS cameras adaptable to varied applications, but so many degrees of freedom present a challenge for optimization.
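The event encoding and bias parameters described above can be sketched in code. This is a minimal illustration, not an actual camera API: the bias names and values are hypothetical placeholders chosen to mirror the roughly half-dozen knobs discussed, and the threshold comparison is a simplified model of pixel behavior.

```python
# Illustrative sketch of the DVS event representation and bias "knobs".
# All names and values here are hypothetical, not a real camera API.
from dataclasses import dataclass

@dataclass
class DVSEvent:
    x: int    # pixel column address
    y: int    # pixel row address
    t_us: int # microsecond-resolution timestamp
    p: int    # polarity: +1 = brightness increase, -1 = decrease

# Roughly half a dozen biases tune pixel behavior (placeholder values):
biases = {
    "on_threshold": 0.25,     # log-intensity change needed for an ON event
    "off_threshold": 0.25,    # same for OFF events (sensitivity)
    "photoreceptor_bw": 3e3,  # front-end photoreceptor bandwidth, Hz (speed)
    "refractory_us": 100,     # per-pixel dead time between events
}

def fires_event(delta_log_intensity: float, b: dict) -> int:
    """Return event polarity (+1/-1) if the brightness change crosses
    a threshold, else 0 (no event). Simplified pixel model."""
    if delta_log_intensity >= b["on_threshold"]:
        return +1
    if delta_log_intensity <= -b["off_threshold"]:
        return -1
    return 0
```

Lowering the threshold biases makes the pixel fire on smaller brightness changes (higher sensitivity), at the cost of more noise events; this trade-off is exactly the optimization space explored in this paper.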
Researchers unfamiliar with the technology can be overwhelmed by the myriad of biasing options and must either rely on a prescribed set of biases or manually adjust them to achieve desired performance; the latter is not typically recommended for non-experts due to second-order effects such as excessive noise rates. Manufacturer default biases are considered optimized for a broad range of applications, but recent studies have demonstrated that non-conventional bias techniques can significantly reduce background noise in dim conditions while still retaining signal, suggesting that SDA capabilities could be improved by a more sophisticated biasing strategy. By conducting a detailed study of how sensitivity, response speed, and noise rates scale with varied bias configurations, we aim to approach an optimal-SNR bias configuration and demonstrate the maximal capabilities of current-generation COTS EBS cameras for SDA.
To systematically analyze and benchmark performance against a calibrated and repeatable stimulus, we developed a custom SDA test bench to simulate stars/satellites as sub-pixel point-source targets of variable speed and brightness. The setup includes an integrating light box to provide a calibrated flat-field illumination source, a custom 170 mm radius anodized aluminum disk with precision-drilled holes of diameters ranging from 100 to 250 microns, and a digitally programmable motor capable of precise speed control from ~0.1 to 800 RPM. The disk is backlit by the flat-field illumination source and connected to the motor shaft, and a 7 x 10 cm region is viewed through a Fujinon 1:1.8/7-70mm CS-mount lens at a distance of 50 cm. The FOV and zoom are chosen such that even the largest holes remain sub-pixel in diameter when in focus.
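A quick thin-lens calculation confirms that the largest (250 micron) hole images to a sub-pixel spot under this geometry. The 18.5 micron pixel pitch is the published DAVIS346 value; the 12 mm focal length is an assumed zoom setting within the lens's 7-70 mm range, and the thin-lens magnification model is itself an approximation.

```python
# Back-of-envelope check that the largest drilled hole is sub-pixel.
# Assumptions: thin-lens magnification m = f / (d - f); f = 12 mm is an
# assumed zoom setting; 18.5 um is the published DAVIS346 pixel pitch.
def image_size(object_size_m: float, f_m: float, distance_m: float) -> float:
    """Image-plane size of an object under thin-lens magnification."""
    m = f_m / (distance_m - f_m)
    return object_size_m * m

pixel_pitch = 18.5e-6   # DAVIS346 pixel pitch, meters
hole = 250e-6           # largest hole diameter, meters
spot = image_size(hole, f_m=12e-3, distance_m=0.50)
print(f"spot = {spot * 1e6:.1f} um, sub-pixel: {spot < pixel_pitch}")
# → spot = 6.1 um, sub-pixel: True
```

At longer focal lengths the magnification grows (at the full 70 mm zoom the same hole would image to roughly 41 microns, larger than a pixel), which is why the FOV and zoom must be chosen together to preserve the sub-pixel condition.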
Even with the ability to rapidly collect measurements with this setup, the overall parameter space is still too large to fully explore without a priori knowledge of how the sensor responds to signal and noise, and how this depends on biases. As a result, we consider fundamental pixel behaviors to devise an efficient test strategy. We first consider strategies to limit noise rates, as these can overwhelm sensor readout when the background is dark. In prior work, this was presumably accomplished by either reducing the bandwidth biases or increasing the threshold biases, but both approaches inherently limit signal. Instead of this naive approach, we draw inspiration from two recent studies: the first demonstrated an optimal balance between two bandwidth-related biases accessible in some camera prototypes, and the second relies on a key observation about the statistical distribution of noise events to devise two additional biasing techniques that enhance SNR by allowing either lower thresholds or broader bandwidth settings.
Using these techniques as a starting point, we examine the performance of the DAVIS346 EBS. We first report baseline performance using manufacturer default biases. To quantify performance, we measure sensitivity (dimmest point source detected) and bandwidth (fastest point source detected). Next, we tune bias settings toward specific detection goals (i.e., maximum velocity and/or minimum brightness) and analyze the results. Finally, we apply newly developed low-noise bias techniques and attempt to identify general principles that can be applied universally to any EBS camera to improve performance in SDA tasks.
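One plausible way to quantify the sensitivity metric described above is to compare per-pixel event rates inside a small region of interest (ROI) around the known point-source track against the background noise rate elsewhere on the sensor. The sketch below is a hypothetical illustration of such a metric, not the paper's actual analysis pipeline; the event format, ROI convention, and numbers are assumptions.

```python
# Hypothetical SNR metric: per-pixel event rate inside an ROI around the
# point-source track, divided by the per-pixel background noise rate.
# Event format (x, y, t_us, p) follows the DVS encoding described above.
def event_snr(events, roi, sensor_px, duration_s):
    """events: iterable of (x, y, t_us, p) tuples.
    roi: (x0, y0, x1, y1) inclusive pixel box around the target track.
    sensor_px: total pixel count; duration_s: recording length."""
    x0, y0, x1, y1 = roi
    roi_px = (x1 - x0 + 1) * (y1 - y0 + 1)
    in_roi = sum(1 for x, y, t, p in events
                 if x0 <= x <= x1 and y0 <= y <= y1)
    noise = len(events) - in_roi
    signal_rate = in_roi / roi_px / duration_s               # ev/px/s in ROI
    noise_rate = noise / (sensor_px - roi_px) / duration_s   # ev/px/s outside
    return signal_rate / noise_rate if noise_rate > 0 else float("inf")
```

Sweeping bias configurations and recomputing such a metric for each would reveal which settings best separate a dim, fast-moving target from background noise.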
This paper provides a baseline for understanding EBS performance characteristics and will significantly lower the entry barrier for new researchers in the field of event-based SDA. More importantly, it adds insight into optimizing EBS behavior for SDA tasks and demonstrates the absolute performance limits of current-generation cameras for detecting calibrated point-source targets against a dark background. Finally, this study will enable follow-on work, including the development of customized denoising, detection, and tracking algorithms that account for signal response and noise statistics as a function of the selected camera and bias configuration.
Date of Conference: September 19-22, 2023
Track: SDA Systems & Instrumentation