Theoretical and Empirical Models for Predicting Radiometric Performance of Event Cameras

Benjamin Schmachtenberger, Johns Hopkins University Applied Physics Laboratory, University of Colorado Boulder; Zach Watson, Johns Hopkins University Applied Physics Laboratory; Conor Benson, University of Colorado Boulder; Marcus Holzinger, University of Colorado Boulder

Keywords: Event Camera, Performance Model, Photonics, Remote Sensing, Space Situational Awareness

Abstract:

Event cameras, or neuromorphic imagers, have grown popular in the robotics community due to their inherent response to motion and change, high dynamic range, and small data footprint. For many of the same reasons, these devices have recently gained traction in space situational awareness, which involves leveraging observations and data of the space environment to track and characterize resident space objects (RSOs) in orbit.

Event cameras offer a fundamentally new way of interacting with and understanding the space environment. Traditional framing cameras require long exposure times to capture enough light to detect satellites, which are often quite dim and difficult to detect. Event cameras instead operate on contrast: as long as there is a large enough difference between the background sky and the RSO, an event camera will register an event, alerting its user to the presence of a satellite. Fundamental optical sensing quantities, such as exposure time, dark current, and shot noise, take on entirely different meanings or disappear altogether when considering event-based observations. Additionally, event cameras have rarely been used in a remote sensing context, looking down towards Earth from a space-based platform [1].
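For concreteness, the contrast-driven behavior described above is commonly idealized as a per-pixel log-intensity threshold crossing. The sketch below illustrates only that idealized model; the threshold value and function names are illustrative assumptions, not parameters of any particular device discussed in this talk.

import numpy as np

def generate_events(log_intensity, times, C=0.2):
    # Idealized per-pixel event generation: emit (time, polarity)
    # whenever the log intensity departs from the reference level set
    # at the last event by more than the contrast threshold C
    # (C = 0.2 is an assumed, illustrative value).
    events = []
    ref = log_intensity[0]
    for t, L in zip(times, log_intensity):
        while L - ref >= C:   # brightening: positive (ON) event
            ref += C
            events.append((t, +1))
        while ref - L >= C:   # dimming: negative (OFF) event
            ref -= C
            events.append((t, -1))
    return events

# Example: a point source sweeping across a pixel yields a burst of
# ON events as it enters the pixel and OFF events as it leaves.
t = np.linspace(0.0, 1.0, 1000)
log_I = np.log(1.0 + 5.0 * np.exp(-((t - 0.5) / 0.05) ** 2))
print(len(generate_events(log_I, t)))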

The Johns Hopkins University Applied Physics Laboratory, in collaboration with the University of Colorado Boulder, has been working to develop a cohesive understanding of how event cameras perform under various conditions. Past publications on event camera performance assessment have been limited to first-order logarithmic ratios, field experiments observing overhead spacecraft, and laboratory experimentation. Current modeling and simulation capabilities for event cameras, such as ESIM [2], rely on these first-order models. Recent work by Benson and Holzinger [3] demonstrated that a first-order ODE model captures unique characteristics of how RSOs appear to event cameras, but it omits parameters set by device characteristics and operational settings, which McMahon-Crabtree and Monet explored in their steady-state model [4]. Finally, no model to date captures the performance of event cameras in on-orbit operation.
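To make the distinction concrete, a hedged sketch of such a first-order ODE model (the symbols and time-constant form here are illustrative, not the exact formulations of [3] or [4]) is a photoreceptor state V(t) that tracks the log photocurrent I(t) through

    tau(I) * dV/dt = log I(t) - V(t),

with an event fired whenever V drifts from its value at the last event by more than a contrast threshold. Because the effective time constant tau generally shrinks as photocurrent grows, dim targets respond sluggishly, while the steady state (dV/dt = 0, so V = log I) recovers the logarithmic contrast-ratio descriptions used in earlier first-order analyses.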

This talk seeks to address these gaps by introducing a new way of understanding and predicting the performance of event cameras for space situational awareness. First, we will introduce a novel method of predicting event camera performance in remote sensing settings that captures the orbit of the observing spacecraft; the radiometric appearance, characteristics, and location of the target being sensed; and the driving parameters of the optical payload. Second, we will present a combination of performance models that captures both the evolution of the circuit dynamics and the internal device characteristics, in order to better inform future modeling and simulation efforts. Finally, we will share our progress on developing a thorough radiometric characterization of event cameras as part of JHU/APL's series of photonic experiments. We will present a suite of experiments designed to probe the sensitivity of event cameras and determine characteristics such as the bi-directional contrast ratios required for detection at various device settings. These experiments include flood illumination, point source generation, and the development of a novel scene projection technology capable of presenting a radiometrically calibrated and temporally commensurate scene to commercial-off-the-shelf (COTS) event cameras.
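As an indication of the kind of calculation the first contribution entails, the following is a minimal sketch of a target-versus-background link budget under assumed values; the zero-point flux, optics parameters, and single-threshold detection criterion are illustrative placeholders, not the model presented in the talk.

import math

def detectable(target_mag, background_mag_arcsec2, aperture_m=0.1,
               throughput=0.7, qe=0.6, pixel_fov_arcsec=5.0, C=0.2):
    # Rough event-detectability check: does the target raise the
    # per-pixel log signal above the background by more than the
    # contrast threshold C? All parameter values are assumptions.
    PHI0 = 1e10  # assumed V-band zero-point photon flux, ph/s/m^2
    area = math.pi * (aperture_m / 2) ** 2
    # Photon rates reaching one pixel (target unresolved; background
    # scaled by the pixel's solid angle in arcsec^2).
    target = PHI0 * 10 ** (-0.4 * target_mag) * area * throughput * qe
    bg = (PHI0 * 10 ** (-0.4 * background_mag_arcsec2)
          * pixel_fov_arcsec ** 2 * area * throughput * qe)
    # An event fires if the log photocurrent jump exceeds the threshold.
    return math.log((bg + target) / bg) >= C

print(detectable(target_mag=12.0, background_mag_arcsec2=21.0))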

Works Cited

[1] M. McHarg, R. Balthazor, B. McReynolds, D. Howe, C. Maloney, D. O'Keefe, R. Bam, G. Wilson, P. Karki, A. Marcireau and G. Cohen, "Falcon Neuro: an event-based sensor on the International Space Station," Optical Engineering, 2022.

[2] H. Rebecq, D. Gehrig and D. Scaramuzza, "ESIM: An Open Event Camera Simulator," in Conference on Robot Learning, 2018.

[3] C. Benson and M. Holzinger, "Simulation and Analysis of Event Camera Data for Non-resolved Objects," The Journal of the Astronautical Sciences, 2023.

[4] P. McMahon-Crabtree and D. Monet, "Commercial-off-the-shelf event-based cameras for space surveillance applications," Applied Optics, p. G144, 2021.

Date of Conference: September 16-19, 2025

Track: SDA Systems & Instrumentation
