Ryan Jochims, University of Houston; Mike Bastidas, University of Houston; Samantha Gurney, University of Houston
Keywords: Optics, Machine Learning, Convolutional Neural Network, Airglow detection, real-time imaging, gravity waves, atmospheric monitoring, all-sky imaging, autonomous optical systems, HAARP, Poker Flat Research Range, remote sensing, space weather, aeronomy
Abstract:
Airglow is an atmospheric phenomenon characterized by a faint glow similar to that of the aurora. Unlike the aurora, however, airglow occurs globally rather than being localized at the Earth’s magnetic poles. All-sky imagers measure airglow intensity using extremely wide-angle optics. The targeted emission spectra include atomic oxygen, sodium, and hydroxyl compounds at 558-630, 589, and 700-900 nm, respectively. Airglow provides a useful tool for studying gravity waves, oscillations of air parcels caused by frontal systems and small-scale weather events, because gravity waves cause airglow emission intensity to fluctuate. Gravity waves are of special interest due to their role in momentum transfer across different layers of the atmosphere, and research on them is important for weather forecasting, climate modeling, and aviation safety. However, traditional all-sky imaging systems are limited in their effectiveness for this research, mainly because they are stationary, manually operated, and non-specific in their data collection.
To overcome these limitations, the All-Sky Imaging science team from the University of Houston’s sixth iteration of the Undergraduate Student Instrumentation Project (USIP VI) has developed a dual optical array system for automatic real-time image collection. This proof-of-concept system improves imaging by pairing a primary camera with an approximately 180-degree field of view and a secondary camera with an approximately 80-degree field of view. The primary camera performs initial detection of airglow using a layered convolutional neural network trained on over a decade of historical all-sky image data collected at the Poker Flat Research Range optical observatory near Fairbanks, Alaska. The system first applies a ResNet50 architecture for binary classification of airglow presence, then refines detection with the You Only Look Once (YOLO) object detection algorithm. Once trained, the YOLO model identifies the prominent airglow features and draws bounding boxes around the corresponding regions of the real-time image. After a positive detection, the secondary camera automatically reorients itself toward the airglow for direct, high-resolution imaging. The 80-degree wide-angle camera system is equipped with six narrowband filters at the following wavelengths (nm): 550±10, 600±5, 630±10, 650±2, 670±4, and 705±4. Each narrowband filter isolates a specific molecular or atomic species, allowing it to be observed as gravity waves propagate through the skies above Poker Flat.
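The hierarchical detection described above (a binary classification gate followed by object detection with bounding boxes) can be sketched as a simple control flow. The `classify` and `detect` callables below are hypothetical stand-ins for the trained ResNet50 and YOLO models, and the gate threshold is an illustrative assumption, not a detail of the team's actual implementation:

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Detection:
    """A detected airglow region: pixel-space bounding box and model confidence."""
    box: Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max)
    confidence: float

def detection_pass(frame,
                   classify: Callable[[object], float],
                   detect: Callable[[object], List[Detection]],
                   gate_threshold: float = 0.5) -> List[Detection]:
    """One hierarchical pass over an all-sky frame: run the cheap binary
    gate first, and invoke the heavier detector only on frames the gate
    classifies as containing airglow."""
    if classify(frame) < gate_threshold:
        return []  # no airglow present: skip detection entirely
    # Keep only regions the detector is sufficiently confident about.
    return [d for d in detect(frame) if d.confidence >= gate_threshold]
```

In this sketch, a returned non-empty list would trigger the handoff to the secondary camera, with each bounding box supplying the region used for reorientation.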
Reorientation of the secondary camera is achieved through algorithmic region detection in the ultra-wide images, from which azimuth and altitude are calculated relative to the imager's origin. The dual cameras and hierarchical neural networks run on two commercial single-board computers (SBCs) communicating over serial and Transmission Control Protocol (TCP) links. Using the azimuth and altitude derived from a positive detection, two motors in a pan-tilt configuration drive the secondary camera to the target orientation.
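The azimuth/altitude calculation from an all-sky pixel can be sketched under the common assumption of an equidistant (f-theta) fisheye projection, in which the zenith angle grows linearly with pixel distance from the image center. The function name, the north-up orientation, and the 90-degree horizon at the sky-circle edge are illustrative assumptions rather than details of the deployed system:

```python
import math

def pixel_to_az_alt(px: float, py: float,
                    cx: float, cy: float,
                    radius_px: float) -> tuple:
    """Map a fisheye-image pixel to (azimuth, altitude) in degrees,
    assuming an equidistant projection: zenith angle is proportional
    to distance from the image center, reaching 90 degrees (the
    horizon) at `radius_px`."""
    dx, dy = px - cx, py - cy
    r = math.hypot(dx, dy)
    if r > radius_px:
        raise ValueError("pixel lies outside the sky circle")
    zenith = 90.0 * (r / radius_px)   # degrees away from straight up
    altitude = 90.0 - zenith          # degrees above the horizon
    # Azimuth measured clockwise from north (top of the image);
    # image y increases downward, hence the -dy.
    azimuth = math.degrees(math.atan2(dx, -dy)) % 360.0
    return azimuth, altitude
```

The resulting angle pair is the form of target a pan-tilt mount can consume directly, with azimuth driving the pan motor and altitude the tilt motor.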
Continuing this proof of concept, the prototype system was first deployed during the January 2025 High-frequency Active Auroral Research Program (HAARP) campaign in Gakona, Alaska. After the campaign concluded, the system was installed at the Poker Flat Research Range, with further improvements to be implemented during a three-week campaign conducted by the USIP VI team in March 2025 in Fairbanks, Alaska. This continually developing system will enable new approaches to automatic image evaluation for optical and atmospheric applications.
Date of Conference: September 16-19, 2025
Track: Atmospherics/Space Weather