Ruth Stilwell, Aerospace Policy Solutions, LLC; Kyle Quakenbush, MITRE; John Giles, Space Domain Strategies; Nathaniel Dailey, MITRE & Space Force Association
Keywords: Human Factors, Space Operations, Space Domain Awareness (SDA), Space Situational Awareness (SSA), Safety Management, Automation, Crew Resource Management (CRM), Threat and Error Management (TEM), Fatigue Risk Management Systems (FRMS), Safety Reporting, Human-Autonomy Teaming
Abstract:
Human Factors in Space Operations: Leveraging Aviation Safety Frameworks for Enhanced Orbital Domain Resilience
Dr. Ruth Stilwell, John Giles, Dr. Nate Dailey
This paper examines the human factors parallels between Space Domain Awareness (SDA) occupations and other continuous (24/7/365) safety-critical services, offering insights into safety management successes and best practices. It draws on NASA’s System-Wide Safety project in aviation, which addresses scaling safety management from human-centric operations to a digitally transformed, automation-integrated infrastructure. These findings are highly relevant to the evolving space domain, particularly as commercial space operations reshape orbital management and ground-based oversight.
The rise of commercial space operators and Space Situational Awareness (SSA) providers, coupled with increasing automation, highlights a gap: this emerging field lacks the institutional frameworks that guide human factors considerations in military or automation-reliant services like air traffic control. In contrast, aviation’s decades of human factors research—encompassing Crew Resource Management (CRM), Threat and Error Management (TEM), and Fatigue Risk Management Systems (FRMS) within a Safety Management System (SMS)—have driven continuous safety improvements, even amid rapid growth.
NASA’s System-Wide Safety project builds on this foundation to tackle challenges from autonomous aircraft, Advanced Air Mobility (AAM), and automation’s growing decision-making role. Leveraging these efforts informs human factors risk management for space operators as their operational environments converge with aviation’s. The shift from reactive to proactive and predictive safety intelligence hinges on the availability of safety risk data and a deeper understanding of human-autonomy teaming in increasingly automated settings. The 2018 National Academies report, In-Time Aviation Safety Management, underscores the need to collect human performance data, with emphasis on “elevated risk states” (e.g., fatigue, inattention) and the integration of behavioral psychology into safety systems.
Proactive safety reporting is vital for capturing human performance data, serving as an early warning system for threats, and measuring mitigation effectiveness. Yet its success depends on a safety culture that encourages anti-error behaviors rather than punishing mistakes or the act of reporting them. Understanding the root causes of human error is as critical as analyzing the errors themselves, because “Satellites don’t have accidents in orbit—people fail on the ground.”
Unlike aviation, where human factors research often follows catastrophic events, the satellite operator community cannot afford a reactive approach. A single orbital mishap has lasting consequences for the domain’s safety, underscoring the urgency of proactive human factors strategies.
Date of Conference: September 16-19, 2025
Track: Space Domain Awareness