a FRAM-STPA approach | David Slater and James Pomeroy
David Slater, Cardiff University, dslater@cambrensis.org
James Pomeroy, Cranfield Safety and Accident Investigation Centre, steadypom@gmail.com
ABSTRACT
The Challenger disaster remains a critical case study of the consequences of organizational “culture” for safety, with previous analyses often focusing on singular causes such as the “normalization of deviance.” This paper seeks to provide a more nuanced understanding through a dual application of two systemic analysis methods: System-Theoretic Process Analysis (STPA) and the Functional Resonance Analysis Method (FRAM). The analysis re-examines the Challenger disaster by mapping the hierarchical structure of NASA and its contractors, highlighting decision-making processes at the macro, meso, and micro levels. STPA reveals specific Unsafe Control Actions (UCAs) and control loop deficiencies, exposing gaps in NASA’s risk management and communication. Simultaneously, FRAM models trace critical functional variability across NASA’s organizational levels. The combined approach uncovers how political and budgetary constraints, normalized risk-taking, and diluted engineering feedback cumulatively degraded decision-making integrity, ultimately contributing to the Challenger’s tragic launch decision. The lessons drawn emphasize the need for resilient structures, adaptable feedback mechanisms, and a culture that values caution alongside achievement. This analysis underscores the potential of integrating STPA and FRAM in complex systems to enhance safety, anticipate organizational issues, and facilitate successful operations.
Key Words: Accidents, Organisations, Systemic analysis, STPA, FRAM
INTRODUCTION
Pomeroy (1) notes that studying safety and risk often means examining disasters through the lens of theories, which can narrow our understanding of their complexity. In this view, well-known catastrophes are often paired with catchy, simplified explanations. For example, the Chernobyl disaster is frequently attributed to a lack of “safety culture” (1), Aberfan is described as a “man-made disaster” caused by a “failure of foresight” (2), and the Challenger explosion is linked to an organization that became “normalized to deviance” (3).
He adds that although such theories provide useful frameworks, they risk oversimplifying disasters into neat abstractions that fail to capture the full context, conditions, and personalities involved. This narrowed focus can limit our perspective, framing our understanding of events through others’ interpretations and potentially stifling deeper insights.