Quantifying human factors in complex sociotechnical systems using the FRAM


How do they make them work?

David Slater and Rees Hill


ABSTRACT


The Functional Resonance Analysis Method (FRAM) is being used more widely, not just as a visualisation to aid understanding of how teams in complex sociotechnical systems work together and adapt to the challenges that arise. A series of studies has now attempted to use the visualisation as a legitimate complex-system metamodel, producing quantitative estimates of system behaviours and functional effectiveness. A key aim of the original method was also to recognise and track the effects of real-world variability in the adequacy, reliability, and individual behaviours of component functions. The method enables the same scrutiny of human-contributed functions, which is yielding new insights into the importance of human adaptability in making these systems work in practice.


This is historically redolent of the work of Human Factors specialists tasked with producing human error data for the probabilistic logic-tree safety studies of engineering systems. This paper sets out to trace the influence of Human Factors thinking in the development of the FRAM approach and to propose a way to produce credible human factors insights through the quantitative system-modelling perspectives offered by FRAM.


To implement this in FRAM analyses, we propose extending the metadata approach and specifying more sophisticated algorithms in the functions' equations. For example:

  • We define a common probability of effectiveness to model the expected performance of functions in specific interactions.
  • We define Limits of Tolerability – the maximum levels of variability in Aspect effectiveness that can be tolerated before execution fails or back-up functions are invoked.
  • This approach shows how a Human Function can adapt to variability in a way not normally found (unless designed in) in technological functions.
  • By monitoring and learning the trends and patterns in variability, functions can respond intelligently and pre-emptively.
  • Borrowing a concept from bow-tie and Layers of Protection Analysis (LOPA) studies, we identify the checks and balances as “barriers” with probabilities of failure on demand (PFD). (ref)
  • By leveraging concepts from distributed computing, particularly through the application of algorithms inspired by the Byzantine Generals Problem.
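The mechanisms proposed above can be illustrated with a minimal sketch. All names, thresholds, and values below are illustrative assumptions for exposition, not quantities from any FRAM study: a function executes if its Aspect effectiveness meets a Limit of Tolerability (falling back to a back-up function otherwise), a barrier fails on demand with probability PFD, and a Byzantine-style majority vote cross-checks redundant functions.

```python
from collections import Counter

def barrier_holds(pfd: float, demand_draw: float) -> bool:
    """A protective check ('barrier') with a probability of failure on
    demand (PFD). demand_draw is a uniform [0, 1) sample for this demand;
    the barrier fails whenever the draw falls below the PFD."""
    return demand_draw >= pfd

def execute_function(effectiveness: float,
                     tolerability_limit: float,
                     backup_effectiveness: float = 0.0) -> str:
    """Execute a FRAM function whose incoming Aspect variability is
    summarised as an effectiveness value in [0, 1].

    - If effectiveness meets the Limit of Tolerability, execution succeeds.
    - Otherwise a back-up function is tried (effectiveness 0.0 means
      no back-up is available).
    """
    if effectiveness >= tolerability_limit:
        return "success"
    if backup_effectiveness >= tolerability_limit:
        return "success via backup"
    return "failure"

def majority_vote(outputs: list) -> object:
    """Byzantine-style cross-check: accept the value reported by a strict
    majority of redundant functions, tolerating a discrepant minority."""
    value, count = Counter(outputs).most_common(1)[0]
    if count > len(outputs) // 2:
        return value
    raise ValueError("no majority - too many discrepant functions")
```

For example, `execute_function(0.5, 0.8, backup_effectiveness=0.85)` returns `"success via backup"`, and `majority_vote(["open", "open", "closed"])` returns `"open"`, masking the single discrepant report. In a fuller model these probabilities would be sampled per instantiation and propagated through the coupled functions of the FRAM network.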