By Kevin Meyer
Thanks to Twitter, again, I recently came across one of the most insightful articles I've read in a long time – on a safety blog no less. Steven Shorrock takes on the concept of "human error" and adds considerably more perspective to this oft-used term.
In the aftermath of the [Spain rail] accident, initial investigations ruled out mechanical or technical failure, sabotage and terrorism. That appeared to leave only two possible explanations – ‘human error’ or ‘recklessness’, or both. When society demands someone to blame, the difference – whatever it might be – can seem trivial. What followed was a display of our instinct to find a simple explanation and someone to blame.
When something bad happens – an accident like this, an inappropriate comment, a disastrous rollout of a website – we seek action, fast. We want accountability, and usually by that we mean rolling heads. Rarely do we probe further into the context of the event, and especially into the system. As just one example:
Several claims appeared about the driver in the media, often without relevant context. It was reported that the driver “admitted speeding” on the occasion of the crash. However, there appears to be no evidence that the ‘speeding’ involved conscious disregard for, or indifference to, the dangers of the situation or for the consequences of his actions. This would have been an extreme act. Rather, it seems that the driver was unaware of the context.
Shorrock goes on to bemoan the effect of overuse of the term "human error."
Indeed, the popularisation of the term ‘human error’ has provided perhaps the biggest spur to the development of human factors in safety-related industries – with a downside. When something goes wrong, complexity is reduced to this simple, pernicious, term. ‘Human error’ has become a shapeshifting persona that can morph into an explanation of almost any unwanted event. It is now almost guaranteed to be found in news stories pertaining to major accidents.
And then eventually comes up with the reasons why he is abandoning the term:
- ‘Human error’ is often a post hoc social judgement. ‘Human error’ is one of few things that often cannot be defined unambiguously in advance of it happening.
- ‘Human error’ requires a standard. To know that something is an error, it must be possible to describe a non-error. This can be surprisingly difficult, partly because there are so many “it depends”. In the context of complex interacting systems such as ATC, there are many ways to get an acceptable result.
- ‘Human error’ points to individuals in a complex system. In complex systems, system behaviour is driven fundamentally by the goals of the system and the system structure. People provide the flexibility to make it work.
- ‘Human error’ stigmatises actions that could have been heroic in slightly different circumstances. What are described as heroic actions could often have been described as tragic errors if the circumstances were only slightly different. The consequences of heroic actions are not known in advance.
- Underlying processes of ‘human error’ are often vital for task performance. In the context of error, we often refer to psychological activity involved in perception, memory, decision making or action. Taking one example, without expectation, radio-telephony would be very inefficient. Occasionally, one may hear what one expects instead of what is said, but this must be set against improved efficiency during thousands of other occasions.
- ‘Human error’ is an inevitable by-product of the pursuit of successful performance in a variable world. The context and conditions of performance are often vague, shifting and suboptimal. The ability to adapt and compensate comes at a cost.
Powerful points: the need for standards, the importance of context, the fine line between error and heroism, and perhaps even the positive aspects of error when pushing boundaries.
So what does Shorrock now use instead of "human error"?
Left with a ‘human error’-shaped hole in my vocabulary several years ago, I found an alternative concept thanks to Erik Hollnagel: performance variability. This is not simply a replacement term or a euphemism, but a new way of thinking that acknowledges how systems really work. Performance variability, both at an individual level and at a system or organisational level, is both normal and necessary, and it is mostly deliberate.
Something to think about next time an incident happens and you feel that instinctual urge to blame, instead of understand.
Mark Graban says
It sounds like the author understands the difference between “common cause” variation and “special cause” variation. The driver and his behavior might be within the bounds of “common cause” variation amongst drivers, meaning his skill, attentiveness, etc. aren’t really outliers on the bad side. He could be just unlucky. The accident might have occurred with a different driver in the same circumstances.
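The common-cause/special-cause distinction can be made concrete with a simple control-chart check. Below is a minimal sketch of my own (the 3-sigma Shewhart limits and the sample data are illustrative assumptions, not from the comment): an observation counts as "special cause" only when it falls outside limits computed from baseline performance.

```python
# Illustrative only: classify an observation as common-cause variation
# (within control limits) or special-cause variation (outside them),
# using simple Shewhart 3-sigma limits from a baseline sample.
from statistics import mean, stdev

def control_limits(baseline):
    """Return (lower, upper) 3-sigma control limits for a baseline sample."""
    centre = mean(baseline)
    sigma = stdev(baseline)
    return centre - 3 * sigma, centre + 3 * sigma

# Hypothetical baseline, e.g. speeds of many drivers on the same stretch.
baseline = [78, 81, 80, 79, 82, 80, 77, 81]
low, high = control_limits(baseline)

def is_special_cause(x):
    """True only if x falls outside the system's normal variation."""
    return x < low or x > high
```

The point echoes Deming: a result inside the limits means the individual is behaving like the rest of the system, so replacing or blaming that person changes nothing about the outcome distribution.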
That’s where I’ve found the “Just Culture” methodology helpful, and I think it applies outside of healthcare. It’s a flow chart to help determine whether it’s a system problem or an individual problem (it’s almost always a system problem). There’s the “substitution test,” asking if a similarly skilled and trained professional might have made the same error or taken the same actions in the same setting… if so, it’s a system problem.
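The "substitution test" described here is essentially a short decision procedure. A hypothetical sketch of that triage flow (the question names and their ordering are my simplification, not the official Just Culture algorithm):

```python
# Hypothetical sketch of a Just Culture-style incident triage.
# The inputs and ordering are simplified illustrations of the flow chart
# the comment describes, not the official methodology.
def triage(intended_harm, substitution_test_passes, history_of_similar_errors):
    """Return a rough classification of an incident."""
    if intended_harm:
        # Deliberate or reckless acts are the rare individual-accountability case.
        return "individual accountability"
    if substitution_test_passes:
        # A similarly skilled, trained peer would likely have acted the same way.
        return "system problem"
    if not history_of_similar_errors:
        # A one-off slip in context still points at the system, not the person.
        return "system problem"
    return "coaching / individual review"
```

Note how the branches are stacked: most paths terminate at "system problem," which mirrors the comment's observation that it's almost always a system problem.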
Scott Maruna says
Several years ago I went through some training that was marketed as getting to the root cause of safety issues. I will not mention the name of the resource, but its logic drove roughly 80% of the issues to human error. What I found in using this tool was that the “human” who was actually involved did not create the “error.” It was usually a process problem, a management problem, or what in PSM parlance is called normalization of deviance. In all those incidents the issue was rarely, if ever, the person across the table answering the questions during the investigation. Manifestation is rarely causation.
John Hunter says
Well said. I have posted on this idea several times.
Bart S says
Excellent observation! Asking 5 Whys will also help us come up with concrete improvements/poka-yokes, if we give honest answers, and do not cop out with “human error.”
Elite Machinery says
I don’t know if human error is 100% avoidable, but one trait that is helpful in avoiding human errors is empathy. If you can view things from the varying perspectives of others, it leaves much less room for error. I believe this trait is largely inherent and less likely to be developed in an individual.