By Kevin Meyer
Regular readers know that the "respect for people" pillar of lean is particularly important to me – in fact I consider it to be the most critical. My organization often hears me say that 90% of problems are the result of poor processes, not people, and 9% of the remaining are probably due to poor leadership.
The FAA appears to agree. A couple of years ago I told you how the organization had created a nonpunitive reporting system for air traffic controllers to report incidents. That led to a dramatic change, which might initially be seen as a scary negative but, as most of us immediately realize, is a huge positive:
New numbers released by the Federal Aviation Administration show reports of air-traffic errors have nearly doubled in three years. The number of reported incidents in 2007 was 1040, and that number rose to 1887 in 2010, an 81 percent increase. This cultural change in safety reporting has produced a wealth of information to help the FAA identify potential risks in the system and take swift action to address them.
The FAA was apparently so impressed that they have now expanded the program to other functions.
Federal Aviation Administration officials announced that as part of a new "safety culture" at the agency, they would fully embrace nonpunitive reporting systems, in an effort to generate information that could expose bigger dangers.
The FAA took a half-step in that direction in 2008, creating a nonpunitive reporting system for air traffic controllers. On Wednesday, the FAA said it was expanding the program to employees who maintain radar installations and other systems.
"Make no mistake about it: We don't condone (errors)," said David Grizzle, the FAA's Chief Operating Officer. "However, we presume the good intent of our controllers and are more interested in the free flow of information than we are in punishing for errors."
The FAA's decision also points to a problem many organizations face as they improve – it's easy but very dangerous to rest on positive improvement.
In announcing the new reporting systems, the FAA cast itself as a victim of its success. FAA officials said they used to be able to measure risk by counting accidents. The fewer crashes, the better the agency was doing.
But with the commercial aviation accident rate at historically low levels — there has not been a fatal commercial crash in three years — the agency needs to look at other data to identify risky behaviors and incidents, and to address them.
It's analogous to a common misperception. As a process becomes more stable, the tendency and desire is to measure less and move on to something else. But in reality the black swans, the long-tail events that hide in broad statistical analysis, are the most disruptive and dangerous. Instead of measuring less frequently, you should measure more frequently to uncover and address the next layer of process variation.
The FAA seems to understand that – and the importance of enabling and ensuring people are part of the data collection process. With rare exceptions the problems are in the processes, not the people.
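To make that measurement point concrete, here is a minimal sketch of my own (the numbers are invented for illustration, not FAA data): a stable-looking process with a few rare spikes, where coarse monthly averages smooth the spikes away but a simple daily three-sigma check flags them.

```python
# A minimal sketch (invented data) of why coarse, infrequent measurement
# can hide long-tail events that frequent measurement exposes.
import random
import statistics

random.seed(42)

# Simulate 360 days of a "stable" process metric, then add three rare spikes.
daily = [random.gauss(10.0, 1.0) for _ in range(360)]
for day in (90, 200, 310):  # hypothetical long-tail events
    daily[day] += 8.0

# Coarse view: monthly averages look calm and hide the spikes.
monthly = [statistics.mean(daily[m * 30:(m + 1) * 30]) for m in range(12)]
print("monthly averages:", [round(m, 2) for m in monthly])

# Frequent view: a crude 3-sigma check on the daily data flags the outliers.
mean, sd = statistics.mean(daily), statistics.stdev(daily)
outliers = [d for d, x in enumerate(daily) if abs(x - mean) > 3 * sd]
print("days flagged by daily 3-sigma check:", outliers)
```

Run frequently, even a crude check like this surfaces the handful of days worth investigating; roll the data up into a summary number and they disappear into the average.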
Mark Graban says
Healthcare needs to learn these lessons. Healthcare author Naida Grunden (a friend of the famed Capt. Sully Sullenberger) is always advocating for a centralized, non-punitive reporting system and mechanism for healthcare errors, near misses, etc.
We have to address risky behaviors (and the system!) before they turn into near misses or patient harm.
Jack Parsons says
Individual negligence and sabotage are very rarely the causes of problems seen in a system, as most people want to do a job well. When we reward and punish workers for process and system variation, we encourage the hiding of problems. Of course leaders are responsible for the system, but many do not understand the principle of system variation and therefore may aggravate the situation.
Aaron says
Within the aviation industry as a whole, this has been pushed through by ICAO (International Civil Aviation Organisation) and the FAA has picked up on this, although it still lags some way behind Europe.
The Safety Management System that is being implemented internationally still allows for punishment of deliberate and knowing wrong-doing, but also takes into account motivation and company culture before deciding on any punitive action. But this decision-making process is clearly defined and known to all.
But it is still a culture change, and as such takes time to fully implement. It has taken almost 5 years to implement among the air traffic controllers, and this is a relatively small community. I hope people realise that it will take longer to fully implement across engineers and pilots.
Karen Martin says
Great post, Kevin! This is true re: reporting medication errors as well. When I work with hospitals to reduce medication errors, I simultaneously work the cultural angle to reduce fear and blame re: reporting alongside actual process change that incorporates error-proofing.
In a recent engagement, the number of reported errors rose 40% overnight and has risen from there. While this is fantastic news to me, I occasionally have to talk a few of the leadership team “off the ledge” because they fear the problem’s getting worse, not better.
Of course the team is continually improving the process based on far better data than they’ve ever had.
Robert Drescher says
Congrats to the FAA
Mistakes will happen, but by knowing what mistakes have happened and the circumstances around them, you can take action to prevent them in the future. When you do nothing but blame and punish people for them, people will find ways to hide them, and they will stay hidden until something very big happens to expose them. It is better that they get dealt with sooner so that actions can be taken to prevent them from recurring.
John Hunter says
Good post. Should “9% of the remaining are probably due to poor leadership” be 9%? As it stands, that is 90% system, 0.9% leadership (9% of 10% = 0.9%). Silly point, I know (or maybe I miss your intention); I just thought I would mention it (I am a bit overly focused on data and understanding data).
The point of using in-process and process-result measures on well-functioning processes is often overlooked. You don’t want to spend too many resources collecting data that has little value, but proper process measures are very useful and should be monitored. This also helps when you decide to improve (or radically change something somewhat related), because you can catch unintended consequences very quickly.
The point of understanding the data (in context) is critical. Brian Joiner did a very good job of emphasizing this idea, I think. If you want to reduce complaints it is usually pretty easy to do so, by making it really hard to complain. When you really care about customer focus, understanding whether complaints are up due to better processes that encourage complaints or because your service is lousy is critical.
Rick Bohan says
As I tell my clients, if finger pointing worked, all US organizations would be perfect by now.