By Kevin Meyer
Regular readers know that the "respect for people" pillar of lean is particularly important to me – in fact I consider it the most critical. My organization often hears me say that 90% of problems are the result of poor processes, not people, and that most of the remainder are probably due to poor leadership.
The FAA appears to agree. A couple of years ago I told you how the agency had created a nonpunitive reporting system for air traffic controllers to report incidents. That led to a dramatic change, one that might initially be seen as a scary negative but, as most of us immediately realize, is a huge positive:
New numbers released by the Federal Aviation Administration show reports of air-traffic errors have nearly doubled in three years. The number of reported incidents in 2007 was 1040, and that number rose to 1887 in 2010, an 81 percent increase. This cultural change in safety reporting has produced a wealth of information to help the FAA identify potential risks in the system and take swift action to address them.
The FAA was apparently so impressed that it has now expanded the program to other functions.
Federal Aviation Administration officials announced that as part of a new "safety culture" at the agency, they would fully embrace nonpunitive reporting systems, in an effort to generate information that could expose bigger dangers.
The FAA took a half-step in that direction in 2008, creating a nonpunitive reporting system for air traffic controllers. On Wednesday, the FAA said it was expanding the program to employees who maintain radar installations and other systems.
"Make no mistake about it: We don't condone (errors)," said David Grizzle, the FAA's Chief Operating Officer. "However, we presume the good intent of our controllers and are more interested in the free flow of information than we are in punishing for errors."
The FAA's decision also points to a problem many organizations face as they improve – it's easy, but very dangerous, to rest on early success.
In announcing the new reporting systems, the FAA cast itself as a victim of its own success. FAA officials said they used to be able to measure risk by counting accidents. The fewer crashes, the better the agency was doing.
But with the commercial aviation accident rate at historically low levels — there has not been a fatal commercial crash in three years — the agency needs to look at other data to identify risky behaviors and incidents, and to address them.
This reflects a common misperception. As a process becomes more stable, the tendency and desire is to measure less and move on to something else. But in reality the black swans – the long-tail events that hide inside broad statistical averages – are the most disruptive and dangerous. Instead of measuring less frequently, you should measure more frequently to uncover and address the next layer of process variation.
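A minimal sketch of this idea, with all numbers invented for illustration: a year of daily "incident counts" from a process that looks perfectly stable in monthly averages, yet contains three rare long-tail spikes that only show up when you measure at the daily level.

```python
# Hypothetical data (invented for illustration): daily incident counts for
# a "stable" process -- mostly 2 per day, plus three rare spikes of 40,
# the long-tail events that broad averages conceal.
daily = [2] * 365
for day in (45, 190, 330):  # arbitrary spike days
    daily[day] = 40

# Coarse measurement: 30-day averages -- the "measure less often" approach.
monthly_means = [sum(daily[i:i + 30]) / 30 for i in range(0, 360, 30)]

# Fine measurement: inspect every daily value against a simple threshold.
THRESHOLD = 10
spikes_seen_daily = sum(1 for v in daily if v > THRESHOLD)
spikes_seen_monthly = sum(1 for m in monthly_means if m > THRESHOLD)

print(spikes_seen_daily)    # → 3  (the long-tail events)
print(spikes_seen_monthly)  # → 0  (the averages smooth them all away)
```

A month containing one 40-incident day still averages only about 3.3 incidents per day, so every monthly figure sits comfortably below the threshold while the underlying process has failed badly three times.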
The FAA seems to understand that – and the importance of enabling and ensuring people are part of the data collection process. With rare exceptions the problems are in the processes, not the people.