I myself wrote on Wed, 2 Aug 1995 21:48:53 -0400 (EDT)
> The FAA monitors near misses and incidents, and has an elaborate system
> to get reports of unsafe situations. These are monitored as closely as
> accidents themselves (although a real loss-of-life accident is investigated
> more carefully, of course).
I've thought a little more about the FAA system and decided it has some
elements that might be useful for Geof's safety case and help make the US
aviation system more learningful.
As background, the FAA appears to pilots as pretty strict and tough. For a
professional pilot, the worst thing is to be discovered doing something
wrong by the FAA and have your license taken away. These stories are in the
flying magazines all the time.
There is a safety reporting system operated by NASA in which pilots can
report any unsafe incident, including those created by their own actions. If
you report the incident promptly on the NASA form, you are *immune* from
most FAA actions against your license based on that incident.
As a result, professional pilots routinely carry a couple of the NASA forms
and send one in after any questionable situation. I'm sure that the immunity
causes a *lot* more reports to flow into the system and makes it more
learningful.
I thought about the mental models underlying this system: The designers must
have been thinking, "Unsafe behaviors flow from problems with the system,
not just pilot error." If they were thinking "It's the people!" then they
would have a much "tougher" system focused on enforcement and punishment.
What can we do in our organizations to improve the flow of feedback about
unsafe or just ineffective things so our systems can be improved?
Richard Karash ("Rick") | <http://world.std.com/~rkarash>
Innovation Associates, Inc. | email: firstname.lastname@example.org
3 Speen St, Framingham MA 01701 | Host for Learning-Org Mailing List
(508) 879-8301 and fax 626-2205 | <http://world.std.com/~lo>