Pilot Error Doesn't Matter

  • Published by the Air Force Safety Center
What!? Why is a human factors "expert" telling me that pilot error doesn't matter? Isn't it his job to preach the importance of human error? Well, actually, no. I am sure at one point or another you have heard the phrase "Errare humanum est," or for you non-Latin types, "to err is human." The issue is not that we screw things up. Humans will always err. We need to focus on how we limit the total number of screw-ups and minimize their effects.

You have all been in a safety briefing or commander's call and heard the proverbial, "Ya da, ya da, ya da, and see, if they would have just followed their procedures/TO this never would have happened." This is all too often followed by the proposed solution: preach the need for better decisions, better attention, and strict checklist adherence. That's not to say that we shouldn't always strive to improve our decision making processes and checklist discipline, but it places the emphasis in the wrong area. It places an emphasis on individuals to the exclusion of the operational contexts that influence those errors. It also fails to utilize the most effective methods available to ensure those errors do not occur again.

When I first arrived here at the Safety Center, I read the Department of Defense Standard Practice for System Safety, MIL-STD-882D. Buried within this brief and obscure standard is a method for identifying potential mishap risk mitigation measures. It's called the Design Order of Precedence (DOP), and its key steps include, in order of precedence: eliminating hazards through design selection; incorporating safety devices; providing warning devices; and lastly, if none of the preceding measures adequately lowers the mishap risk, developing procedures and training.
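The DOP is, at bottom, an ordered list in which the first feasible mitigation wins. A minimal sketch of that logic (the function name and structure are hypothetical illustrations, not anything from MIL-STD-882D itself):

```python
# The Design Order of Precedence as an ordered list, most effective first.
DESIGN_ORDER_OF_PRECEDENCE = [
    "eliminate hazard through design selection",   # most effective
    "incorporate safety devices",
    "provide warning devices",
    "develop procedures and training",             # least effective
]

def select_mitigation(feasible):
    """Return the highest-precedence mitigation judged feasible for a hazard.

    `feasible` is a set of mitigation names; returns None if none apply.
    (Hypothetical helper for illustration only.)
    """
    for mitigation in DESIGN_ORDER_OF_PRECEDENCE:
        if mitigation in feasible:
            return mitigation
    return None

# Even when new procedures and training are available, a feasible design
# fix outranks them:
print(select_mitigation({
    "develop procedures and training",
    "eliminate hazard through design selection",
}))
```

The point of the ordering is that the options lower on the list depend on fallible human performance, which is exactly why the standard reaches for them last.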

Recently, I experienced an operational situation in which the DOP was employed to prevent the near permanent hearing loss of the mishap crew. The family (my wife, our three kids and I) began our daily trip to Wal-Mart in the trusty minivan when I heard the all-too-familiar scream from my four-year-old. Having just transitioned to a booster seat and now being responsible for her own buckling, my daughter became quickly frustrated with her inability to get the buckle to snap closed. Amidst her high-pitched, ear-piercing shrieks of frustration, my six-year-old replied calmly with a sigh of annoyance, "Samantha, you always try to use the wrong buckle." She was sitting next to the window, so he took the center lap belt and snapped it into the corresponding receiver, removing all possibility of her buckling her shoulder harness into the wrong receptacle. I turned to my wife and remarked, "Our six-year-old is a genius. He identified a source of human error and applied the highest level of the system safety design order of precedence for mitigating identified hazards: completely eliminating the hazard through design selection." And of course my wife's reply was, "You're a dork." But this example raises the question: if my six-year-old gets it, why do we as an Air Force continually reach first for the least effective method by developing new procedures or training? The Air Force in me wanted to yell at my daughter that if she just used the correct receiver she wouldn't have a problem.

So how do we ensure a transition to a culture that adequately addresses the operational contexts that influence human errors and properly employs the most effective methods to mitigate those errors? I believe the Department of Defense Human Factors Analysis and Classification System (DoD HFACS) will be fundamental to this needed paradigm shift. DoD HFACS is currently employed by all the services as a mishap classification tool, but its concepts could also be applied to mishap prevention, performance enhancement, and process streamlining efforts. What makes this model so useful is that it forces individuals to look beyond the actions or inactions of the operators. Investigators must also address the underlying conditions of the operators and the supervisory/organizational contexts in which they were placed. For instance, if an Operations Officer were to routinely authorize a mission or mission element that was hazardous without sufficient need, it would be critical to identify this unnecessary source of risk. Likewise, at the organizational level, numerous factors can affect the risk of an operation. For example, inadequate program oversight/program management for a particular MDS could lead supervisors to task crews with missions they are inadequately equipped for, and thus contribute to crew members seeking unapproved off-the-shelf equipment, which could lead to a dangerous situation.
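The layered view HFACS enforces can be sketched as a simple taxonomy: every finding is filed not just under the operator's act, but potentially under the conditions, supervision, or organization behind it. In this sketch the tier names reflect the four DoD HFACS levels, while the descriptions and the helper function are illustrative assumptions, not the full taxonomy:

```python
# The four tiers of DoD HFACS, from the sharp end to the blunt end.
# Descriptions are paraphrased examples, not official definitions.
HFACS_TIERS = {
    "acts": "errors and violations committed by the operator",
    "preconditions": "underlying conditions of the operator and environment",
    "supervision": "supervisory influences, e.g. authorizing hazardous missions",
    "organizational influences": "resource, climate, and process factors",
}

def classify_finding(tier, description):
    """File a mishap finding under an HFACS tier (hypothetical helper)."""
    if tier not in HFACS_TIERS:
        raise ValueError(f"unknown HFACS tier: {tier}")
    return {"tier": tier, "finding": description}

# A finding from the Ops Officer example lands above the individual operator:
finding = classify_finding(
    "supervision",
    "routinely authorized a mission element hazardous without sufficient need",
)
print(finding["tier"])
```

The value of the structure is that an investigation is not finished when the "acts" tier is filled in; the model prompts the investigator to ask what belongs in the other three.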

During a recent class A mishap board, I made the mistake of writing a sentence weighted heavily in DoD HFACS jargon: "In an effort to further explain the contributing context, the latent preconditions of the operators will be discussed." The lead investigator quickly informed me this was a dork statement and told me to reword it in plain English. I tried to convince him that we all would be talking like this someday, but he didn't buy it. My objective with this article is not necessarily to get Joe Aircrew to start using terms like "latent preconditions," but to get him thinking in the general concepts this term (and others like it) seeks to describe. We need to always address not just the poor actions of individuals, but also their underlying conditions and the failed supervisory/organizational defenses.

Even if one can identify the deficiencies, it can sometimes be hard to believe that you can actually effect change in such a large and bureaucratic system. There are, in fact, numerous programs and advocates to help even unit-level Airmen effect real change. The challenge is being aware of these various resources. Case in point: how many of us are adequately familiar with the growing initiatives of Human Systems Integration or AFSO21?

Are you familiar with the efforts and resources provided by Team Aerospace? On a daily basis your Flight Medicine, Public Health, Bioenvironmental Engineering, Aerospace Physiology, and Health Promotion counterparts work to optimize and sustain your performance. For example, under the larger umbrella of Human Systems Integration (HSI), Team Aerospace works with operators to develop a "comprehensive strategy used early in the acquisition process to optimize total system performance, minimize total ownership costs, and ensure that the system is built to accommodate the characteristics of the user population that will operate, maintain, and support the system." One initiative serving this goal of cradle-to-grave integration of the human weapon system is the Capability Gap Analysis (Cap Gap) program described in AFI 48-101, Aerospace Medicine Operations. Through the Cap Gap process, Team Aerospace works to develop solutions for operational needs identified by individuals in the field. Working with Team Aerospace provides one more avenue to identify and develop local solutions and get those needs systematically forwarded up for evaluation, prioritization, and eventual resolution.

AFSO21 provides another example of the many avenues for change. In general, AFSO21 asks, "Can this organization manned with these people with this training and equipment perform these tasks to the right standard under these conditions?" Key to the program is ensuring that all Airmen understand their role, develop the ability to effect change, and continuously learn new ways to improve processes in their daily activities in order to save resources, eliminate waste, and increase performance. AFSO21 is just one of numerous avenues available to those faced with a seemingly bureaucratic and immovable system. Other examples include the Air Force Idea Program, formal suggestions through one's chain of command, AF Form 847/AFTO Form 22 submissions, and the list goes on.

In summary, we need to move away from a "blame the individual" culture to one that traces human error across all the levels outlined by DoD HFACS. Then and only then can we begin to systematically address the errors which lead to organizational failure (major accidents). The task may seem daunting, especially for those young Airmen "turning the wrench" or "flying the line," but we can and do have the resources and programs available to effect real change. I am not the first to suggest the need for such a paradigm shift; I am merely reciting the thoughts of those much smarter than I. If you want a short, easy read on this subject, review Sidney Dekker's "The Field Guide to Understanding Human Error." For the more ambitious, the lengthy psychobabble of James Reason's "Human Error" is good. These two books are staples for any serious student of human error. And we are all students of human error. For me at least, this is one test that I don't want to fail.