What Are "Human Factors"?

  • By SEFL Staff
How did that tool get in the aircraft intake? Why was that tire flat? How did that bomb fall off the forklift? How did you crash your car in broad daylight? Why did you land short? How could you be so human? According to Air Force Safety Center analysis, human factors are the cause of 60-80 percent of accidents in a complex system. But what are human factors? 

Human factors are scientific facts about human characteristics. The term covers all biomedical and psychosocial considerations. In other words, the brain, the body, and stuff we humans use. Human factors allow us to look at not just individual human failures, but the failures in the systems that humans design, build, operate and maintain. When we hear of a mishap, we often ask the question, "How could that person have been such an idiot?" However, human beings, even well-intentioned ones, will invariably make errors in a complex system. Most, if not all, the work we do in the U.S. Air Force is considered working in complex systems. Humans make errors -- that is a fact. We cannot change that condition. Humans will continue to get distracted, have a finite attention span, get tired, be confused and have fluctuations in motivation. We can change the conditions in which humans work, though. By better understanding how mishaps in complex systems occur, we can better change these conditions to prevent future mishaps. 

Mishaps are rarely attributed to a single cause, or in most instances, even to a single individual. Latent failures are errors in design, organization, acquisition or training that lead to operator errors, and whose effects typically lie dormant in the system for long periods. For example, if a major command or a wing fails to supplement instructions based on the publication of a significantly revised Air Force instruction, the troops may not receive the training requirements specified in the revised AFI. Viewed from this perspective, the actions of individuals are the end result of a chain of factors originating in other parts (often the upper echelons) of the organization. The problem is that these latent failures or conditions may lie undetected for a long time before they manifest as mishaps. 

In FY07 Class A Aviation mishaps, we continued to see the presence of Procedures, Publications and Training issues. Procedures and Publications will never be perfect. Competent members of the USAF and supporting agencies will continue to dedicate valuable time and effort to this guidance, but inadequate and/or absent guidance will continue to be a latent failure in our operations. The same holds true for training issues. The USAF dedicates a considerable amount of resources to training; however, there will continue to be areas that are not or cannot be covered in training. We are by no means suggesting that Procedures, Publications and Training are exempt from scrutiny as causes in a human-error chain. However, humans are involved in these programs and, as humans do, they will continually introduce latent conditions into the system as they work on Procedures, Publications and Training. 

What can you do? Errors in a complex system don't strike like a lightning bolt -- they develop gradually. As you focus on procedures, publications and training, look for the existence of latent conditions in your organization and fix them. Ask yourself why something is the way it is, and then continue to ask yourself why, until you develop a prudent answer. When you set out to change something, pay attention to what you want to remain unchanged. Change often has a domino effect and influences things we intended to remain unchanged. 

When an accident occurs, the human is the final link in the chain -- generally the final domino that allows latent conditions to culminate in an active failure. "Active failures" are the actions or inactions of operators that are believed to cause the mishap. These failures are related to the broader categories of Procedures, Publications and Training mentioned above, and also to preconditions that exist in a system. Preconditions can be related to the environment in which people operate or to how they function. Humans get tired, need to eat, have communication and learning issues, have physiological responses to environmental conditions, and have attention-management problems. Mishaps, whether flight or ground, continually involve some form of attention-management problem. We continually see problems like Channelized Attention, Task Oversaturation and Confusion; attention-management problems were cited in 50 percent of the Class A Aviation mishaps for FY07. You are a human, and these problems are part of the package. We will continue to have them, but we must also continue to combat them. When you attempt to delete a file on your computer and a popup window says, "Hey dummy, are you sure you want to delete this?" the designer is combating attention-management issues -- an attempt to build safeguards into the system to help you, the operator, function better. But what can you do? Understanding these issues and how they affect you and your environment can help you perform better and prevent human error. Self-analyze your operations and take action. Do you find yourself less attentive after lunch? Have someone double-check your work. Do you get overwhelmed in certain situations? Take a look at those operations and understand the circumstances that lead to your becoming overwhelmed. 
The bottom line is to understand the situations in which your attention-management limitations lead to errors -- and in most instances, these are Judgment and Decision-Making errors. 

The conditions described above weigh on your judgment and decision-making, generally the final link before an accident. Judgment and decision-making errors were present in 55 percent of FY07 Aviation Class A mishaps. Four-wheel private motor vehicle accidents generally result from the decision to speed, and two-wheel private motor vehicle accidents generally result from misjudgment when negotiating a turn. Be aware of operational risk management. The steps of ORM are not new to the human factors community, because they are general steps that all humans use when trying to reason through a situation. Whether you're walking to the dining facility, driving to work, carrying your toolbox on the flight line, or engaged in combat operations, you're constantly assessing the hazards in your environment and making decisions and judgments based on those hazards. Sometimes you make the wrong decision, leading to an error. The error may not cause an accident; you then reassess the situation based on this error and the current hazards, and make a new decision. The key to this process is to truly understand the hazards and errors that drive your decisions and judgments, and to manage them properly. 

To fully appreciate human factors is to comprehend how all these factors link up to create errors that lead to accidents. The Procedures, Publications and Training provided by an organization and managed by supervisors, coupled with the preconditions that exist in our environment and within us as humans, all link together when we make decisions and judgment calls. Understanding these relationships, and the hazards and errors present in your operation, can help you manage your performance and prevent an accident. 

Human-error management is a complex subject and cannot be fully described here, but this article can provide some key takeaways. (1) When thinking in terms of mishap prevention, we should ask not why the error occurred, but why it failed to be corrected. We should ask, "What did I do or not do that could have prevented this mishap?" (2) Rules are good. It's your responsibility to ensure, first and foremost, that rules are valid, that they are followed, and that people are held accountable for not following them. If a rule is not smart, then use the appropriate avenue to change it. (3) Go to http://afsafety.af.mil/SEF/SEFL_home.shtml and educate yourself on the DoD Human Factors Analysis and Classification System (DoD HFACS), the taxonomy we use to investigate the human factors present in mishaps. Remember: you are the Human Factor!