Beyond Blame: Understanding Human Error in Aviation
Accidents involving aircraft occur due to a complex web of factors. While “human error” is frequently cited as a contributing element, it is rarely – if ever – the root cause of accidents. Rather, human error is the outcome of a longer chain of systemic, procedural, environmental, and human factors.
To truly improve safety, we must shift the conversation from blame to understanding why human errors occur and how we can better support human performance.
Man-made Procedures & Technologies
The aviation industry relies heavily on procedures, checklists, technologies, and systems in daily operations, all of which are conceived, built, and operated by humans. While humans possess incredible strengths and knowledge, we are also bound by our natural limitations. Pilots, cabin crew, and maintenance engineers interact with man-made procedures, systems, and technologies all the time, including when they are under pressure, fatigued, or afraid to speak up, whether because they lack assertiveness or because they do not feel psychologically safe.
Therefore, we must train them to understand the negative impact these factors can have on safety.
The Dirty Dozen: Paving the Way for Human Errors
The Dirty Dozen refers to the twelve most common factors which can create the conditions for errors to occur. In other words, they constitute potential precursors of human error.
The Dirty Dozen include:
Lack of communication
Complacency
Lack of knowledge
Distractions
Lack of teamwork
Fatigue
Lack of resources
Pressure
Lack of assertiveness
Stress
Lack of awareness
Norms
Each factor poses a risk to safety if aviation professionals fail to identify, recognise, and mitigate it.
Root Cause or Symptom?
It is easy to default to “human error” as the leading cause of an accident. In reality, human error is a symptom of flaws or vulnerabilities within an existing system. If we attribute accidents to “human error” without digging deeper, we miss the opportunity to learn and adapt.
Why did the error occur? What conditions led to the error? What can we do to prevent similar errors from happening in the future?
Embrace Human-Centred Learning
The emergence of Crew Resource Management (CRM) and Human Factors training over the past five decades has fundamentally transformed the industry’s understanding of human error. Learning programmes that reflect scenario-based and human-centred approaches move beyond ‘blaming the individual’. Instead, the training emphasises how flawed systems, communication gaps, and team dynamics contribute to errors. At the same time, these programmes enable pilots, cabin crew, and maintenance staff to recognise safety risks, manage human factors more effectively, speak up, and make safer decisions under pressure. The more realistic the training environment, the better it prepares people for real life.
CRM and Human Factors training helps foster a more robust and proactive safety culture, with an emphasis on identifying, mitigating, and recovering from threats and errors before they escalate. However, we can never eliminate human error in a human-centred industry.
Human Factors: Tales of Fatality
Historically, there have been several accidents in which human factors were a major contributor to error.
The Tenerife Airport Disaster in 1977 is perhaps the best-known, and the deadliest, accident in aviation history. On a foggy day in March, two Boeing 747s collided on the ground, killing a total of 583 people. The subsequent investigation concluded that miscommunication and intense time pressure were contributing factors. As a result, CRM became a fundamental part of training, emphasising the importance of communication, crew assertiveness, and mutual cross-checking.
Another tragic example of an accident involving human factors was Air France Flight 447, which crashed into the Atlantic Ocean in 2009, killing all 228 people on board. The plane’s pitot tubes became blocked by ice, causing the autopilot to disengage and the airspeed measurements to become unreliable. The crew misunderstood the situation and made incorrect decisions. Key factors in the accident were miscommunication, lack of situational awareness, and lack of crew coordination, emphasising the need for practical, hands-on training and the reinforcement of crew resource management principles.
No Blame in a Just Culture
Human error is not simply ‘one person’s fault’. That is a crucial lesson in human factors. A just culture – as opposed to a blame culture – enables us to recognise the factors that contribute to error and intervene before the error escalates. As Sidney Dekker’s critique of the Bad Apple Theory suggests, safety is not a matter of removing the one bad apple; it is about examining the barrel itself, without pointing fingers or imposing punishment.
Aviation professionals must be trained to understand, address, and mitigate the human conditions that may lead to error. When people understand the why behind errors, they are more likely to reflect on their own behaviours and attitudes in the interest of safety.