
The Day It Almost Went Wrong: Near-Misses from a Human Factors Perspective

In aviation, the most valuable safety lessons often come from the “almost” events. Near-misses and safety hazards are not just technical details in a report. The days when things almost went wrong often reveal how Human Factors influence daily operations.

It was a day like any other. The pilots had flown the route countless times. The maintenance tasks had been performed according to procedure. From an operational standpoint, nothing seemed out of the ordinary.


But that day, something happened that could have ended in disaster.


Pilots, technicians, and cabin crew encounter near-misses and safety hazards every day. That is why we have Safety Management Systems (SMS), encourage continuous reporting, and adhere to quality assurance procedures. We want to discover the symptoms and capture the signals before situations escalate.


In that respect, Human Factors insights may help us understand why these situations occur in the first place. These insights do not point towards individual blame or failure, but rather towards how human performance is shaped by workload, disruptions, pressure, complacency, and systemic conditions.


When we analyse near-misses and safety hazards through the lens of human factors and SMS, we can begin to understand the complexity and pitfalls of human-system interactions.


Read more: Is Your SMS Reactive, Proactive, or Predictive?


Small Deviations with Significant Impact

In aviation, we expect people to be well-trained and competent. Aside from being a regulatory requirement, this expectation aligns well with the industry’s emphasis on safety.


The procedures are in place. Best practices are being followed. People want to do a good job. All of this is true. Yet performance is constantly put to the test by operational pressure, natural human limitations, cultural norms, and resource constraints.


These factors remind us that “successful” safety is not simply the absence of error. Even the smallest deviation can trigger significant consequences within the system. Near-misses are examples of deviations that may reveal gaps between procedure and practice. This is where investigation becomes critical. The point is not that “someone made a mistake”, and the goal is not to blame people. We want to investigate the event itself in order to understand how people and procedures sometimes come into conflict with one another. This understanding can help us mitigate similar events in the future.


Human error is not a root cause in itself, but it must be considered whenever people are involved. Near-misses and safety hazards are not isolated events. They are indicators of how resilient or fragile the operational system truly is.


Read more: Beyond Blame: Understanding Human Error in Aviation

The Philosophy of Continuous Improvement

“Continuous improvement is not about the things you do well – that’s work. Continuous improvement is about removing the things that get in the way of your work. The headaches, the things that slow you down, that’s what continuous improvement is all about.”
– Bruce Hamilton


So, what can we do to reduce the number of days when things almost go wrong? In aviation, a zero-incident record is unattainable. As long as people are involved, there is always a possibility of human error.



Instead, airlines, MROs, and OEMs should strive for continuous improvement rather than a spotless safety record. When we dedicate resources to reporting and analysing near-misses and safety hazards, we can learn from them. In other words, this process provides the organisation with unique insight into the systemic framework under which it operates.


During that process, the following questions may come up:


  • Did the procedures reflect reality?
  • Was there pressure to get the job done?
  • Were the roles and responsibilities clearly defined?
  • Were people encouraged to speak up?

The true measure of an organisation’s safety culture and approach to human factors is not whether near-misses occur. They always will. The real measure is whether events are reported without hesitation or fear of blame, whether feedback is encouraged, and whether the findings are used constructively and proactively to improve the organisation.


The day it almost went wrong may never appear in an official incident report – though an open reporting culture makes it far more likely that it will.


If the safety culture is mature and just, the organisation will not perceive near-misses and hazards as signs of failure. Instead, they are indicators of how human performance is shaped by systemic conditions – conditions which can always be improved.


Read more: The Dirty Dozen: When Routine Takes a Wrong Turn
