Criminalizing an officer’s explicit error sends the wrong message

The Norwegian Navy frigate Helge Ingstad was declared a total loss after colliding with a tanker on November 8, 2018. But the biggest mistake was not related to the collision itself.

Knowingly, and in contradiction to the official findings of the Norwegian Safety Investigation Authority (NSIA), the Public Prosecutor dragged the frigate's officer of the watch into court.

Last week the officer received a two-month suspended prison sentence, even though the accident involving the 113,700 dwt tanker Sola TS (built 2017) has been shown to be the result of a complex combination of multiple causes.

This legal process appears to criminalize something very human – making an honest mistake. A side effect of criminalization is a growing reluctance to admit wrongdoing, and its impact may extend far beyond the Norwegian Navy to affect the maritime industry and society at large.

The result is a world with more failures and undetected errors that are likely to escalate into serious accidents, when it is surely the duty of the courts to make society safer and better.

It is typical in disasters that someone knew what was going wrong before the accident happened. Hence, the wisest thing is to lower the bar for expressing concerns and highlighting errors, so that problems can be addressed before it is too late.


We have to admit that we all make mistakes – it’s simply human, and we can’t make humans free from failure. But by acknowledging mistakes, we can address failures before things get worse. This makes it natural to share our mistakes and try to understand others’ mistakes, precisely because we can all learn from them. This mindset is the opposite of criminalizing honest mistakes.

There have been examples in naval investigations where watch officers watched a football game while on duty, deliberately performed risky maneuvers or were intoxicated. Such behavior can rightly be called negligent. But it is very different from what happened aboard the Helge Ingstad.

The first thing the convicted officer did when he reached the bridge was to question a bright light on the horizon and discuss it with the outgoing officer of the watch. The two men and two observers each concluded that the light came from a stationary object on land. Only the helmsman thought it was another ship, but he kept his suspicion to himself. Eight minutes later, the collision was a fact.

They tried their best. Unfortunately, they all failed.

We all tend to think that we are good at reporting errors and defects. But the truth is, we choose to remain silent when the stakes are high – and statistics show we are becoming less willing to speak up.

One reason for this tendency is that those who do express their concerns are not listened to. A prerequisite for listening to someone who speaks up is recognizing how difficult it is to do so. That requires openness about mistakes, trust, organizational maturity, will, and time. Criminalizing errors eliminates this openness.

The problems revealed in the NSIA investigation are not mitigated by the imprisonment of the officer of the watch. Instead of looking for someone to blame, we as an industry should ask ourselves what we can learn from this incident and from others.

It is not enough to understand why and how the accident occurred. Real learning means implementing interventions to ensure similar things don’t happen again.

However, the challenge of doing this across time zones, ships, shifts and national cultures must be acknowledged by all stakeholders, especially regulators and prosecuting authorities. Implementation requires confidence, alertness and openness.

Encouraging an open mind and learning from mistakes is the best way to handle failures and reduce the risk of accidents.

Torkel Soma is Chief Scientific Officer of the Oslo-based safety culture organization SAYFR
