Why do we still experience serious incidents when doing standard operations? Because we forget the process and only look at the results!
Why do seafarers still get injured during mooring operations? Why do seafarers still suffocate when entering enclosed spaces? Why do we still experience serious incidents when doing standard operations? Why can’t we seem to get rid of these recurring accidents?
Some time ago Green-Jakobsen was asked to join a vessel to carry out an accident investigation after a very serious accident. In this article, viewed from a slightly philosophical angle, we share some of our post-investigation reflections on how we humans are misled in our risk perception.
On a lovely May evening in a Northern European port, a gas carrier crew were engaged in a tug boat and mooring operation. The conditions for this type of operation were perfect, and the crew were well trained and experienced. Nevertheless, before the vessel was properly moored, a Filipino rating had the lower part of his left leg ripped off by the tug boat’s messenger line. What went wrong is always the question that follows, but more importantly: how can maritime leaders (officers and managers) in the future recognise mindsets, behaviour or attitudes that need to be corrected before they lead to an accident?
When Green-Jakobsen carries out a Company Safety Maturity Assessment, a safety behaviour survey is conducted, assessing the respondents’ perception of their own and their colleagues’ safety behaviour. One of the main and recurring survey conclusions is that respondents predominantly rate their own performance as better than, or at least equivalent to, that of their colleagues. In other words, the respondents believe that ‘my colleagues could do better, but I’m OK’.
Similarly, a Danish business newspaper recently reported that research carried out in this area supports this conclusion: in the studies cited, groups as different as American high school students and American university teachers overrated their own performance relative to their peers.
Why do human beings see things this way? Some argue that we assess ourselves this way to avoid getting depressed, and that the poorer our performance is, the more we highlight our own infallibility. Others argue that human beings are subject to antecedents that serve as triggers for observable behaviour; the consequences of this behaviour either reinforce or discourage the repetition of certain behaviour patterns. If the way we do things has a positive result (‘I wasn’t hurt during work’), we tend to repeat this specific behaviour the next time we are exposed to a similar situation. For example, if the way we conducted ourselves during a mooring operation didn’t have any negative consequences, we tend to believe that we did the job safely.
The consequence of this mindset and behaviour is that we focus on the result and forget to reflect on the process. The belief is that if no one got hurt during a work task, then the job must have been done safely. In other words, we forget the words of the German philosopher and mathematician Edmund Husserl: ‘Experience is subject to assumptions and biases. So experience by itself is not science’. Borrowing Husserl’s words and relating them to the content of this article, it could be expressed as follows: we may never have experienced a serious incident before, but this doesn’t mean that the way we conduct ourselves is safe!
Seen from a safety perspective, the result of this human mindset is that it often leads to behavioural complacency, perceptual and cognitive biases, and self-satisfaction. Or as one Captain once said: ‘I must be better than my colleagues and be doing a safe job! Just look at my safety record.’ But a perception like this is where the trouble starts, and it is the result of a cognitive bias. Maybe the Captain has just been ‘safe by accident’? A performance evaluation based only on the result, and not on how the process towards the result was managed, can become the downfall of many a good human being.
Cognitive biases are automatic and unconscious. They shape how human beings select and process information, and they subsequently direct our safety behaviour. At critical decision points, cognitive bias can be disastrous: while any single decision may be insignificant by itself, a series of small decisions can create a path to disaster.
On that May evening when the Filipino rating had his leg torn off, the accident investigation among the crew clearly indicated that the mooring and tug boat operation was not perceived as a task demanding thorough risk management (read: ‘We have performed this task so many times that there is no need to discuss the process, i.e. risk management, beforehand’).
Although the tug boat’s conduct was a major contributing factor, there were numerous indications that crew behaviour and mindset were controlled by earlier experiences and results rather than by constant reflection on the process. The fact that a rating had his leg ripped off clearly shows that the most important hazard-mitigating mindset and behaviour is constant articulation, communication and evaluation of the working process. The result is not irrelevant, but evaluating the process towards the goal is more important.
The American philosopher John Dewey once claimed that ‘we (human beings) only think when we are confronted with problems’. However, when dealing with the safety of human beings, potential problems have to be identified before they reveal themselves as accidents, and the only way to do this is constant, dynamic and ongoing evaluation of the processes. Not getting hurt doesn’t necessarily mean that we have done a good job, and to drive employee safety mindset and behaviour in this direction, it is particularly important that (safety) leaders (officers and managers) act as role models of this belief.
Safety leadership is subject to the same thinking traps that affect all human beings: we tend to make inaccurate judgments about future probabilities. In a complex world, cognitive biases allow us to establish shortcuts that simplify decision making, make our world more predictable, and absorb new information in a way consistent with what we already know. But an oversimplification of ‘what is’ can be disastrous. Strong safety leaders fight bias and understand that an incident-free working process is not necessarily a safe working process. Evaluating the process, and making corrections where needed, is more important than the result.
Understanding and acknowledging cognitive bias is important leadership knowledge. Strong safety leaders (and of course other employees too) understand this risk and fight it. They put this knowledge to use when weighing important safety issues, and they monitor themselves and others. Strong and effective safety leaders are constantly in a mode of evaluation. Leaders who monitor their own thinking for the effects of bias, and who enlist others in the effort to check for bias, can improve the quality of safety decisions and the outcomes they produce. While understanding cognitive bias will not change every decision they make, knowledge of its effects can improve the decision-making process.
Could the Filipino rating who suffered the serious injury have avoided the accident? It would be very wrong to draw that conclusion. But because seafarers still suffer serious injuries during standard operations on a regular basis, the approach to standard operations has to change. The fact is that the crew members on board practised only a superficial evaluation of a very dangerous work process, and this is a problem. Good safety statistics (KPIs) don’t mean that you are working safely. Maybe you have just been safe by accident.
So in the future: remember to evaluate the process, and forget about the statistical result.