Common misunderstandings about the human element
As a human element expert and member of the IMarEST’s Human Element Working Group, Dr Malcolm Cook examines the changes taking place in maritime, dispelling myths about how people act and think in the context of preventing accidents and promoting safety.
The changes happening in maritime will present new problems and new challenges. Greater awareness of the role of accidents in the learning process, promoting safety culture and understanding how human operators respond to accidents will make for a safer transition.
Myth 1: People are to blame 80% of the time
With the drive towards autonomous shipping, I read articles saying 80% of accidents are caused by people, so reducing headcount will reduce accident rates. Wrong! 100% of accidents are people-related: it comes down to the user, the crew, the captain; to problems with the task; to designers not understanding the task; and to changes in the task.
Myth 2: The user is always at fault
If you make a poorly designed product, why are you surprised that people can’t use it effectively? Concluding that operators are incompetent isn't satisfactory. We need to understand how and why they were employed, how they were trained, and what procedures they followed. But most people don't understand how to analyse a mistake or error, or how to trace its influence on the outcome, so mistakes persist while interventions address the wrong problem.
The introduction of new technology often creates the conditions for accidents. Designers either fail to understand this or are complicit in accident development. And because development time is limited by cost, the critical evaluation of equipment under development is limited in time and effort. This is often worse in automated systems because developers argue the system doesn't or shouldn't require human input, so extensive training and changes to procedures are considered only superficially. Or training in a new system is rendered ineffective because the design team fails to appreciate the conditions of usage. Users can also fail to understand the conditions of usage, typically by switching off repeated nuisance alarms, which means the alarms don't operate when they are required and accidents occur.
Myth 3: Automated systems are safer
Do you think removing people from a system makes it more accurate and less prone to failure? You’re wrong! It may make the system more prone to failure and more likely to have problems, because when a person is removed from a system, part of the error-checking process is removed too.
It's very easy to say ‘let’s lean-man the numbers’, but supervisory arrangements are important. Most safety-critical systems have both an operator and someone who supervises, to avoid automation reliance and bias creeping in.
Stress and fatigue are big issues in aviation and maritime, so if you've got fewer people, you’ve got more fatigue and higher workloads. If the system collapses, the user is less able to respond and has poorer situation awareness, so is more likely to have an accident.
And recruitment gets harder when fewer people are on board. What's the chance of persuading someone who is superbly skilled to work on a ship in isolated conditions, for low wages, for six months of the year? Training requirements go up, meaning higher costs.
And what happens when there's a fire with fewer crew on board to put it out?
Myth 4: Collaboration is simpler now
The navigation process is intended to be a joint decision-making task, with the captain, executive officer and navigators sharing the task and agreeing on the safety of the course. But how is that done crowded around a small screen? People imagine that there is far more flexibility in electronic systems than there really is, but flexibility only comes when you understand how the system works and how people execute the task.
The difficulty of accessing new systems encourages reliance on a single operator to report and use the system. The ensuing lack of effective supervision scrutinising the routes increases the likelihood of an error persisting and having a consequence.
Myth 5: Training is an easy add-on
One of the really subtle effects of new technology is that it influences the pattern of new learning, so you may see different issues arising from different periods of technology use. Legacy operators may struggle to make the best use of the new technology, which is meant to be ‘easy to use’. New operators, familiar only with the new technology, may become dependent on it until they experience a significant failure.
When ECDIS systems were introduced, trainers didn't fully understand the task. And if you don't understand the task, you don't understand how to train it. If you don't understand how people make mistakes, or how the system can be used to warn them, operators have to discover it by accident.
Myth 6: It will never happen to us
The customer will often say their system will never fail. People think ‘it won’t happen to us’ and ‘we know what we’re doing’. But it will fail, so plan for it.
Dr Malcolm Cook is a member of the Human Element Working Group and has been combining forensic psychology with military human factors for 25 years.
Read the article on the Human element on autonomous vessels, which explores the common ground between the Maritime Autonomous Surface Ships Special Interest Group (MASSIG) and Human Element. To become a member of the Human Element Working Group, log in to My Account, click on My Special Interest Groups and tick the boxes of the SIGs you’d like to join. You can then also join the group on Nexus, our networking platform.
Dr Malcolm Cook was in conversation with Pamela Cahill.