Navigating human error
Guy Hirst explains the processes industries like aviation and healthcare use to manage the risk of human error
My 34-year career as a professional pilot with British Airways (BA) was evenly split, first as a co-pilot and then as a captain. It had always puzzled me why some crews I was part of appeared to perform more effectively than other, equally qualified crews. In the late 1980s I was appointed as a training captain, a role that gave me the opportunity to observe, train, and assess other crews. It became glaringly obvious that crew performance was more than just the sum of the capabilities of the individual crew members: some teams of less naturally gifted individuals would deal with complex scenarios more effectively than crews made up of those with more natural talent.
At that time BA had made a groundbreaking decision to start running Crew Resource Management (CRM) courses for all its pilots and flight engineers. I was fortunate to be one of those selected to help develop such courses and deliver them to my peers. CRM courses have developed and become much more sophisticated over the last 30 years as research has helped us to understand human limitations. In the early years the training concentrated on helping crew members understand their interpersonal styles and adapt them appropriately. The industry is now experiencing the sixth generation of CRM, which focuses on threats to the operation and how crews must manage them to ensure safety. Current CRM courses also develop an understanding of the human-machine interface, as well as crew skills in leadership, co-operation, decision making, and situation awareness. Accident data indicates that approximately three quarters of aviation accidents are the result of human error rather than technical malfunction.1
Over the past 15 years I have broadened my interests and worked extensively on developing an understanding of human factors in other safety-critical industries, particularly healthcare. I have been honoured to work with many world experts in human fallibility. One such person is James Reason, Professor Emeritus of Psychology at Manchester University, who is widely recognised as the world's leading expert on human error. In his foreword to the excellent Patient Safety in Emergency Medicine,2 Reason explains that there is a paradox at the heart of the patient safety problem. Medical education, almost uniquely, is predicated on an assumption of trained 'perfectibility'. After a long, arduous, and expensive education, doctors are expected to get it right. But they are fallible human beings like the rest of us. For many of them, error equates to incompetence or worse, and mistakes may be stigmatised or ignored rather than seen as chances for learning. The other part of the paradox is that healthcare, by its very nature, is highly error-provoking. Knowing this, it is not unreasonable to substitute 'lawyer' for 'doctor' and 'law' for 'healthcare'. Lawyers, like doctors, are highly intelligent, well-educated people who have been successful from school days through university and law school to partner level in top law firms.
Many other high-risk industries have learned that trained perfectibility does not guarantee a safe culture. Aviation accidents, by their very nature, receive instant press attention, and several high-profile accidents seized the aviation regulators' attention some 30 or 40 years ago. Accidents were routinely tagged as being caused by 'human error' or 'pilot error'. The authorities finally decided that such a status quo was unsustainable, and research into understanding human error began.
Aviation is very fortunate in being able to employ high-fidelity simulation to aid research into crew behaviour in crisis situations, which is why a great deal of the development of human factors understanding has emanated from aviation. Although it would be unwise to attempt to replicate aviation-style human factors training wholesale in the medical or legal setting, it does seem sensible to use some of its principles to short-circuit the arduous path that aviation has followed.
Skills in leadership and teamworking
Law is probably more complex than many other fields of human endeavour, and legal situations are perhaps more idiosyncratic than those encountered on aircraft, ships, or in power stations. The critical similarity is that they all rely on teams of professionals working together, and effective communication is vital in environments that are often highly stressful. Although flight crews and lawyers are clearly very different, their work shows some similarities:
- They work in highly complex organisations;
- They lead multi-disciplinary teams;
- They operate in potentially stressful situations;
- They have to encompass new technology;
- They are required to perform management roles within their organisations;
- They have unique responsibilities for the wellbeing of their team and their passengers/clients; and
- They often have, and generally need, a 'can-do' mentality.
Errors are an inevitable outcome of the human cognitive system working within the complex and sometimes chaotic global business world. This is perhaps because the modern brain is the same brain that served our genetic predecessors - predecessors who had to hunt for the family to survive while evading the attentions of a multitude of predators. Little surprise, therefore, that without any 'firmware update', human cognitive adaptation has not been completely successful.
We still get the physical symptoms of fear that served the caveman in the 'fight or flight' decision:3 sweating to aid cooling, a raised heart rate for oxygenation, dilation of the pupils for increased vision, and the visceral response that lightens the body for increased speed. These same physical responses are alive and kicking in many clients by the time they need to seek a legal professional!
So why has the adaptation of human cognition been inadequate for the demands of modern life? Professor James Reason explains in his illuminating books that error can never be eliminated, but it can be managed.4 In simple terms there are two distinct cognitive processes: first, a conscious cognitive process, which is used when a task is novel; and second, an automatic cognitive process, used when a task has been practised and perfected, which occurs at a subconscious level. The salient point is that working memory is extremely limited in capacity. It also takes the brain considerably more effort to use working memory, so it is the least preferred option.
Same error but different causes
Conscious cognition uses the working (short-term) memory. The old adage is that working memory can hold only between two and seven independent facts for up to about 30 seconds. That being the case, when working memory is required because a task has not been perfected, the limited nature of the conscious cognition mechanism makes it prone to error - these errors are known as knowledge-based mistakes.
When a task has been perfected it becomes a skill, which is 'parked' in the long-term memory portion of our cognitive mechanism. When required, the skill is extracted subconsciously. The positive aspect of this skill-based behaviour is that it is automatic and does not draw on the very limited resources of the conscious mechanisms. However, errors in skill-based behaviour take the form of slips and lapses.
A wonderful everyday example of a task becoming a skill is car driving. Recall your first driving lesson: the task of balancing the clutch and the accelerator while also remembering to check the mirrors, release the handbrake, and turn the steering wheel was all-consuming, leaving almost no capacity to listen to the instructor's advice. Yet a few weeks after the driving test is passed, driving becomes virtually automatic and there is capacity to tune the radio, chat with passengers, and attend to other cognitive processes.
In The Multitasking Myth, Loukia Loukopoulos et al5 researched high-performing teams handling complexity in real-world operations. They identified four prototypical situations in which even experts were found to be more error-prone:
- Interruptions and distractions;
- Tasks required out of normal sequence;
- Unanticipated new tasks arising; and
- Interleaving multiple tasks.
In a busy law firm the four situations above are perhaps commonplace. This indicates that individual error can hardly be eliminated, but that with a team approach the inevitable errors may be trapped or their consequences mitigated.

Lazy brains?
The human brain has certain safety mechanisms that protect it from overload, and human attention mechanisms are fallible. Under conditions of extreme task focus, for example, the sense of hearing closes down; this is shortly followed by a condition known as 'black-holing' or tunnel vision, in which peripheral vision narrows while the person concentrates on a difficult task.
Other attention mechanisms also show vulnerability. Humans are not ideally suited to monitoring situations; we prefer to be actively involved. We attend to what interests us rather than to what we should be attending to - hence the fact that distractions are so error-provoking.
One of the positive aspects of human cognition is the brain's ability to match incoming information against patterns stored in the long-term memory; it is by that method that experts make decisions. But what happens when the incoming pattern is merely similar to a stored pattern? The brain then settles for a best fit or near fit - which may be the wrong one.
Many of you will be conversant with Daniel Kahneman's outstanding book Thinking, Fast and Slow.6 Kahneman explains, with outstanding examples and research evidence, the dual-process model. He argues that System 1 (intuitive) processes are fast, automatic, and unconscious, while System 2 (analytical) processes are slow, deliberative, and conscious. The neuroanatomical loci of the two systems are now known: System 1 processes are located in the older parts of the brain and, importantly, involve the amygdala and parts of the limbic system that process emotion, whereas System 2 processes sit in the newer part of the brain, the neocortex. Experiments have shown that the processing speeds of System 1 are twice as fast as those of System 2.7,8
A final example of the lazy brain is a phenomenon known as 'confirmation bias': a tendency for humans to favour information that confirms their initial hypothesis and, conversely, to ignore information that disaffirms it. Many of the disaffirming cues are rejected at a subconscious level. There are examples of confirmation bias in many different professions: police officers who make an early arrest and subsequently ignore contrary evidence; social workers who ignore or excuse injuries to children who have been designated 'not at risk'; and pilots who 'talk themselves' into making the real world fit their skewed mental picture.
When conscious and automatic processes collide
Learning a new task requires conscious processing, which is why trying to master a new skill can seem overwhelming; sometimes the demands of a task exceed the available capacity. Once the skill is mastered, it becomes automatic, requiring little cognitive capacity and allowing attention to be paid to other tasks. There are, however, four particular situations in which a degree of conscious control is still required:
- When a task is novel;
- When a task is perceived to be critical, difficult, or dangerous;
- When a habitual task needs overriding; and
- When prioritising among competing tasks.
It is crucial to understand that when any of these four situations occurs, a proportion of the working memory will be engaged in the conscious processing of the task, leaving less capacity for other demands.
Conclusion
Research into human cognition and its effect on human performance is gathering pace. One particular area of research is looking at prospective memory,9 that is, the memory that allows humans to return to a deferred or suspended task at a later time.
Generally, the tasks that have to be undertaken while awaiting a return to the deferred task take up so much of the attention mechanism and the working memory that there is little capacity left to remember to return to it; the deferred task must instead be retrieved from the long-term memory, and those retrieval processes seem rather fragile. It does appear, however, that a physical trigger for retrieval of a deferred task is somewhat more robust than a trigger that relies on a moment in time. Once again, as with all the limitations of human information processing, the way to reduce the potential for error-provoking situations is through effective team communication and the design of systems and protocols that allow for the inadequacies of human cognitive processes.
I have run a number of highly successful conferences on risk with a colleague from Great Ormond Street. All the talks from a myriad of outstanding speakers are available to view, free of charge, at www.risky-business.com.
Guy Hirst is a former pilot who has been instrumental in introducing human factors training to a variety of organisations, from British Airways to the Merchant Navy, and to Great Ormond Street and the John Radcliffe in conjunction with Oxford University.
References
1. Campbell RD and Bagshaw M. Human Performance and Limitations in Aviation (Oxford: Blackwell, 2002)