“Everything in aviation we know because someone somewhere died… We have purchased, at great cost, lessons literally bought with blood… We cannot have the moral failure of forgetting these lessons and have to relearn them.”
Sully Sullenberger
Pilot of Flight 1549, ‘The Miracle on the Hudson’
All frontline healthcare warriors will bear scars from emotionally distressing experiences in the workplace (e.g. major incidents with multiple casualties, unsuccessful paediatric resuscitations, personal mistakes resulting in patient harm). For the most part, members of the public will only rehearse being exposed to these flavours of horror by watching movies or having nightmares. For us, it is a potential reality every shift.
In the aftermath, the way one processes these events heavily influences future commitment to similar causes and cognitive appraisals (challenge vs threat mindset) – the key determinants of mental toughness.
Adaptive processing should incorporate ‘Black Box Thinking’ and self-compassion.
‘Black Box Thinking’
Consider aviation versus healthcare for a moment – arguably the two most safety-critical industries in the world.
On average, just one commercial flight goes down for every 8.3 million take-offs worldwide. In the US alone, there are approximately 400,000 deaths due to avoidable medical error every year – more than a thousand a day, or the equivalent of two jumbo jet crashes every day [1, 2]. That is a gargantuan discrepancy in passenger versus patient safety.
Of course, it is well documented that the two industries are not directly comparable. There are far more reasons for a patient to die than there are varieties of plane crash, and medics do not yet have the option to switch on a mental bandwidth-sparing machine that’s able to mop up routine tasks. Nonetheless, the statistics illustrate an indisputable point – we have a huge amount to learn from our aviation counterparts, whether we like it or not.
Why is aviation such a staggeringly high-performance industry? The answer is simple: there is an institutional culture of learning from failure. Every plane is equipped with two sturdy black boxes which record cockpit conversation and electronic decision-making (i.e. which buttons were pushed). In the case of an accident, the black boxes are promptly retrieved from the battered fuselage, opened, and the data they contain interrogated. Every aspect of the crash gets the fine-tooth-comb treatment to identify exactly what went wrong. Protocols are subsequently modified so the same mistake can never happen again. Error is not viewed as a sign of weakness or inadequacy – on the contrary, it is treated as a precious (even exciting) learning opportunity for everyone who might benefit.
Healthcare culture is largely the polar opposite. Failure is stigmatised because doctors are supposed to be infallible in the eyes of the public. Mistakes get ‘swept under the carpet’ by the guilty to avoid being held accountable, and where that is not possible, the blame game ensues [3]. When one’s professional credibility is at stake, a successful escape from the situation sits higher up the priority list than learning from the failure; and the omnipresent threat of litigation only serves to further entrench this defensive, maladaptive institutional culture. The immediate gratification of reputation-preservation trumps the potential for professional growth that naturally follows acknowledgement of personal failure. We routinely blind ourselves to the best possible signposting for getting better at our jobs – our mistakes.
Whilst this growth-stunting phenomenon will vary in severity across the spectrum of healthcare environments, you would be hard-pressed to find a doctor, anywhere in the world, who is not regularly exposed to this embarrassing peculiarity of our profession.
Be a black box thinker. Own your mistakes. Share your lessons. Interrogate every performance with the curiosity and tenacity of the Air Accidents Investigation Branch. Re-conceptualise your relationship with failure so that it no longer represents an existential threat, but acts as a guide for your ‘practice’ phase.
‘Reflective practice’ is an overused and misunderstood term in medical training (in my opinion). Often, written evidence of it is a requirement for career progression, and when one ‘reflects’ for that reason alone, it ceases to be useful. Furthermore, documented reflections too frequently centre around what went well – a far less fruitful training exercise.
Apply the black box philosophy to your reflective practice and force yourself to face potentially ugly truths. Embrace being criticised and never back down from asking a ‘stupid question’ – it tees you up for focused training and subsequent accelerated improvement. Have the bravery to be the detective leading the warts-and-all investigation on yourself.
Self-Compassion
In frontline healthcare, we are routinely exposed to life-changing injury and acute illness. If we take our workplace goggles off, and dare to view the worst aspects of our jobs through the eyes of a ‘normal’ person, it can be intensely disturbing. Furthermore, subscribing to the highest professional standards can make us prone to gratuitous suffering as we’ll mistakenly convince ourselves that we could have done more for unsalvageable patients. Our keenness to take full responsibility can render us vulnerable to unnecessary self-punishment.
Without appropriate perspective and personal support, our view of the world, and indeed of ourselves, can become warped. Long-term self-neglect in our line of work will eat away at our commitment to the job, potentially invite lasting psychological damage (post-traumatic stress disorder; PTSD), and ultimately, harm our patients.
When a particularly traumatising incident occurs, many institutions will employ a ‘critical incident stress management’ (CISM) protocol, which encompasses a range of supportive interventions aimed at preventing PTSD [4]. This includes a formal group debrief, led by an outside party (usually a psychologist), within 72 hours of the event. Despite being widely practised, this approach is controversial, as no definitive benefit has been demonstrated in the literature. However, a ‘defusion’ process is widely accepted to be of critical importance for psychological wellbeing in the immediate aftermath of an emotionally traumatising incident [4, 5, 6].
‘Defusion’ is a team get-together where thoughts and feelings are shared in confidence. When threat appraisals drench our brains in cortisol and distort our perceptions, defusion allows for piecing together the chronology and specifics of the event through organic, informal discussion with team-mates. It is an opportunity for emotional support, having a collective laugh/cry at the absurdity of the job, and an accurate information gathering exercise in a safe environment. The team pull together in the aftermath, are honest about their emotional frailties, and find strength in each other. It lacks the rigidity and intrusion of an uninvited formal debrief led by an ‘outsider’.
Pain shared = pain divided
Joy shared = joy multiplied [7]
In the hospital setting, it can be as simple as insisting on a chat in the coffee room after a big resus, or a quick get-together after work. It might seem minor, but unnecessary guilt, anger, confusion and other damaging emotions can be thwarted by this process. However informal and insignificant it might appear on the surface, it is of fundamental importance, and must be sought out, no matter how logistically difficult.
In more extreme environments, such as combat or the prehospital setting, sitting down to defuse should also be used as an opportunity to regain a feeling of physical safety, get warm, hydrate and refuel (eat something).
Self-compassion via defusion is a critical strategy for building mental toughness. Taking care of yourself and your team after an acute insult preserves commitment to the job, and prevents lasting psychological scars that will render you less able to cope emotionally with the inevitable acute stress that lies in wait.
Summary
Use mistakes as signposts for self-advancement as opposed to sources of embarrassment. Own your failures instead of hiding them, and use them to guide your ‘practice’ phase.
Always remember to ‘defuse’ with your team after emotionally challenging cases/incidents. Share the pain, and multiply the joy. Never underestimate the therapeutic value, and heavy dose of perspective, that humour offers.
‘My Mental Toughness Manifesto’ Roundup
You are mentally tough if able to state the following (Part 1):
“I am 100% committed”
“I feel challenged”
To build and maintain mental toughness, I propose seven strategies over three phases of the game:
‘Practice’ (Part 2)
- Immersion
- Deliberate Practice
- Visualisation
‘Perform’ (Part 3)
- Tactical Breathing
- Cognitive Reframing
‘Process’ (Part 4)
- ‘Black Box Thinking’
- Self-compassion
Own your performance.
Robert Lloyd
@PonderingEM
References
- Syed, M. Black Box Thinking.
- Syed, M. Creating a high-performance revolution in healthcare. 2017 Royal Society of Medicine Easter Lecture.
- Lloyd, R. What do Emergency Medicine and Donald J. Trump have in common? EMJ Blog.
- Mental health response to disasters and other critical incidents. BMJ Best Practice.
- Debriefing and Defusing. http://www.davellen.com/page21.htm
- Liebig, A. Shoes, Sex and Secrets: Stress in EMS. SMACC Chicago lecture.
- Grossman, D. On Combat: The Psychology and Physiology of Deadly Conflict in War and in Peace. Warrior Science Publications, 2008.
Pondering EM says
*comment emailed to Pondering EM by Dr. Chris Turner (EM Consultant, UHCW NHS Trust)
Great post – really enjoyed it.
I’d like to add that learning does not have to be paid for in blood – if we were to do two things.
Firstly, we should learn from the near misses – unfortunately, most organisations ignore the “no harm” box in DATIX and just acknowledge these incidents without investigating. This means that near misses are rarely looked at. We have become consequentialists, and this is to our detriment. Imagine if we had found a way to inform a colleague of their poor handwriting before they wrote so illegibly that a drug error ensued.
My second suggestion is that we learn to respect complexity. When we look at the last step in a long process and hold the people at that point responsible, we oversimplify error in healthcare to the point at which we lose the real learning. The impact of environment, and in particular of behaviour, tends to be marginalised, when the evidence shows us that behaviour has a huge impact (see Riskin and Erez, or anything by Christine Porath).
Healthcare is similar in many ways to aviation but also very different – it’s like we swap the crew, the airplane and the positions of all the switches each time we fly. As a general rule, people save the day in healthcare (as well as making the errors), and process, though important, is not the panacea that it can be in aviation.
People don’t deliberately not learn from error in healthcare – it is just that a) the consequences of owning up to error can be extreme, and b) the cognitive dissonance that we suffer when we make the big errors can take years to overcome.
Governance as we currently do it in the NHS is broken. We need to think long and hard as to how we might fix it.
Thanks for taking the time to read a relatively long reply – I don’t think this is the type of issue for short posts.
Chris Turner @orangedis
Pondering EM says
Hi Chris, thank you for your fantastic comment – I completely agree with all of your points!
Michael Hall says
Fantastic series of posts, perfect level of depth and detail.
Totally agree with Chris’s comment about failing to use the “near misses”. Most hospital classification systems look only at “Actual Harms”. The patient getting the wrong discharge letter gets classified much, much higher than the accidental administration of a life-threatening drug that doesn’t cause an issue, or the “almost”… If two planes almost crash, but avoid each other at the last minute, it’s a full enquiry. If two patients almost die, but suffer no harm, there’s not much discussion.
Love the concept of embracing and almost “enjoying” mistakes for the opportunities they offer. Something I’ve used in this scenario is this line:
“In the end, the ONLY question you have to ask is: if the same situation happens again, what would you do?” … Sometimes the answer is EXACTLY the same thing as you did the first time.
Just because something went wrong, does NOT mean that a mistake was made.
Congratulations on a great contribution to FOAMed. Look forward to hearing more in the future.
Mike Hall – The Canberra Hospital.
Pondering EM says
Hi Mike, many thanks for your kind words. Love your point that bad outcomes are not always the result of mistakes. We need to strike a healthy balance between black box thinking and self-compassion. For that, mentorship is crucial. Even for bosses. Thanks again. Rob.