How does collective mindfulness apply to workers compensation? How do our expectations get us into trouble? Oh, and what is the meaning of life?
I’m not so sure which I enjoy more anymore: being right or being wrong. My family, who are accustomed to my insatiable appetite for being right, are (I expect) rolling on the ground laughing at this point. Obviously there’s something very satisfying about having our intuitive beliefs (especially the marginalised ones) confirmed by good evidence. That smugness we feel when our gut feeling is supported by someone we respect or by a credible source of evidence. Paradoxically, since I went back to ‘school’ this year, I’ve been getting quite used to the warm, visceral and stimulating feeling induced when my long-held assumptions are turned upside down. I have to say, the stimulation of fallibility has been completely unexpected.
So why the warm fuzzy feeling when I’m wrong?
Maybe it’s attributable to the liberation I have felt in beginning to truly understand how little I actually know. And that that is ok. That being wrong is a portal to my holy grail – the meaning of life – that great source of stimulation infinitely available to the arational mind that drives our decision making… our capacity to learn stuff.
So now that I’ve so succinctly provided the answer to the meaning of life, let me turn my attention to a more difficult (some would say wicked) problem… the complex world of workers compensation. I’m afraid I’ll need more than two paragraphs, and I’ll certainly need more than one blog post.
Weick and Sutcliffe’s book, Managing the Unexpected, has been a real eye opener. Their research has focussed on how organisations with arguably the highest exposure to risk (e.g. nuclear power plants, aircraft carriers, firefighting crews) organise themselves. Weick calls them High Reliability Organisations (HROs). The book’s title gives us the immediate insight that unpredictability is ok. No amount of strategic planning can eliminate the unexpected so, rather than pursuing the illusion of control, HROs set about bracing themselves for, and managing, the unexpected.
They describe a suite of five characteristics that they call collective mindfulness:
- Preoccupation with failure… continually asking “what could go wrong?”
- Resistance to our tendency towards oversimplification
- Ongoing attention to what’s going on at the front line
- Recognition that those in authority are not always the best source of expertise; and finally
- Resilience, or being mindful of errors and adverse events that have already occurred
To properly appreciate how we typically ‘manage the unexpected’ it’s important to understand confirmation bias. It’s crucial to understand how much we like to be right so we can appreciate the value of being constantly alert to the possibility of being wrong. Confirmation bias tells us that:
we actively seek out evidence that confirms our expectations and avoid evidence that disconfirms them (Weick and Sutcliffe, 2007, p. 93)
Think about this in the context of having an argument with your partner, a work colleague or a friend. How easily and freely do we rationalise away evidence that deviates from our point of view?
So how does collective mindfulness sit with our workers compensation system(s)?
Do we expect mistakes?
How does this ‘system’ cope with deviations from the expected pathway?
Confirmation bias weighs heavily on planning, and in workers compensation we do a lot of planning. We have psychosocial assessment plans, return to work plans and injury management plans. In our businesses we do strategic plans, and our business coaches provide pithy advice like ‘failing to plan is planning to fail’. In workers compensation there is pressure to quickly settle on a diagnosis, usually with all eyes on the inescapable question of attributability to work. Diagnosis is quickly followed by a treatment plan.
The thing about planning is it creates expectations.
Once we develop a plan we have a strong bias towards expecting that plan to pan out. Mintzberg calls this the “fallacy of predetermination” (Mintzberg, 1994). When we make plans we reinforce our expectations and this reduces our tendency to notice deviations from the plan. Developing plans and procedures assumes that reliability of actions leads to reliable outcomes and the problem with this assumption is that it desensitises us to unexpected events.
One of my team (a physiotherapist) recently assessed Barry, who had knee pain. The clinical examination suggested a meniscus tear, so Barry’s treating doctor sought an MRI. The MRI showed no tear. So now we had a dilemma. Could the MRI be wrong? Was the clinician who diagnosed the tear wrong? The insurer, whose default plan was to decline liability, rigorously argued in favour of its own confirmation bias, seeking to decline based on the MRI. They argued the MRI was proof of no pathology, despite Barry’s pain and discomfort. A referral to an orthopaedic surgeon was made, and approval granted reluctantly under pressure from an assertive employer. The surgeon reviewed the MRI and made his own clinical diagnosis, agreeing that a meniscus tear was the most likely explanation… but his diagnosis and opinion weren’t unequivocal. In short, he made room for being wrong. Without a definitive conclusion, the recommended arthroscopy was questioned and approval hard fought. Interestingly, at arthroscopy, the surgeon found the meniscus to be fine, but a small (offending) bony fragment beneath the kneecap was found and removed. Consider this example from the perspective of Barry, his physio, the insurer, the surgeon and the employer.
Clearly none of them had enough information to reach a definitive conclusion.
How sure was Barry that he wanted an arthroscopy? How sure was the insurer that the claim should be declined? How invested was the physio in her clinical diagnosis? And how quickly did each develop a plan that they locked into pursuing?
In the end, to some extent, the clinicians and the insurer were all wrong at various stages. Ultimately, however, the worker was relieved of his pain and his function was restored. Do you think that gave them a warm fuzzy feeling? Or did the expectation that their expertise should have provided a more definitive answer, early on, leave them feeling uncomfortable?
Collective mindfulness doesn’t suggest we shouldn’t make plans, just that we should be wary of them. Collective mindfulness gives us the freedom to be wrong. In fact it encourages us to vigilantly consider how wrong we might be in the face of the pressure our plans exert on us to forge ahead… despite subtle signs we could be going the wrong way.
Think about that the next time you’re arguing with your partner about driving directions.
Planning can easily become an effort to control the unknown, but we know so little that the unexpected is inevitable. Our fallibility is inevitable.
I’ve taken to embracing my potential wrongness more readily of late. It’s very liberating. I’ve always been uneasy about planning because of the implicit pressure to weigh an array of factors I have no expertise in. Embracing our potential wrongness is liberating because it frees us up to learn stuff.
Of course however, I could be wrong.
Mintzberg, H. (1994) The Rise and Fall of Strategic Planning. New York: Free Press.
Weick, K. and Sutcliffe, K. (2007) Managing the Unexpected: Resilient Performance in an Age of Uncertainty, 2nd edition. John Wiley & Sons.
Graduate Certificate in the Psychology of Risk (Social Psychology) at ACU… 6 months in and boring my family to distraction.