I Just Don’t Know
I was talking with a friend recently who works in risk and safety. They shared a story about how a relatively serious incident had occurred at their company and, despite a very thorough and detailed examination of the events that led up to and followed the incident, the reason(s) that it occurred could not be found. It was a complete mystery as to why things happened the way that they did. My friend said to me “I just don’t know how this came about, it’s got me buggered”.
I remember reading in Daniel Kahneman’s Thinking, Fast and Slow his theory about how, if people don’t know the answer to a question asked of them, one option they can take is to provide an answer to a different question on a topic that they do know about. Politicians, through spin, are obvious examples of this, but Kahneman’s point is that this is something that we are all prone to do at times, particularly when there are social pressures in place that make it awkward not to provide an answer.
In your work in risk and safety, have you ever felt the pressure to provide an answer in a tight timeframe? Have you ever provided an answer that was plausible and ‘could be’ right, but that you weren’t quite sure about, because you didn’t have the time to think things through?
Have you ever made stuff up just to complete a report on time? Stuff that might have been right, sounded like it was right, and to others could be right, but to be honest, you couldn’t be sure? These may be challenging questions, but we can learn so much by reflecting on them and it can be useful to further examine the social arrangements and context in which we may make such decisions.
It can be a challenge in risk and safety at times to say “I just don’t know”. There are often social, cultural and organizational factors and expectations that ‘answers will be found’. Systematic and linear approaches to incident reviews are often silent on such factors and expectations; the focus is usually “Just Get to the Bottom of it”. When this is the focus, when our incident reviews are solely systematic reviews, do we limit opportunities for learning from the incident? Could it be that the more mechanistic our response, the less we ‘think’ and reflect humanly about what has happened?
So why is this? Why do we find it terribly difficult at times to say “I just don’t know”?
Let’s examine a case study to see if it can provide some clues. The case I’m thinking about occurred around October 2013. The short introduction to the story is that a truck (lorry) had an incident in Sydney’s Harbour Tunnel when the back part of the truck somehow lifted as it entered the tunnel. The incident caused chaos for the Sydney morning commute (more than one million drivers) and there were a lot of angry people that morning, none more so it seems than Duncan Gay, the Roads Minister in NSW. Mr Gay’s response to the incident was recorded on the nightly television news. In the news footage he vents his frustration by suggesting (about truck drivers) that “these idiots should put their brain into gear when they put their truck into gear”. I suspect Duncan doesn’t have an extended knowledge of how people make decisions and judgments (that’s a whole other story…).
But the point of this piece is not to analyse Mr Gay’s words or to particularly criticize them, it is to raise the question of the social context in which they were made. This was the report on the nightly television news on the day the incident occurred. While some ‘facts’ about the case would have been known; for example, footage of the incident would have been available, and the driver would have been spoken with (or to!), there is no way that you could understand the whole picture in this short period of time.
There would have been so many factors at play in the unconscious that only come about on reflection, but this is not the context in which Mr Gay was talking. Instead the context was a political one, where action and strong words are expected. Could you imagine the public outcry had Mr Gay’s response when questioned by a reporter been “I just don’t know”? Our social arrangements, context, cultural expectations and norms all play a part in our decisions and how we communicate. Sadly and disappointingly, such factors are not often the topic of conversation in risk and safety, an industry that seems fixated and wedded to a systems only approach to understanding events.
So if this resonates with you and instead of focusing solely on systems you are looking for a different approach, how can you go about things so you can get to better ‘know’ what happened? I wonder if these tips might be at all useful?
- Buy time – as my great mate James Ellis is well known for saying when talking about difficult or complex decisions, ‘buy time’ (colloquially known as an ‘Ellis-ism’). Can you negotiate for the report to be provided at a later date? Why is it you need to report now? Is this about you being perceived as more ‘expert’ because you found solutions sooner? I know I’ve felt like this at times. In what ways can you ‘buy time’?
- Create space for thinking – I’ve often wondered why we create deadlines for ourselves when developing procedures and policies for reporting incidents. Sure, there are often legislative timeframes for reporting incidents, but they are usually only about ‘notification’ rather than ‘close out’. How do you create an environment for more thinking time?
- Get messy and go crazy – no, I don’t mean start running around the office and shouting at everyone, I mean consider ditching the linear charts and forms. Start your investigation with a large blank piece of paper and concept map your ideas, feelings, findings. Focus not just on events and objects, but people and relationships. By messy, I mean don’t worry if things look ‘all over the place’ and you backtrack over your ideas; that’s probably a sign that you’re on the right track. After all, that’s how life is, isn’t it?
Sidney Dekker in his book Drift into Failure, when referring to the Columbia Accident Investigation Board’s investigation into NASA after the Columbia space shuttle accident, notes that it was “not going down and in – it went up and out” (p.62). How can we go more ‘up and out’ in our investigations?
How would you feel going to your leader and admitting, “I just don’t know”? What would the response be? How would others feel? What are the social arrangements and contexts associated with this? Why do they matter?
How do you deal with the social pressure to just get to the bottom of things and find time to provide answers when they are known, not when they are expected?
We’d love to hear your thoughts, experiences and comments.
Author: Robert Sams
Phone: 0424 037 112
Facebook: Follow Dolphyn on Facebook