Originally posted on January 19, 2015 @ 7:33 AM
Could understanding Grey be the 'silver bullet'?
Since I started writing on Safetyrisk.net in November 2013, I've received lots of great feedback: thoughts, questions and even criticisms about my posts, all of which I appreciate.
A key theme of the feedback I receive is that some of the concepts are difficult to fully comprehend, because what I write about is not always black or white (what we might call binary thinking). That is, it can be ambiguous, grey and messy. People are often looking for a right or wrong way to do things, for the 'method' that works best, for the approach that 'will sort things'. If you like, we seem to be on a mission for the elusive 'silver bullet'.
When we get our heads around the fact that the world, and all of us (people, that is) who inhabit it, are complex, not simple to program and not motivated by control, it can free us up to stop looking for the perfect solution and instead focus on better understanding people and how we can better discern risk.
As I wrote in my last piece, all organisations and people have to deal with, and make sense of, risk, equivocality and subjectivity, despite a desire for certainty, clarity and objectivity. This is at the heart of the challenge for those of us in risk and safety. We desperately seek clarity and objective answers, views and processes, yet the world is full of grey, messiness and bias. It can be frustrating when things are a constant wash of grey, especially when we seek clarity. So what does black and white look like in risk and safety?
There are many examples where right and wrong (binary) thinking is evident every day in risk and safety. Here are a couple; I'm sure you can think of many more:
- The wearing of PPE – while working with a team this week on the ground with nothing overhead, the safety guy came by and said “C’mon fella’s, you know the rules, hard hats on, it doesn’t matter whether anything can fall or not”. Everyone did what they were told!
- Risk assessment – we must reach consensus on the risk score; we can't have different views about the same risk, so we keep talking until we all agree.
So, if binary thinking is not always that helpful (in fact, I argue it can be very dangerous), what can we do about this?
I wonder whether instead of thinking in a black and white, right or wrong way, we could learn something from the work of Karl Weick whose focus is on understanding ‘trade-offs’ and ‘by-products’, rather than the simplistic black and white? I thought it would be useful to explore these briefly for those who are not familiar with Weick’s work.
Understanding 'by-products' can be complex. One way to consider by-products is that they are the resulting actions that come about once 'trade-offs' (see the next paragraph) are made. For example, two people decide to meet at a certain time. Person A advises that they are not able to meet at the agreed time, so both people 'trade off' and come up with a new agreed meeting time. The new meeting time is the 'by-product', or outcome, of the trade-off. That's a reasonably simple example.
One way to explain 'trade-offs' is that they are the outcomes of the decisions we make, or the things we give up, in order to gain or do something else. Weick (1969, p. 35), when discussing trade-offs, gives an example: "The more general a simple theory is, for example, the less accurate it will be in predicting specifics". The trade-off here is that when we choose a simple theory that more people may understand, it will be less able to predict more difficult or specific factors. So how might we see this play out in risk and safety?
The following is a real-life example where it would have been easier to declare 'right/wrong' or 'safe/unsafe', but instead we considered the situation in the context of 'trade-offs' and 'by-products'.
I was with Graham this week, who had to do a job that required him to work on a residential roof. We were in a small country town, having travelled around two hours to get to the job. Graham was 10 minutes away from finishing when light rain started. His instinctive question to me (as a 'safety' person) was: "do you think this is safe to keep working?" How easy would it have been for me to say, "it's starting to rain and that increases your chance of slipping", or conversely, "no, it's only a light shower, I reckon you're safe to continue". The point is that the question is not that useful when the answer is considered only in terms of 'safe or unsafe'; there is far more to it than that.
I resisted this easy course of action, controlled my inner Crusader, and instead we spoke about the situation in terms of trade-offs and by-products. The trade-off of stopping the job was that Graham would be delayed for the next job and finish later that day. This would have meant that he missed seeing his child play sport.
A by-product of this could have been that once he re-commenced work (after the rain), he may have rushed the job, because missing his son's sport was not something he was prepared to trade off. Is the risk of Graham rushing for the rest of the day greater than the risk of him falling off the roof in light rain? Of course, no-one can accurately answer that question; it is subjective, and so is making decisions about risk.
This is just one very simple example, made simpler by the fact that a) Graham and I had a conversation about it, and b) I'm writing about it in retrospect and after reflection. The reality is that decisions like this (whether to stop work on the roof) happen constantly in our lives, and so many things shape them, most of which are unconscious. Boy, understanding decision making about risk is tough work. Where's that silver bullet?
When we think about this example in terms of trade-offs and by-products, instead of in a simplistic, binary way, do you feel we may actually explore risk in a far more meaningful way?
Perhaps understanding ‘grey’ better is the ‘silver bullet’?
I’d love to hear your thoughts, experiences and comments.
References:
Weick, K.E., and Sutcliffe, K.M. (2007). Managing the Unexpected: Resilient Performance in an Age of Uncertainty (Second Edition). San Francisco: John Wiley and Sons.
Weick, K.E. (1995). Sensemaking in Organizations. California: SAGE Publications Inc.
Weick, K.E. (1969). The Social Psychology of Organizing (Second Edition). New York: Random House, Inc.
Author: Robert Sams
Phone: 0424 037 112
Email: robert@dolphyn.com.au
Web: www.dolphyn.com.au