Predictably Arational, Safety as a Superstition
One of the most annoying things about Dan Ariely’s book Predictably Irrational is that it allows risk and safety people to dismiss aspects of human decision making as stupid when they are not. The framing of the word ‘irrational’ is simply unhelpful. The hidden forces that govern our decisions are mostly non-rational rather than irrational. Whilst Ariely’s book is helpful in understanding human decision making, it doesn’t approach that understanding in a mature way. Ariely tried to rectify this in his follow-up publication, The Upside of Irrationality, but continues with the same poor ‘anchoring’ and ‘framing’. The fact is that many of our decisions are neither irrational nor rational but non-rational (arational); that is, they should not be considered in a binary or judgmental frame. The way we ‘anchor’ and ‘frame’ our worldview shapes the way we see risk and safety, e.g. a binary ‘black and white’ frame views decision making as either rational or irrational. Until people in risk and safety grasp this distinction, they will continue to deal with human decision making in a binary way and totally misunderstand what triggers risk taking and choice in safety. For example, the nonsense idea that ‘safety is a choice you make’ is sustained by this binary construct. The same construct promotes black and white goal setting and ‘blindness’ both to the by-products of target setting and to non black-and-white thinking.
One of the best ways to understand human judgement and decision making is to research such things as gambling, cults, religion and superstitions. These activities are a magnifying glass on human judgment and decision making. If our worldview assumes that decision making is a rational process, then none of these activities ‘make sense’. This blog discusses just one of them: superstitions, which are arational. The power of superstition is most obvious in the arts, theatre, sport, travel, architecture, business and safety. Perhaps the greatest superstition in safety is the attribution of meaning to injury statistics, but I will discuss that later in the blog.
Superstitions are about the sense of control needed in order to face uncertainty. People need to feel confident about risks in the face of uncertainty, and so delve into history and patterns looking for causes and reasons why things happen. There is nothing particularly wrong with faith in superstition: most of our sporting stars believe in it, many of our business people think about it and many in the arts live their lives by it. The idea that there are unpredictable ‘forces’ at work in the world is evidenced by the many peculiarities we interpret and attribute to mysterious causes and luck. Why do so many consult Feng Shui in architecture and design, ‘bless’ ships and transport craft, pray when they get sick, or avoid certain numbers and colours in an age of science? You can read about superstitions here:
https://list25.com/25-strangest-superstitions-ever/
Why do so many people consult astrology, maintain traditions in theatre (don’t say the word ‘Macbeth’ but ‘break a leg’) and travel (http://travelblog.viator.com/travel-superstitions/) if the world is truly just binary, rational and scientific? Why do so many sensible and sane people avoid certain behaviours, colours, bad omens, times and language in critical moments if this is all irrational nonsense?
The trouble is that science and rationality don’t match our experience and ‘feeling’ of risk. Science and rationality don’t seem to match the correlations humans make between patterns in life and outcomes. Little did we know that teams lose grand finals for many ‘hidden’ reasons that we are only now beginning to understand, with the help of social psychology. Many people have a range of rituals and traditions they perform in order to provide comfort in the face of uncertainty. We look back at the grand final and find reasons why we lost or won: those socks will be worn from now on, or I won’t catch a bus to the game any more.
Adam Alter (Drunk Tank Pink) shows that there are many factors affecting risk decision making that are beyond our control. We are radically affected by names, labels, symbols, colour, location, weather and the presence of others. The temperature and time of day can affect our decision making, or even the colour of someone’s shirt. Even a superstition itself can heighten anxiety and hence change a decision. This is the source of OCD (Obsessive Compulsive Disorder), that is, excessive anxiety about uncertainty and the need to control outcomes by some activity. I have a number of friends who have various forms of OCD, and the last thing they need to hear from me is the judgmental label that they are ‘irrational’. We all trust patterns in our living, and these provide comfort and predictability. As Nietzsche observed in The Twilight of the Idols, we all fear losing a powerful symbol, activity, god or technology that provides comfort. So my friend who must touch all the car door handles before he enters the car will not be helped by me telling him he is an idiot. My friend who must not do shopping on Fridays and must not walk on cracks will not be helped by my labels of ‘irrationality’.
Jack Nicholson portrays the OCD disposition so well in the movie ‘As Good as It Gets’, and all these superstitions rest on Fundamental Attribution Error (FAE). FAE is also known as ‘correspondence bias’ or ‘attribution bias’, and it works hand in hand with confirmation bias. These are some of the things we study in the social psychology of risk. There are so many non-rational causes of FAE, and risk and safety people would be wise to understand these much better rather than attribute poor decision making to stupidity. The determinist view that ‘safety is a choice you make’ simply fosters blaming, superiority and the misattribution of choice. We can’t influence OCD, FAE or any non-rational worldview with a rationalist response. Neither can we change cultural beliefs with systems responses. The beginning of change and influence is empathy and understanding, and at this stage it seems that safety is not good at either. ‘Safety engineering’ and ‘safety science’ don’t help safety people understand worldviews and non-rational decision making. Safety is so good at ‘telling’ people they are stupid rather than seeking to understand the choices people make. When we apply belief to Fundamental Attribution Error, OCD takes control. But safety itself is no better; it cannot take any superior view of OCD.
The way safety delights in counting injury data is nothing less than OCD and FAE. Counting injury data is a projected comfort that imagines it ‘measures’ everything under its control. In reality, injury data tells us nothing about culture, nor does it provide any certainty about the future. You can’t predict the future by measuring the mistakes of the past. Injury data tells us nothing about belief, sub-culture or the reasons for decision making. Yet safety can’t live without injury data. Safety needs injury data to justify its beliefs and its work. It attributes so much meaning to injury data patterns when no such meaning is there; it makes interpretations about ‘regression to the mean’ no less than a football fan attributes winning to a half-time ‘pep talk’. The myth of injury data’s value is heightened by CEOs who hold the same superstitions about injury data, and this endorses deeper OCD and FAE: it’s a cycle no less superstitious than a rabbit’s foot or avoiding the number 13. So much is attributed to injury data that doesn’t make sense, yet it is rarely challenged.
I was at a conference in Brussels recently where a presenter declared that injury data was motivational, and there was no challenge from the audience. I wonder how the presenter would have coped had someone asked, ‘Can you please explain how injury data is motivational?’, ‘How does injury data motivate?’, ‘By what dynamic or force does data motivate?’ But no-one asks these questions; most of safety believes the superstition. We keep on demanding reports on what went wrong but rarely learn from them, because safety doesn’t understand the fundamental arational drivers of decision making.
Safety would be much better served if it knew more about what people believe and value than about counting injuries. This can be done qualitatively, by learning how to consult effectively, or quantitatively, by using such technology as the MiProfile (http://vimeo.com/24764673) diagnostic. Either way, safety needs to move on from a deficit worldview, and that is not likely until it is able to let go of the calculative counting of injury data, so effectively ‘primed’ by zero. At the same conference presentation I referred to above, the presenter explained how the organisation was moving from ‘calculative’ thinking to ‘generative’ thinking (Hudson), and then used calculative language and discourse to explain the transition.
In the end, most of the noise about being generative in safety is explained in calculative ways, which makes it calculative. The only way to really move from a ‘calculative’ culture in safety is to suspend the superstitious belief in injury data, stop making data a reference point and confess that your organisation is OCD about safety.
Do you have any thoughts? Please share them below.