Predicting Earthquakes and the Fear of Uncertainty
Guest Post by Dr Rob Long – Scary Stuff!!!!
The news this week that six scientists received gaol sentences (http://www.guardian.co.uk/world/2012/oct/23/italian-scientist-earthquake-condemns-court) for not communicating effectively about risk should send a shiver up the spine of anyone interested in risk and safety. An Italian court’s decision to convict six scientists on manslaughter charges for failing to predict the deadly quake that devastated the city of L’Aquila redefines the very meaning of the words risk and science. Risk, as defined by AS/NZS ISO 31000 Risk Management – Principles and Guidelines, is ‘the effect of uncertainty on objectives’. Science is neither objective nor absolute; it involves the interpretation and beliefs of humans, and it changes over time. What nonsense to propose that the occurrence and intensity of an earthquake must be predictable. Last time I looked at my insurance policy, an earthquake was described as an ‘act of god’. In other words, earthquakes are ‘natural’ disasters that only gods know about.
The quest for certainty is a fundamentalist quest; the fear of doubt is the fear of fallibility and of losing control. Any language about risk that asserts black-and-white knowledge says more about the ‘wishes’ of the communicator than about the reality of the risk. The court’s verdict is a denial of risk itself. L’Aquila, which still lies in ruins, was also damaged by earthquakes in 1349, 1461 and 1703. Any communication about risk is a communication about doubt and uncertainty. The six scientists were convicted for failing to properly predict a deadly earthquake and for suggesting that the community had nothing to fear regarding the occurrence of one. Such language is a denial of history and of the meaning of the word risk.
The handbook to AS/NZS ISO 31000 is HB 327 Communicating and Consulting about Risk. This concise manual ought to be compulsory reading for all safety and risk people. It gives a brief overview of some of the essentials of social psychology and the communication of risk, and it starts by stating what should be obvious: ‘risk management takes place in a social context’. In other words, how we assemble and engage with others affects the way we assess and manage risk. The assessment of risk is neither an objective nor a neutral process, despite all the nonsense projected about the value of a risk matrix. At best a risk matrix is a tool for conversation; there is no objective evaluation of risk. The assessment of risk is neither scientific nor reliable.
HB 327 sets out a list of the important factors that affect the communication of risk (p. 6); these include:
1. Context
2. Culture
3. Knowledge
4. Language
5. Cognitive bias (there are over 200 human perception biases)
6. Heuristics
7. Rules of thumb
8. Political power
9. Motivation
10. Perceptions
11. Complexity
12. Timeframes
13. Interference
Risk and safety training that doesn’t cover these fundamentals will always be inadequate and simplistic. The language of ‘safety science’ or ‘safety engineering’ should not give anyone the idea that the discourse about risk and safety is about certainty. If any communication involves human interchange and exchange (in other words, all communication), then human fallibility will affect the filtering of that information.
Once we know that many social psychological factors affect the perception, assessment and communication of risk, we become much more realistic about the goals we set for managing it.
I attended a safety conference recently where more fear was being peddled by lawyers, more fear was being preached by safety people and more claims to safety certainty were being marketed, yet there was no mention of the 13 factors listed in HB 327. So much expertise in fear in one place is indeed a scary thing. Without a clear focus on these 13 factors, the decision of the L’Aquila court makes sense. If the risk and safety community want a similar decision to L’Aquila made here in Australia, then we are heading in the right direction. Yet despite all the fear, the WHS Act still talks about ALARP (‘as low as reasonably practicable’) and ‘due diligence’, both extremely subjective and obscure terms. Isn’t it strange that the safety profession is so focused on paperwork when it is the 13 factors of HB 327 that matter to the court? The mythology that paperwork offers more certainty in court than testimony about the 13 factors is fuelled by the climate of conferences like the one I attended this week.
Any talk of risk in absolute or perfectionist terms is delusional talk. Claims of absolutes in conversations about risk are a denial of uncertainty and human fallibility. When people hear such nonsense, they are neither motivated nor inspired to work effectively with risk.
Do you have any thoughts? Please share them below