Grappling with culture is one of the things Safety does poorly. After all, the industry and its STEM gestalt are not well equipped even to define culture, as is most evident in the OHS Body of Knowledge (https://www.ohsbok.org.au/). Unless the industry is prepared to step outside of STEM, it isn’t going to get far in addressing the issue of culture. Indeed, it seems Safety finds it easiest to put culture in the ‘too hard’ basket, simply because the industry is not prepared to take a transdisciplinary approach to knowledge. Perhaps the current strategy demonstrates that culture is regarded as too much of a ‘wicked problem’ to tackle. Indeed, when STEM looks for ‘gaps’ in knowledge within its own paradigm, it doesn’t even consider the critical issue of ‘knowledge cultures’ (https://www.tandfonline.com/doi/abs/10.1080/09540091.2016.1273880; https://sts.univie.ac.at/en/research/completed-research-projects/transdisciplinarity-as-culture-and-practice/; https://www.sciartmagazine.com/uploads/6/0/8/9/6089526/towards_a_transdiciplinary_culture_by_julia_buntaine.pdf).
Artistic knowledge, poetic knowledge, indigenous knowledge, anthropological knowledge, philosophical knowledge, educational knowledge, theological knowledge, transdisciplinary knowledge, phenomenological knowledge, existentialist knowledge, social psychological knowledge, discourse theory knowledge, cultural theory knowledge and semiotic knowledge are all represented in cultures that stand outside the knowledge culture of STEM. No wonder Safety struggles with culture and doesn’t seem to get anywhere. Even its maps of meaning (semiosis) in knowledge are infused with STEM metaphors (https://safetyrisk.net/why-metaphors-matter-in-risk/), and these dominate the OHS BoK.
So, if you are challenged in your organisation by elusive cultural issues, perhaps put down the OHS BoK and start with Yuri Lotman (1990) ‘Universe of the Mind: A Semiotic Theory of Culture’ or Lotman (2013) ‘The Unpredictable Workings of Culture’. Unfortunately, you won’t find any of the knowledge cultures listed above in the OHS BoK, nor any discussion of ‘knowledge cultures’ anywhere in the OHS BoK framework, nor any reference to the work of Lotman (and many others critical for understanding culture). Indeed, when one looks at the OHS BoK for essential texts on culture, dozens of critical texts are missing.
The best way to achieve some success in cultural understanding and change is to expand and broaden one’s understanding of culture itself. Perhaps this is why organisations struggling with the issue of culture and safety don’t seem to get anywhere. When an institution thinks resilience can be engineered (https://safetyrisk.net/why-resilience-cannot-be-engineered/), or that fallibility can be denied (http://visionzero.global/), or that culture is behaviourism (as throughout the BoK), you really do have a problem in understanding humans and the ecological foundations of culture.
One of the biggest issues for the safety industry in understanding culture is the adjective it puts before the word ‘culture’ that shapes its cultural discourse: ‘safety’. If one ‘frames’ one’s understanding of culture through the worldview of safety, then it is not likely that much about culture will be tackled in response to concerns. Indeed, most of what circulates in the safety industry on culture is simply a confusion of culture with behaviourism or systems. The language of systems and behaviourist metaphors is used interchangeably with the word ‘culture’. The classic doofus understanding is that culture is ‘what we do around here’, the classic behaviourist language that makes Safety think culture can be observed and measured.
When the SPoR team helps organisations tackle the challenges of culture, we approach the issue of culture and safety from a view completely outside of STEM. We bring a broad transdisciplinary approach to the challenges of culture and risk, utilising the diversity of the SPoR team. The trouble is, tackling a wicked problem is not simple, easy or quick, and it seems that organisations and CEOs are seduced by the safety quick fix and the safety silver bullet, which is why nothing changes.
We had some good news this week: one of the SPoR team in Sweden has been given the green light to pilot a SPoR approach to risk in a tier-one organisation involved in nuclear power. Similarly, we are working with two global tier-one companies, in forestry and in oil and gas, on culture change in safety, with good results at these early stages. Culture shouldn’t sit in the ‘too hard’ basket, nor should it be perceived as some safety trinket. Tackling it is doable, but it requires a different worldview and stepping outside the blinkered STEM paradigm.
bernardcorden says
I don’t expect too many safety sycophants have come across Thomas Merton, you won’t find any mention of him in the Ayn Rand Fountainhead of Safety or OHS BOK.
https://en.wikipedia.org/wiki/Thomas_Merton
Rob Long says
When your worldview is defined by an industry that measures and creates meaning through zero, then mechanistic, numeric and object metaphors abound. Knowledge is then defined by knowing-that, not knowing-how. This is why safety has become such an abstract, paper-based phenomenon. We make workers ruminate and reflect on SWMS, risk assessments, pyramids, bow ties and curves, yet none of this is actually used to make decisions in daily practice. That there is nothing in the BoK on intuitive knowledge, habit and the unconscious is telling.
bernardcorden says
I found the recent class action and ruling on negligence following the Queensland 2011 floods very interesting. Their operating manual was used as evidence against SunWater et al.
Rob Long says
So engineers write a manual and then, in the heat of the moment, ignore it. I wonder if they are going to turn to engineering to find the resilience they need to understand human judgment and decision making that doesn’t make logical sense? Unfortunately, calculative thinking and paperwork rarely help in a crisis, but an ethical framework for understanding intuitive knowledge, drawn from non-STEM knowledge cultures, is invaluable in such times.