Originally posted on June 28, 2021 @ 3:47 PM
Foresight Blindness, Hindsight Bias and Risk
One of the beauties of fallibility is what Alan Watts called The Wisdom of Insecurity. Watts describes how binary worldviews create fear and dogma, frightening people into concocting securities that don’t exist. Watts’ thinking helps one step out of the binary worldview and accept a way of being that lives dialectically in uncertainty, understanding the place where wisdom is experienced. Securities concocted out of fright and fear are generally maintained by demonizing ‘the other’ and all that creates uncertainty. Such a worldview understands risk as the terror of fallibility, not its beauty, and so creates the dogma of zero as if ‘nothing’ could give certainty.
In such a worldview one looks back in hindsight and says ‘all accidents are preventable’ and ‘safety is a choice you make’. Such a view always looks backwards, not realising that everything it hopes for looks forward and is founded on hope and faith.
Foresight blindness is the foundation of fallibility and risk, the reality of living in a world of life and learning. The very meaning of the word ‘risk’ is about what is NOT known. The fear of risk is the fear of not-knowing. This is why people find great comfort in Hindsight and adore the concocted delusions and dogma of predictive analytics and the nonsense linguistics of zero.
If you want to understand Risk, here are some of my favourites:
- Bernstein, P., (1996) Against the Gods: The Remarkable Story of Risk. Wiley and Sons, New York. (https://matrixtrainings.files.wordpress.com/2014/09/against-the-gods-the-remarkable-story-of-risk-1996-peter-l-bernstein.pdf)
- Douglas, M., (1992) Risk and Blame: Essays in Cultural Theory. Routledge, London. (https://monoskop.org/images/1/1d/Douglas_Mary_Risk_and_Blame_Essays_in_Cultural_Theory_1994.pdf)
- Gardner, D., (2008) Risk: The Science and Politics of Fear. Virgin Books, London. (https://www.dangardner.ca/publication/risk)
- Houk, J., (2017) The Illusion of Certainty: How Flawed Beliefs of Religion Harm Our Culture. Prometheus Books, New York. (https://au1lib.org/book/4988960/0d0c62?id=4988960&secret=0d0c62)
- Kierkegaard, S., (1844) The Concept of Anxiety: A Simple Psychologically Oriented Deliberation in View of the Dogmatic Problem of Hereditary Sin. Norton and Co., New York. (http://lib.stikes-mw.id/wp-content/uploads/2020/06/The-Concept-of-Anxiety_-A-Simple-Psychologically-Orienting-Deliberation-on-the-Dogmatic-Issue-of-Hereditary-Sin-PDFDrive.com-.pdf)
- Slovic, P., (2000) The Perception of Risk. Earthscan, London. (https://www.thecampbellinstitute.org/wp-content/uploads/2017/05/Campbell-Institute-Risk-Perception-WP.pdf)
- Slovic, P., (2010) The Feeling of Risk: New Perspectives on Risk Perception. Earthscan, London. (https://www.researchgate.net/publication/274532126_The_Feeling_of_Risk_New_Perspectives_on_Risk_Perception)
- Taleb, N., (2007) The Black Swan: The Impact of the Highly Improbable. Random House, New York. (http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.695.4305&rep=rep1&type=pdf)
Just read any of these books and you will quickly dispose of the nonsense ideology of zero.
If you believe in risk and understand risk, you will know not only that people will be harmed but that harm is inevitable. If harm is inevitable, then the question is not the silly binary one of ‘how many do you want harmed today?’ but rather, if you are harmed, ‘what resilience, skills and wisdom have you cultivated to manage harm?’
The fear of harm is not just the fear of fallibility and learning; it is also the fear of fear. If risk doesn’t make sense (https://www.humandymensions.com/product/risk-makes-sense/) then it’s time you gave away the work of safety. Zero not only denies risk but deems Risk the enemy. The very existence of the safety industry is premised on the reality and inevitability of harm. The ideology of zero is in complete denial of such reality, which is why it is a mental health disorder.
Foresight Blindness keeps human persons humble, enquiring, imagining, discovering, learning, hopeful, faithful and engaged in shared conversation and listening. Shared Foresight Blindness should inhibit blaming, false consciousness, hindsight confidence and arrogance, the linguistics of prediction and naïve rationality.
It is only the delusional who set a goal they can’t achieve and then spend the rest of their lives counting the number of times they don’t achieve it!
When we accept the inevitability of fallibility and the reality of uncertainty we don’t become resigned to fatalism. Fatalism, like zero, is the product of binary entrapment and fear. Wisdom in personhood comes from rejecting binary entrapment, focusing on the movement of being and learning, and stepping forward in life by faith. Faith is not just a religious word but a word that describes how persons make decisions based on trust and hope in the face of risk. People who understand risk need not apologise for using the word ‘faith’. A little read of Mollering would be helpful: https://www.kent.ac.uk/scarr/events/Mollering.pdf
When we step forward looking backward we generally crash into things.
Psychological crashing is what happens when people put their faith and trust in paperwork as if it were some pathway to zero. It only takes a few questions from a lawyer in WHS to dismantle the delusions of faith in paperwork (https://www.booktopia.com.au/paper-safe-gregory-w-smith/book/9780987630001.html). Paperwork (https://vimeo.com/162034157) is not only dated the moment it is written, it also draws the safety industry away from the immediacy and ongoing risk thinking of conversational listening, what we in SPoR call ‘iCue Listening’ (https://safetyrisk.net/conversational-icue/). This is the starting point for everyone who takes on the Introduction to SPoR. The next free module will start in September (https://cllr.com.au/product/an-introduction-to-the-social-psychology-of-risk-unit-1-free-online-module/). Every time we conduct this module, we find that no-one from the safety industry knows how to listen in risk.
The same problem exists when one supposes that the reason paperwork fails is that it hasn’t been composed to a certain worldview of useability. In order to tackle risk one has to think critically about worldviews, not just technique. The Education and Teaching professions, with their knowledge of human literacy, linguistics, reading and comprehension, understand these as a wicked problem. Useability Bias is similar to Hindsight Bias and is evidence of an undisclosed worldview. Professionals in Education and Teaching know the many schools of thought in the development of language in children and resist the declaration of simplistic mechanistic formulas (https://safetyrisk.net/its-always-about-paperwork/).
The real way forward in risk is the acceptance of risk outside of simplistic binary formulas. The last thing workers need is some hero spruiking the certainties of zero. The last thing workers need is a form designed by an engineer telling them how to think about risk. The last thing workers need in facing uncertainty is some composition assuring them that uncertainty can be eradicated. This is why the way forward in SPoR is proposed through iCue Listening, and we know it works (https://www.humandymensions.com/product/it-works-a-new-approach-to-risk-and-safety/) in bringing together the shared wisdom of workers who know the uncertainties they have to face each day.
Wynand says
About paperwork – I was told about a plant foreman who could tell if his plant was running as it should by listening to how it sounded. I told this to plant operators and foremen on a plant I visited, and they all supported the notion that experienced operators can hear, smell and feel if a plant is operating normally. I also saw a plant manager open a sample point, put in a spade, take some material, and by feeling its texture he could say whether the plant was running too hot, too cold or within ideal parameters. I would like to challenge anyone to put these observations into a number on a report. Imagine a foreman reporting “the plant sounds and smells right” in a production meeting. Yet, in his day-to-day work that is probably what he uses most to make sure that production is running smoothly. I imagine this goes for safety also – the experienced operator may predict and prevent an accident by knowing something is out of kilter, IF he is allowed by safety to act on this “instinct”. By the way, I have yet to see an account of how many accidents are prevented on a daily basis, since it is probably impossible to know. Often, just doing your job correctly involves a constant prevention of accidents. (Every turn you make on the steering controls of a moving car prevents an accident, after all.)