The Sully Effect

by Dr Rob Long on December 2, 2016

in Psychology of Safety and Risk



I wrote recently about the Bradbury Effect: the delusion that there is no chance, luck or randomness, coupled with fundamental attribution error. A critical part of the Bradbury Effect is the value attributed to denial in the light of hindsight bias. It's amazing how smart we become as humans when we are not 'in the moment'. Another example of this is the Sully Effect.

The story of Sully is about the Miracle on the Hudson. For those who have watched the movie, it is about how engineers and bureaucrats (after the event) seek to find causes and rational reasons for the decisions of Captain Chesley 'Sully' Sullenberger. The movie effectively demonises the enquiry and the NTSB investigators (http://qz.com/778011/sully-ntsb-investigators-are-not-happy-about-being-made-the-villains-in-clint-eastwoods-film-starring-tom-hanks-as-chesley-sully-sullenberger/).

The Sully Effect demonstrates the power of heuristics (https://en.wikipedia.org/wiki/Heuristic) and implicit knowledge (https://en.wikipedia.org/wiki/Tacit_knowledge) in decision making. This is how we all make decisions most of the time. The idea that decision making is both rational and like a checklist is nonsense in the light of the evidence. Humans simply couldn't live if every decision were made rationally (see further One Brain Three Minds https://vimeo.com/106770292). This doesn't mean that human decisions are irrational, but rather that most of our decisions are arational (non-rational). That is, we decide using 'fast and frugal' processes (neither rational nor irrational) built on experience, such as heuristics and implicit knowledge (the best place to read further on this is Gerd Gigerenzer https://en.wikipedia.org/wiki/Gerd_Gigerenzer). In the pace of this world (Moore-Ede, M., 1993, The Twenty-Four Hour Society) there are countless examples of major disasters averted by heuristic thinking and fast and frugal (collective) decision making (https://www.youtube.com/watch?v=seDPVqQKORE).

Implicit or tacit knowledge (expressed through heuristics) is not really about knowing by feeling, but rather a form of knowing that resides in the unconscious and surfaces when millisecond decision making is required. This is what the story of Sully is about.

Heuristics and implicit knowledge operate from the unconscious. When we say we made a decision 'without thinking', that is exactly right. Most of the time this form of unconscious decision making keeps us perfectly safe. Some also know this as auto-pilot (automaticity – https://en.wikipedia.org/wiki/Automaticity). What we really mean is that we made a decision without rationally thinking. We do this individually and collectively. It is in the collective unconscious that cultures and organisations support group heuristics and tacit knowledge. This is what happened in the cockpit with Sully and his crew.

What the Sully Effect also highlights is the post-event (rational) intelligence of those who have the luxury of hindsight and time. I have seen this time and time again in the various crises I have been involved in (e.g. Beaconsfield, the Canberra bushfires). The idea that people collectively make unconscious decisions is really not the domain of engineers, safety people and bureaucrats, and this always surfaces in post-event investigations. (The SEEK Program (next available in Melbourne in June 2017 http://cllr.com.au/product/seek-the-social-psyvhology-of-event-investigations-unit-2/) tackles such problems in investigations and provides new tools for understanding how decisions are really made.)

It is in post-event investigation that all the lovers of measurement surface, yet in the moment of a crisis there is no time to measure, reflect or ponder. Most safety investigations work under the crazy assumption that decision-making is like using a checklist (individual and rational), totally ignoring the social-psychological realities of decision making. So when someone like Sully succeeds through such decision making he is lauded as a hero, but had anything gone wrong he would have been crucified and, by rational standards, labelled 'an idiot'.

Of course there is no education about this in any orthodox safety curriculum (https://www.safetyrisk.net/isnt-it-time-we-reformed-the-whs-curriculum/). WHS is still plagued by pyramids, Swiss cheese and curves. The WHS curriculum keeps the safety non-profession locked in a rationalist/reductionist time warp (read Dekker, Safety Differently), so removed from the reality of decision making that it remains irrelevant to the rest of the workplace. Safety still thinks 'safety is a choice you make', that 'all injury is avoidable', that dumbing down is good, and it speaks in perfectionist language (zero).

If you want to learn more about human judgment and decision-making and the role of the collective unconscious in cultural decision making, then you might want to join one of the Introduction to the Social Psychology of Risk workshops being held in January, February and March 2017 (Europe, Sydney and Adelaide).

Linz, Austria Workshop: 17–18 January 2017

http://cllr.com.au/product/international-workshop-introduction-social-psychology-risk/

Sydney Workshop: 8–10 February 2017

http://cllr.com.au/product/an-introduction-to-the-social-psychology-of-risk-unit-1/

Adelaide Workshop: 8–10 March 2017

Flyer available soon. For more information:

Contact admin@cllr.com.au

Dr Rob Long

Expert in Social Psychology, Principal & Trainer at Human Dymensions
PhD, MEd, MOH, BEd, BTh, Dip T, Dip Min, Cert IV TAA, MRMIA. Rob is the founder of Human Dymensions and has extensive experience, qualifications and expertise gained over 30 years across a range of sectors, including government, education, corporate, industry and community. Rob has worked at all levels of the education and training sector, including serving on various postgraduate executive, supervision, and course design and implementation programs.
