The Link Between Think and Blink

I attended another engaging ‘thinking group’ meeting this morning on the Central Coast of NSW in Australia. The meeting was organised by James Ellis, and some new ‘thinkers’ joined us at today’s meeting, including: a clinical psychologist who also practises in forensic psychology; a training manager with a background in community care; an accountant with an interest in behavioural economics; a quality/safety manager who proudly claimed “I just love learning from different people”; and a PhD doctor who teaches an MBA program.

Throw into the mix three mugs who are doing their best to learn about social psychology and risk, and what comes out is some great thinking from a diverse group of people, all with at least one thing in common: a desire to share, experience and learn as part of a community.

While our Thinking Groups usually run with no set agenda and with minimal formalities, we generally have a theme that is often around a book, a topic or an idea that one of the group is keen to discuss. Thinking Groups are not unlike a book club.

This morning’s theme was loosely based around Malcolm Gladwell’s book Blink, which is described on the book’s website as:

“a book about how we think without thinking, about choices that seem to be made in an instant-in the blink of an eye-that actually aren’t as simple as they seem.”

I first read Blink around 12 months ago and it helped me cement the idea of conscious and non-conscious thinking and decision making. In particular, Gladwell refers to ‘snap judgments’ which are those judgments made in the ‘blink’ of an eye and without us even realising we are making them. I enjoyed talking through some examples of snap judgments this morning and it had me thinking about how the examples that Gladwell refers to in his book relate to risk and safety.

For example, he refers to the story of a Formula One driver who, when leading a race, was about to turn into the final left-hand corner and claim victory. What he didn’t know was that there had been an accident around the corner that he could not see. For no reason apparent or obvious to him at the time, the driver slowed down considerably as he headed into the corner, which meant he was able to narrowly avoid the smashed vehicles and cruise through to victory. But why did he slow down? Was this an act of God, who shone down on him in a critical moment of his life? Was it just ‘not his turn yet’? The driver had no clue and could not explain his decision, until…

A few months later, while lying in bed, a thought came to the driver. He remembered seeing a crowd of people on race day, all looking to their right as he approached the corner. All of a sudden he realised that this was no act of God, nor did he possess super powers. Instead, what he believed had happened was that he caught a glimpse of the crowd out of the corner of his eye and then made a non-conscious ‘snap judgment’ to slow down.

Could it be that reflection makes sense?

I wondered how this might play out in risk and safety. What would we in risk and safety do if a similar situation occurred at work? Would we try to develop and conduct a training program for all Formula One drivers to be better at observing crowds? Of course not; this sounds ridiculous, right? But why is it that when we conduct investigations in safety we must get to the bottom of everything and focus on fixing and closing things out? If the Formula One driver had taken this approach, what would that have meant for him? What impact could a process focused on ‘closing things out’ after an incident have on learning from incidents? Would we in risk and safety be mature enough to accept the type of situation that occurred for the driver?

Another story I loved is of a national philharmonic orchestra in Europe and the process they followed each year for their new intake of musicians. The example Gladwell refers to is from the 1970s, when there was a strong tradition of choosing predominantly male musicians. The theory was that male musicians were obviously more highly skilled. Perhaps their larger hands made it easier, or could it be that males had a better ear for music, or was it that males worked harder, and hence were just better? Or maybe none of the above?

Gladwell goes on to describe how, during one year of auditioning, a cousin of one of the judges was trying out. So, to make things fair, they decided to put a screen between those auditioning and the judges. What was intriguing was the outcome this approach created. Would you believe more women were chosen in that particular audition?

This must have meant some extraordinary women with amazing skills auditioned that year, right? Nope! Rather, this was a demonstration of the long tradition of judges making decisions based on what they saw, not on what they heard. Yes, that’s right: the judges were making decisions based on gender, not skill. Just another example of a ‘snap judgment’ based on a non-conscious bias.

If you’re not convinced by this story, you should read Gladwell’s story about the high percentage of tall men (more than 35%) employed in the position of CEO in Fortune 500 companies. Surely experienced Board Directors, with many years in their role, would be immune to the biases associated with our non-conscious. Nope again!

The final Gladwell story I’d like to share is about research into doctors making decisions about heart conditions. The research revealed that the more information doctors were provided with, the worse their decision-making was. Sounds counterintuitive, right? How could doctors make worse decisions when they had more information? Surely they got that wrong? There must have been a flaw in the research? Nope, it was right.

If you’re still struggling to come to grips with this, it might help to elaborate on what type of information was not helpful. Just as the orchestra judges could make mistakes in choosing their candidates, so too can doctors make mistakes based on prejudice. How does this work? As Gladwell explains:

They instructed their doctors to gather less information on their patients: they encouraged them to zero in on just a few critical pieces of information about patients suffering from chest pain–like blood pressure and the ECG–while ignoring everything else, like the patient’s age and weight and medical history. And what happened? Cook County is now one of the best places in the United States at diagnosing chest pain.

I wonder if we can reflect on Malcolm Gladwell’s Blink and consider it in risk and safety. For example:

· What common situations do we find ourselves in where we could make ‘snap judgments’? Do we allow ourselves time to reflect and consider how these impact on our decisions?

· How would we cope if we were faced with a situation where a reasonable explanation for an event came three months after it happened? What processes do we develop, implement or encourage in risk and safety that close out our thinking about incidents, and might in turn mean we miss out on valuable learning?

· Are there situations where we non-consciously have biases based on prejudices? Are there times when we could put a metaphoric screen up, just like the orchestra judges, when we conduct investigations? What prejudices do you think might impact on your decisions?

What are your thoughts on the link between think and blink?

As usual, I’d love to hear your thoughts, experiences and comments.



Author: Robert Sams

Phone: 0424 037 112



Facebook: Follow Dolphyn on Facebook

Rob Sams
Rob is an experienced safety and people professional, having worked in a broad range of industries and work environments, including manufacturing, professional services (building and facilities maintenance), healthcare, transport, automotive, sales and marketing. He is a passionate leader who enjoys supporting people and organizations through periods of change. Rob specializes in making the challenges of risk and safety more understandable in the workplace. He uses his substantial skills and formal training in leadership, social psychology of risk and coaching to help organizations understand how to better manage people, risk and performance. Rob builds relationships and "scaffolds" people development and change so that organizations can achieve the meaningful goals they set for themselves. While Rob has specialist knowledge in systems, his passion is in making systems useable for people and organizations. In many ways, Rob is a translator; he interprets the complex language of processes, regulations and legislation into meaningful and practical tasks. Rob uses his knowledge of social psychology to help people and organizations filter the many pressures they are made anxious about by regulators and various media. He is able to bring the many complexities of systems demands down to earth to a relevant and practical level.
