Perception, Heuristics, Cognitive Bias and the Psychology of Safety
By Dr Robert Long
Introduction

Managing safety is mostly dependent on judgement, perception and decision making. Whilst it is good and necessary to have legislation, standards and regulations in place, it is what humans decide to do with those regulations that determines whether they will be effective. Regulations of themselves do not make a safe workplace; it is when people comply with those regulations that their effectiveness is experienced. Similarly, education and training in themselves do not create safe behaviour, and knowledge and information in themselves do not guarantee effective risk management. It is also good to have safety management systems in place, but we need to understand that human interaction with regulations, knowledge and systems is just as important as the systems themselves. This paper explores the human aspect of risk and safety management and discusses key issues in understanding the psychological and cultural dimensions of safety management.

How we make decisions, what we base our decisions on and how we behave are all uniquely linked and interdependent. There is a view in safety and risk management which espouses that humans are the sum of their behaviours; this is the approach of Behaviour-Based Safety (BBS). There is another view which argues that human behaviour is evidence of thinking (cognition) and that behaviour is determined by right thinking; this is the approach of Cognitive Behavioural Theory (CBT). These views only partially account for how humans manage risk and safety. Humans are neither just machines (BBS) nor computers (CBT); they think and behave in context, in a culture and social environment. This is the approach of this paper and of the Psycho-Social Approach (PSA) to risk and safety management.

This paper discusses psychosocial factors which influence judgement and the decision-making process in managing safety. It examines three aspects of social psychology, namely perception, heuristics and cognitive bias, and how these affect human judgement and decision making.

Perception

Perception is the process of awareness or understanding of sensory information; it is about receiving, collecting and apprehending with the mind or senses, what Weick calls "sensemaking" (see Figure 1). We tend to rely a great deal on our senses and place a significant level of trust in our perceptions. Unfortunately, science has shown that our perceptions, particularly our visual cognition, are highly unreliable. This is because we make sense of our world and the stimuli that come to us through a range of lenses and filters such as our history, upbringing, culture, self-esteem, personality type, worldview and beliefs. It is for this reason that the idea of "common sense" is largely mythical; we share much less in common than we would like to believe.

When people view something they bring their sensemaking lenses with them, and they tend to take their preconceived ideas and see them whether or not they are really there. This is because humans understand new information through the bias of their previous knowledge, history and worldview. This is why learning and change are most often "scaffolded", i.e. new knowledge is developed through the support and comfort of what is already known. The extent of a person's knowledge creates their reality because the human mind can only contemplate that which it has been exposed to. When objects are viewed without understanding, the mind will try to reach for something that it already understands in order to process what it is sensing.
That which most closely relates to the unfamiliar from our past experience makes up what we see when we look at things that we don't comprehend. In order to manage the massive influx of information, demands for change and data which bombard us each day, we tend to rely on simple "rules of thumb" (called heuristics) to help make decisions. These rules of thumb can be most helpful at times in managing the constant demand for decision making in a stressful and busy world; however, in the area of safety management they can also be counterproductive.

[Figure 1 – Sensemaking: people's behaviour in a social context is shaped by the elements of sensemaking, namely identity, retrospect, cues, ongoing flows, plausibility and enactment.]

The concept of sensemaking is critical in navigating the issues of perception and decision making in safety management. Once we understand how humans make sense of things, and how our perception is conditioned by that sensemaking, we will be much better prepared to manage risk and safety. Weick calls sensemaking "the interpretation mindset". It is this mindset which helps justify commitment to our worldview (the beliefs we hold) and the way we interpret the things that we see and experience. Once committed to a particular mindset, and once justified, perceptions often become self-confirming and particularly difficult to shift or change.

Heuristics

Heuristics are educated guesses, simplistic judgements or "rules of thumb" (thinking shortcuts); one such rule of thumb is the notion of "common sense". A heuristic is a tool of thinking which helps individuals solve problems and come to a rapid solution when under pressure, stress or overload. These thinking tools (cognitions) are often based upon a long history of experiences which shape our attitudes, beliefs and values. Once a belief or value is in place it is often very hard to shift; pressure to abandon a belief or value triggers what is known as "cognitive dissonance", a form of thinking stress. Cognitive dissonance is an uncomfortable feeling caused by holding two contradictory ideas simultaneously. For example: "smoking is a dumb thing to do because it could kill me" and "I smoke two packs a day". The "ideas" or "cognitions" in question may include attitudes and beliefs, and also the awareness of one's own behaviour.

The idea of cognitive dissonance proposes that people have a motivational drive to reduce dissonance (contradiction stress, mental discomfort or a state of tension) by changing their attitudes, beliefs and behaviours, or by justifying or rationalising them. In the case of the smoker, the best way to reduce the dissonance is to quit smoking, but if they have tried and failed they must then reduce the dissonance by finding a justification for why smoking is OK or not that bad, e.g. it reduces weight, calms the nerves etc. The reality is that humans fail miserably when it comes to rational decision making.

Plous discusses the following heuristics ("rules of thumb") which are often used to interpret and manage perceptions:

The Representativeness Heuristic: people often judge probabilities by the degree to which one thing is representative of another, that is, by the degree to which A resembles B. Tversky and Kahneman called this the "conjunction fallacy": people make assumptions about the relationship between certain facts without evidence. What this means is that, based on what is represented to someone and based on their worldview (sensemaking), they are quite content to jump to conclusions. In safety this could mean that a worker thinks intelligence is linked to accidents, that people have accidents because they are "stupid", or that "accidents only happen to idiots". Similarly, if someone has never had an accident this can be attributed to superior intelligence: "I'm not an idiot so I don't have accidents". Unfortunately, the conjunction of the two is a heuristic with no evidence; incidents and accidents are not related to intelligence.
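The probability rule that the conjunction fallacy violates is simple to demonstrate. The following is a minimal sketch, using a hypothetical workforce whose figures are invented purely for illustration (they are not data from the paper):

```python
# Toy check of the conjunction rule: for any two events A and B,
# P(A and B) can never exceed P(A) or P(B). All counts are invented
# for illustration only.

population = 1000        # hypothetical workforce
had_accident = 120       # workers who had an accident
judged_stupid = 300      # workers judged "not intelligent"
both = 36                # workers in both groups

p_accident = had_accident / population
p_stupid = judged_stupid / population
p_both = both / population

# However "representative" the pairing feels, the conjunction is always
# the least probable of the three.
assert p_both <= min(p_accident, p_stupid)
print(f"P(accident) = {p_accident:.2f}")       # 0.12
print(f"P(judged stupid) = {p_stupid:.2f}")    # 0.30
print(f"P(both) = {p_both:.3f}")               # 0.036
```

Judging the conjunction ("stupid and accident-prone") as more likely than either fact alone, because the pairing feels representative, is exactly the error Tversky and Kahneman described.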
The Availability Heuristic: people assess the frequency of something through the availability of information to them. In other words, because people can bring to mind stronger memories of shark attacks, they believe shark attack is more widespread than being struck by lightning when, in fact, death by lightning is three times more probable than death by shark attack. Common events are easier to imagine than uncommon events, and events are more available depending on their emotional significance. In safety, workers rely upon availability to calculate the probability of an accident. For example, workers may remember the severity of an angle grinder accident but not the regular occurrence of accidents with hammers, and so develop complacency around the use of one tool rather than being careful with all tools.

It should be clear that these dispositions have significance for risk perception and for the decisions that need to be made about risk. It is easy to imagine how probability estimates associated with a new hazard that resembles an old hazard could be affected by the representativeness heuristic.

Cognitive Bias

These rules of thumb which we create allow humans to establish shortcuts to simplify decision making. The attraction of the "simplistic paradigm" is quite strong, whereas the attempt to explain all things in their complexity is quite stressful. We simplify our experience of the world to make it more predictable and manageable, and to align new information with old information. The power of heuristics distorts perceptions but seemingly helps us make decisions when confronted with complex choices. Associated with this are a range of cognitive biases which distort the way we see the world. Cognitive biases are mental errors caused by our tendency to use simplified information-processing strategies. The following are a selection of some common cognitive biases:

Anchoring and Adjustment
Giving disproportionate weight to the first information received, so that initial information anchors subsequent judgements; this is most influential when thinking about the unimaginable or unthinkable. For example, ask how thick a piece of paper would be if folded on itself 100 times. Most people answer a few metres, when in fact the correct answer is about 800 trillion times the distance between the Earth and the Sun.
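The paper-folding arithmetic is easy to verify, which is what makes the anchoring so striking. A rough check, assuming a sheet about 0.1 mm thick (both figures are order-of-magnitude only):

```python
# Thickness doubles with each fold, so 100 folds multiply it by 2**100.

PAPER_THICKNESS_M = 0.0001   # ~0.1 mm per sheet (assumption)
EARTH_SUN_M = 1.496e11       # mean Earth-Sun distance in metres

thickness = PAPER_THICKNESS_M * 2**100
ratio = thickness / EARTH_SUN_M

print(f"Thickness after 100 folds: {thickness:.3e} m")   # ~1.27e+26 m
print(f"Ratio to Earth-Sun distance: {ratio:.1e}")       # ~8.5e+14
# On the order of the "800 trillion times" quoted above; the anchor of
# "a sheet of paper" drags intuition many orders of magnitude too low.
```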
Attribution
Associating success with personal ability and failure with bad luck, chance or poor intelligence. For example, the language of "near miss", rather than "near hit", lends itself to attribution.

Compatibility Bias
Compatibility between a piece of information and the required response affects how much that information determines the response. For example, prioritising severity over frequency in risk assessment.

Confirmation Bias
If new information is consistent with our beliefs, we think it is well founded and useful. For example, after a debate between two political rivals, most supporters are convinced that their own representative won the debate.

Fact/Value Confusion
Regarding and representing strongly held values as facts. For example, "accidents only happen to idiots".

Overconfidence Effect
Feeling overconfident in the face of abundant data. For example, despite strong LTI data about hand injuries, not wearing gloves for a dangerous hand-held task.

Order Effects
Remembering data more easily at the beginning of a series than at the end.

The Primacy Effect
Characteristics appearing early in a series of information create stronger impressions than characteristics appearing later. For example, Asch's experiments found that the placement of words in a report made an impression on people's emotions and could create negative or positive responses to data.

Recency Effect
Being partial to data that is most recent and easiest to remember. For example, people can be made to think more positively by listing the positives of an argument first and the negatives second. So, if you are asked whether you would like to speak first or last on a program, take the first position. If there is a delay between the first and last presentation before an action is taken (e.g. two weeks), then you should present last.

Redundancy
Increasing confidence as the data becomes more redundant. For example, the development of complacency as a period of extreme risk expires and people relax as the perception of risk decreases.

Rosy Retrospection
Looking back and remembering the "good times". For example, workers remember an alliance project when relationships were rosy and incidents were low, and develop pessimism and negativity about the present based on the recency effect.

Sample Bias
Placing high value on a small sample that is flawed due to an inadequate sampling methodology. For example, based on "common sense", one asserts that glasses do not reduce the probability of eye injuries because there were no eye injuries on a job where people did not wear safety glasses (a toy calculation after this list makes the point concrete).

Selective Perception
Seeking data that will confirm your views. For example, looking at LTI statistics and finding no reported hand injuries among formworkers or steel fixers.

Status Quo Bias
Preferring alternatives that support the current conditions, because it is a safer strategy and involves less risk. For example, the "if it ain't broke, don't fix it" heuristic, which argues against change by relying on the past record as if the probability of an incident has been reduced.

Sunk Cost Effects
Making choices that support past decisions, even when those decisions no longer appear valid. For example, using past experience to refute new procedures: "we didn't need to do this in the old days and we were all OK, no one got injured". Unfortunately, this is selective recall, in line with the recency effect.

Wishful Thinking
Preferring a decision because its outcome is desired. For example, reducing safety paperwork in the face of operational, cost and production pressures.
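To make the sample-bias example concrete, here is a toy calculation. The 2% injury rate and 20 observed tasks are invented purely for illustration; the point is that even a real hazard will often produce a small sample containing zero injuries:

```python
# If unprotected eyes are injured at some true per-task rate, the chance
# that a small job shows no injuries at all is (1 - p) ** n.

p_injury = 0.02   # assumed true per-task probability of an eye injury
n_tasks = 20      # tasks observed on one small job (assumption)

p_no_injuries = (1 - p_injury) ** n_tasks
print(f"P(zero injuries in {n_tasks} tasks) = {p_no_injuries:.2f}")  # ~0.67

# Roughly two small jobs in three would "prove" that glasses are
# unnecessary, which is why a small, flawed sample says little about
# the underlying risk.
```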
Conclusions – What can be done?

In order to combat cognitive bias and heuristics, the following are necessary:

• Whenever faced with a decision that requires the assessment of future probabilities, remember to consider all the data before making a decision.
• Gather data from various sources.
• Intentionally seek out data that both supports and discounts your view.
• Ensure that Mavericks, Mavens and Devil's Advocates are not far from executive decision making.
• Define alternatives clearly; the status quo should never be the only option.
• Consider the problem on your own first, then gather input from others.
• For all decisions with a long history, verify that you are not giving undue consideration to sunk costs.
• Study the research on cognitive bias and require safety leaders to do the same.

Humans distort their perceptions and decision making, particularly when under stress. Heuristics and cognitive bias can help explain why a skilled operator will "filter out" hazards (through overconfidence or complacency), why a senior leader will not terminate a poor performer he has coached or mentored despite no measurable performance improvement (sunk cost effect), or why a leader will hold on to a safety system that delivers weak safety performance in the face of overwhelming evidence to the contrary (status quo bias).