Originally posted on June 1, 2013 @ 11:53 AM
A number of disciplines such as engineering, safety and mechanics give very limited training focus to human uniqueness, people skills, communication, education, learning, motivation and human relationships. When one’s primary training is focused on ‘things’ such as concrete, steel, mathematics, regulation and legislation, it can seem that ‘soft’ skills (I prefer the less pejorative ‘human’ skills) are unnecessary. Yet when one leaves training and enters the workforce, the skills most required for success are human skills. There is nothing wrong with a foundation of mathematics, engineering, legislation and mechanics, but it doesn’t make sense to apply the fundamentals of this knowledge to humans as if humans are machines.
The mechanical, systematic and rationalist mindset seeks formulas, patterns and predictability through a ‘scientific method’. For the mechanical mindset, decisions are made by exploring options, weighing the value of each and choosing the best on the basis of values, strengths and weaknesses. However, this is not how people really make decisions. Most human decision making happens without ‘thinking’: by intuition, emotion and the unconscious. The idea that a human can be programmed, predicted and controlled is a delusion. Yet when it comes to risk and safety, the industry seems preoccupied with engineering and predictability, and this mindset is flabbergasted when humans don’t behave according to the controls. The only available response is that the other person was stupid, an idiot, or both.
A debate has been raging on LinkedIn for some time about the mythology of Heinrich’s Pyramid. The debate has focused on causality and percentages, as if humans are machines, and has been blindsided by a fixation on a formula created 80 years ago by speculation from insurance data. Many contributing to the debate have no experience or expertise in social psychology, neuropsychology or the unconscious, and have been consumed by technical ‘safety speak’ about unsafe acts and conditions. It is strange to read the mechanistic mindset, with no basis in research or psychology, spout forth ideas on the predictability of human behavior. The quest for predictable, controllable humans is a fundamentalist quest. When one creates a formula and conveniently tries to make humans fit it, one ignores the very realities of life and human fallibility. Such an approach to risk and safety ignores the essential meaning of risk as uncertainty, and is astounded when people don’t conform: they must lack ‘common sense’.
The social influencing of behavior, judgment and decision making is a fascinating area of research. Research in social psychology by Milgram (Obedience to Authority), Zimbardo (The Stanford Prison Experiment), Rosenhan (On Being Sane in Insane Places), Festinger (Cognitive Dissonance), Bandura (Reciprocal Determinism), Wilde (Risk Homeostasis), Adorno, Frenkel-Brunswik, Levinson and Sanford (The Authoritarian Personality), Hoffer (The True Believer), Cialdini (Influence), Simon (Bounded Rationality) and Bargh (The New Unconscious) demonstrates that humans do not conform or perform to the construct of formulas or mechanistic ideas of control. In the human world, things are far more unpredictable than the mechanistic mindset would wish to believe. Rather than writing off those who don’t fit a formula, the challenge should be to better understand human judgment and decision making. With a better understanding of human decision making we might create working environments that make more sense.
An example of the mechanistic mindset gone mad can be observed in the excesses of systems in safety. Herbert Simon demonstrated in his work (Models of Man) that humans are limited in the amount of content they can process at one time (Bounded Rationality). When humans are overwhelmed by content and process, they simply find new ways of functioning (heuristics, rules of thumb and ‘tick and flick’). This allows the unconscious mind to keep things going while the rational mind shuts down. Yet every time a problem is encountered in the safety industry, we seem to end up with more systems, as if humans were machines conforming to some engineering formula of control. What makes things even more complicated is that the moment we put more systems in place, it becomes too threatening to human emotional security to suggest getting rid of any of them.
The best way for safety to move forward is to reduce rather than expand systems. Human systems need to be comprehensible, adaptable and practical, and need to take into account that the best formula for predicting human behavior is not to get locked into formulas.