An interesting article by Sidney Dekker, first published by SafetyDifferently
THE ORIGINAL HEARTS AND MINDS CAMPAIGN, AND THE DERELICTION OF BEHAVIOR-BASED SAFETY
In 1960, shortly after his election, President Kennedy asked Robert McNamara to become secretary of defense in his new cabinet. McNamara, known as a star and a whiz-kid, had been president of the Ford Motor Company for all of five weeks, so it took a bit of cajoling. But he eventually joined the administration in 1961, taking with him the modernism of Ford’s production lines. A few years into his tenure, with Vietnam taking up ever more resources and political airtime, McNamara wanted to know from his top generals how to measure progress in the war. He told General Westmoreland that he wanted to see a graph that would tell the defense secretary whether they were winning or losing (McMaster, 1997). Westmoreland did as he was asked, although he produced two graphs:
One graph showed the enemy body count. Under pressure to show progress (and knowing that political fortunes of their masters, promotions for themselves and their comrades, decorations, rest- and recreation decisions, and resourcing all depended on it), those who did the accounting made sure that not a single dead enemy body was missed. Soon, the lines between soldiers and civilians had blurred completely: all dead bodies became enemy personnel. Implausibly, the total number of enemy dead soon exceeded the known strength of the Viet Cong and the North Vietnamese Army combined. Civilian casualties mounted, the frustrations and incentives even leading to some massacres. In the field, of course, the ‘enemy’ was nowhere near all dead, and certainly not defeated.
The other graph showed a measure of civilian sympathies for the US and against communism. It tracked the effects of the so-called Winning Hearts And Minds campaign (or WHAM), which had divvied Vietnam up into 12,000 hamlets, each of which was categorized into ‘pacified,’ ‘contested,’ or ‘hostile.’ Pressure to show McNamara progress here was relentless too. Militias on the side of the Americans were invented on paper. Incidents of insurgent activity or hostile takeovers of hamlets were ignored. In an ambiguous, messy and protracted war, it wasn’t difficult to skew numbers in favour of making the graph look good. It soon seemed that the entire countryside had become pacified.
The progress charts demanded by McNamara had produced a monstrous auditing system (Scott, 2012). It erased all meaningful difference and distinction: a dead body was a dead body. It could be counted, and that was all that counted. And a pacified hamlet was a pacified hamlet, with all the cross-currents, fluidities and complexities of a shredded social order collapsed into a single number on a chart. McNamara’s system may well have played its own small part in prolonging the war and stifling meaningful, rational discourse about its merits and demerits. The political backdrop, as painted for instance by McMaster in Dereliction of Duty (1997), was one of civilian leaders obsessed with reputation, who had leaned further into military operational matters than turned out to be healthy.
We can safely say that what gets measured, gets manipulated. Do company leaders and board members—concerned with worker safety but also their own reputation and liability—who promote a hearts and minds campaign for safety, even know this ugly history? Today, ‘hearts and minds’ is an often-used cover for behavioral safety interventions. Behavioral safety programs tend to point the finger away from leadership. Behavior-based safety, after all, allows leaders to say that safety problems are created by all those other people, that those other people, the workers, are the problem. Even when the workers have been given everything to do the right thing, their behaviors still get the organization in trouble. Targeting the worker conveniently means that the manager, director or board is not the target.
Without any serious thought, questioning or critique, those promoting behavioral safety use a version of Heinrich’s highly dubious ‘finding’ that 88% of occurrences are caused by ‘man failure’ (Heinrich, Petersen, & Roos, 1980) or human error:
The popularity of this approach stems in part from the widely held view that ‘human factors’ are the cause of the great majority of accidents … As the general manager of Dupont Australia once said, ‘In our experience, 95 per cent of accidents occur because of the acts of people. They do something they’re not supposed to do and are trained not to do, but they do it anyway’ (Hopkins, 2006, p. 585).
Behavior-based safety interventions typically center on observing behaviors and giving feedback to those performing them. The four basic steps of a typical behavior-based program are:
- Define the correct behaviors that eliminate unsafe acts and injuries;
- Train all personnel in these behaviors;
- Measure that personnel are indeed behaving correctly;
- Reward worker compliance with these correct behaviors.
Hearts and minds programs offer a variety of ways to achieve these steps. Some involve penalties, demerits or disincentives for undesirable behaviors. Many involve surveillance of worker behavior, either by people or by technology (e.g. cameras, but also computerized monitoring systems installed on equipment such as vehicles). Others involve coaching and guidance. All assume that there is one best way to do a job, and that straying from that one best way can be bad for safety. Today, many programs are also supportive and training-oriented. Some deliberately pursue worker commitment to injury- and incident-free behavior, akin to achieving a religious conversion. Yet overall, “its emphasis is undeniably on behavior modification and that is how it is understood by many of its advocates as well as its critics” (Hopkins, 2006, p. 585). Indeed, the focus on behavior modification and on converting hearts and minds is obvious from the way behavior-based safety programs are promoted:
Reinforcement occurs when a consequence that follows a behaviour makes it more likely that the behaviour will occur again in the future … For example, a toolbox talk addressing correct manual handling techniques might result in correct techniques on the day of the talk; however, over time employees will revert to old practices. This is because nothing has occurred after their correct behaviour to indicate that it is correct, or that it has benefitted the individual or the organisation to be so safety-conscious (HSA, 2013, p. 7).
Does any of it actually work? Without offering any studies or results, the U.S. Department of Energy boasted a few years ago that “Intensifying the Behavior Based Safety (BBS) observation cycle will often prevent an injury or accident” (DOE, 2002, p. 7). It continued that:
BBS is a process that provides organizations the opportunity to move to a higher level of safety excellence by promoting proactive responding to leading indicators that are statistically valid, building ownership, trust, and unity across the team, and developing empowerment opportunities which relate to employee safety. Secondly, but equally as important to the organizational culture, BBS provides line management the opportunity to prove and demonstrate their core values on the production floor (p. 9).
Where does the strident language come from? Achieving “higher levels of safety excellence” would need to be demonstrated if the DOE really wanted its people (who include many nuclear scientists and physicists) to be convinced. But it offers no such demonstration anywhere. It doesn’t even bother to define a “higher level of safety excellence.” Just saying it doesn’t make it so, even if we could agree on what a higher level of safety excellence actually means (other than tedious, tiresome management-speak). Assertion, of course, is not evidence.
Studies that seem to show the success of behavioral interventions are exclusively published by those who have an active economic stake in promoting the practice (Geller, 2001; Krause & Seymour, 1999). And none of these studies meet even the most rudimentary criteria of scientific quality: that of a control condition so that the results of the intervention can be contrasted against a part of the organization where the intervention wasn’t done, or done differently.
The safety-scientific literature still offers no objective, reliable, valid empirical study or data to prove the efficacy of moral-behavioral safety interventions. Put simply: there’s no evidence that they work. And this should not be surprising. Even Frank Bird and Heinrich remind us that the conditions under which worker behaviors emerge are to be traced back to the organization. Fatigue, pressure and insufficient knowledge, for example, come from somewhere. They are consequences of organizational trade-offs and decisions, rather than causes of trouble brought to the workplace by frontline operators. Organizational conditions set the stage for these behaviors; they enable and even invite them. Think of Bird’s ‘pre-contact control’ (Bird & Germain, 1985). Is the organization doing enough of that? Think of, for example:
- Adequately resourcing people’s work with appropriate knowledge and tools;
- Recognizing and reducing goal conflicts;
- Instituting safety-by-design.
And most hideously, behavior-based safety has done nothing to reduce the risk of fatalities. Workers had to strictly follow driving and walking regulations at a Texas chemical plant site, yet four of them died in a toxic gas release in a building on that same site, two of them brothers (Hlavaty, Hassan, & Norris, 2014). Their deaths had nothing to do with any of the purported ‘safety behaviors’ on the signs and posters around the site. And in a supreme and tragic irony, workers at a copper mine in Indonesia were taking part in a compulsory behavioral safety course in an underground training facility when the roof of the tunnel in which they were gathered collapsed, killing 28 miners and injuring 10 (Santhebennur, 2013).
Behavioral safety practices can even contribute to a climate where the safety conversation dries up, and thus allows fatality and other risks to build up in a way that is unrecognized by the organization:
A behavior-based approach blames workers themselves for job injuries and illnesses, and drives both injury reporting and hazard reporting underground. If injuries aren’t reported, the hazards contributing to those injuries go unidentified and unaddressed. Injured workers may not get the care they need, and medical costs get shifted from workers compensation (paid for by employers) to workers’ health insurance (where workers can get saddled with increased costs). In addition, if a worker is trained to observe and identify fellow workers’ ‘unsafe acts,’ he or she will report ‘you’re not lifting properly’ rather than ‘the job needs to be redesigned.’ (Frederick & Lessin, 2000, p. 5)
If you want to change behavior, don’t target behavior. Target the conditions under which it takes place. Those conditions are not likely the worker’s responsibility. Or only in part. Think for a moment about whose responsibility the creation of those conditions is in your organization. And then take your aim there.
In the end, no indictment of ‘hearts and minds’ is perhaps as powerful as the title of McMaster’s book: “Dereliction of Duty.” Leaders—in any organization—who surrender to a worldview where the little guy is the problem, and where the little guy needs a change of heart and mind, are derelict in their duty.
References
Bird, F. E., & Germain, G. L. (1985). Practical loss control leadership. Loganville, GA: International Loss Control Institute.
DOE. (2002). The Department of Energy Behavior Based Safety Process: Volume 1, Summary of Behavior Based Safety (DOE Handbook 11/05/02). Washington, DC: Department of Energy.
Frederick, J., & Lessin, N. (2000). The rise of behavioural-based safety programmes. Multinational Monitor, 21, 11-17.
Geller, E. S. (2001). Working safe: How to help people actively care for health and safety. Boca Raton, FL: CRC Press.
Heinrich, H. W., Petersen, D., & Roos, N. (1980). Industrial accident prevention (5th edition). New York: McGraw-Hill Book Company.
Hlavaty, C., Hassan, A., & Norris, M. (2014, November 16). Investigation begins into 4 workers’ deaths at La Porte plant. Houston Chronicle, 1-4.
Hopkins, A. (2006). What are we to make of safe behaviour programs? Safety Science, 44, 583-597.
HSA. (2013). Behavior Based Safety Guide. Dublin: Health and Safety Authority.
Krause, T. R., & Seymour, K. J. (1999). Long-term evaluation of a behavior based method for improving safety performance: A meta-analysis of 73 interrupted time-series replications. Safety Science, 32, 1-18.
McMaster, H. R. (1997). Dereliction of duty: Lyndon Johnson, Robert McNamara, the Joint Chiefs of Staff, and the lies that led to Vietnam. New York: Harper Perennial.
Santhebennur, M. (2013). Picking up the pieces: Indonesian mine collapse. Australian Mining, 25, 4-5.