Phil LaDuke joins our panel of renowned and highly respected safety writers, including Dr Rob Long, Bill Sims Jr and George Robotham. Everyone has enjoyed his previous articles (HERE). This is Phil’s latest article, provided exclusively to us. It is much more sombre and less provocative than his previous work, but it sends a very powerful message about the importance of open communication.
Jargon Killed the Astronauts
I’m working with a large, faith-based healthcare system with an extremely strong culture. One of the practices this organization uses to continually reinforce its culture is to begin each meeting with a reflection. A reflection is a short, thought-provoking story meant to mentally transition those gathered, helping to center and focus the participants around a particular theme. What follows is a reflection I recently wrote on the importance of open and honest communication AND on accountability.
—Phil
The Cassandra Effect
FYI: The Cassandra effect is when one believes they know a catastrophic event is going to happen, having already seen it in some way, or even experienced it firsthand. However, the person knows there is probably nothing that can be done to stop the event from happening, and that nobody will believe them if they try to tell others.
Years ago I hired a man to teach Failure Modes Effects Analysis (FMEA) to engineers and program managers at the tier-one automotive supplier where I was employed as the Director of Training and Development. FMEA was an area about which I knew little, so I hired a retired engineer who had worked on the space shuttle as a supplier to NASA and was part of the Challenger program. In short, Gene was a rocket scientist. Despite his impressive credentials, Gene was a soft-spoken, unassuming man not given to hyperbole or braggadocio. He had indeed been on the Challenger team (in fact, he was on the team when it exploded, killing all aboard) but seldom if ever talked of it. It wasn’t a subject he eschewed; he just didn’t bring it up. I was acutely interested: how often does one get the opportunity to have a conversation with someone who was not only a first-person witness to history but a participant in a significant historical moment?
The purpose of an FMEA is to identify all the things that could go wrong (either with a design or a manufacturing process) and to identify both the preventive measures and the contingencies for when prevention fails. Perhaps because I lack social skills, or maybe just because curiosity got the best of me, I asked Gene a question: “Why did the FMEA for the Challenger fail to protect the astronauts?” Gene’s mood darkened. He explained to me that there was an FMEA that correctly predicted the disaster. “We knew the O-rings would fail in cold weather,” he told me sadly. Visibly shaken after so many years, he related how the engineers warned NASA leaders that cold weather launches could cause catastrophic system breakdowns. “We had all the facts, all the data, and ultimately we were right. But none of that mattered,” he explained. “It doesn’t matter how much data you have or what you know; none of that means anything unless you are able to persuade the decision makers.”
“Unless you can put together a persuasive argument that compels the decision makers to take action with confidence, all your efforts are wasted.” I hadn’t given Gene’s story much thought until lately. Last week, Challenger engineer Roger Boisjoly died at the age of 73. As I listened to the radio eulogy, I thought of Gene and the lessons he related to me. Roger Boisjoly lived out his life haunted by his failure to persuade.
Safety professionals tend to make the same mistakes (despite not being rocket scientists). Many hard-working and deeply devoted safety professionals who research the causes of workplace injuries are, like Gene and Roger, frustrated by their inability to convince Operations to act. I’ve heard these safety professionals complain about a lack of leadership commitment to safety. Gene never blamed NASA leadership; instead he focused on his and his peers’ inability to communicate. He talked to me about how the engineers spoke one language but the program managers spoke another.
Jargon Killed the Astronauts
Jargon is a language that evolves in a profession chiefly to exclude outsiders. It’s a natural phenomenon: as neophytes enter the field, they adopt the vernacular to demonstrate their belonging. It is an important part of the profession’s identity. Jargon in itself is neither bad nor dangerous, but there is a time and place for it. Obviously, speaking the same language is essential to communication, but it isn’t always obvious when we aren’t speaking the same language.
The difference between data and information
Gene told me that his conversations with the program managers were not information exchanges. Engineers kept presenting data (they talked in probabilities, risks, and failure modes) but it wasn’t framed in ways the program managers understood or could use. As the engineers made more and more dire predictions, none of them spelled out in no uncertain terms that if they launched, the shuttle would explode; no one was willing to guarantee that if the launch took place, disaster was certain. The end result was that the engineers came off as timid worriers with guesses as to what might possibly go wrong. Program leaders were understandably reluctant to scrap a mission and waste millions of dollars based on analysis of data that they neither understood, trusted, nor respected.
Do you have any thoughts? Please share them below