Weick’s understanding of mindfulness

Weick’s understanding of mindfulness and why it is a helpful construct in understanding risk and safety.

By closely examining the characteristics of High Reliability Organisations (HROs), Weick, Sutcliffe and Obstfeld (1999) developed the concept of ‘collective mindfulness’. They articulated the five characteristics of mindfulness as preoccupation with failure, reluctance to simplify interpretations, sensitivity to operations, commitment to resilience and deference to expertise. The collective mindfulness construct is intrinsically helpful in understanding risk and safety because it defines, explains and embeds a process that:

… enacts alertness, broadens attention, reduces distractions, forestalls misleading simplifications … accelerates recovery and facilitates learning (Weick and Sutcliffe, 2007, p. 37).

The argument I will make is that the articulation of each characteristic, and their interdependence, enhances our understanding of risk and safety and that, when the characteristics are collectively described as a construct, they facilitate an organisation’s reliable operations despite exposure to unpredictable events and circumstances. Hansson (2012) and Weick (2007) define risk as ‘an intentional interaction with uncertainty’, and Weick’s mindfulness construct provides a prism through which an organisation can view, and learn from, its interactions with uncertainty. In this way the mindfulness construct is helpful in enhancing understanding of risk and the relationship between risk and safety.

This essay will examine each of the five characteristics of the collective mindfulness construct and provide examples of how they are helpful in improving our understanding of risk and safety. The helpfulness of the construct will also be highlighted in the context of the interdependence of the five characteristics and their enhancement of social learning.

Safety is enhanced and risk is better understood by people and organisations who seek out and encourage the reporting of errors and departures from expected events. Edmondson (1999) observed that nursing units that detected adverse drug events more frequently performed better than units with lower detection rates. She found that although the incidence of mistakes was no higher in the better performing units, their ability to detect mistakes was greater because people were more willing to report and discuss the errors. Such a capacity to find, and the willingness to report, errors (or variations from what was expected) is referred to within Weick’s mindfulness construct as a preoccupation with failure.

Weick’s mindfulness construct treats these variations as opportunities to learn. This is helpful because it guards against the incubation of unexpected events into what Weick and Sutcliffe describe as ‘brutal audits’ (2007, p.35). Also, the act of seeking out small failures is an antidote to hubris or complacency. Weick’s construct acknowledges that:

… the common content thread in cultures that strive to be mindful, informed and safe is that they all focus on wariness (Weick and Sutcliffe, 2007, p.342)

Hudson (1999) also supports the concept of preoccupation with failure, and the need for wariness, in his model for optimal safety culture when he states that,

A characteristic of this [generative] stage is the lack of complacency, even in the face of a dearth of accidents. This has been labelled chronic unease… (Hudson, 1999, p. 132)

The characteristic of preoccupation with failure is helpful because it accepts and embraces uncertainty, frames risk as unavoidable and facilitates learning. Edmondson’s observations illustrate the effect of this characteristic on safety, and Hudson’s observations support Weick’s construct.

Ubiquitous admonitions to ‘Be Safe’, ‘Think Safety’ or ‘Safety First’ have poor utility in helping us to understand, or engage with, risk or safety. Such a common approach is an example of an oversimplification that assumes we simply need to tell or remind each other to be safe. As an alternative, reluctance to simplify, the second characteristic of Weick’s construct, offers a more helpful pathway by encouraging people to interact with uncertainty by asking the question ‘Have you noticed anything out of the ordinary?’ and then praising them for both noticing and sharing observed deviations from what was expected. Weick’s construct asserts that we should ‘raise doubts to raise information’ (2007, p. 177). For example, Weick et al. alert us to the influence of confirmation bias on our expectations, understanding that:

… we actively seek out evidence that confirms our expectations and avoid evidence that disconfirms them (Weick and Sutcliffe, 2007, p. 93)

This is helpful because it enhances our understanding of how we can be tempted to rationalise away the risk posed by unexpected events. It follows therefore, that understanding confirmation bias assists us to be wary of it. It allows us to value, and maximise, the fleeting moment of clarity we experience just after noticing a deviation from expectations. This knowledge is helpful because:

… all of us face an ongoing struggle for alertness because we face an ongoing preference for information that confirms. (Weick and Sutcliffe, 2007, p. 98)

The Cerro Grande fire, which is referenced prominently by Weick and Sutcliffe (2007), originated from a planned, controlled burn but escalated into a brutal audit. The relationship between confirmation bias and oversimplification is illustrated by the observation that, although a test burn was done at the outset, as Weick states:

It is hard to spot signs that burning is unwise when twenty people are standing around ready to start the burn. (Weick and Sutcliffe, 2007, p.65)

and

… as with all projects that are underway, people are prone to interpret new data in ways that confirm their expectations. (Weick and Sutcliffe, 2007, p.65)

Expectations (and hubris) create blind spots, so even though the test burn revealed new data, oversimplification prevailed, with catastrophic results. Understanding confirmation bias, and maintaining a wariness of oversimplification, helps us to learn that blind spots exist, which in turn helps explain why we miss signals that compromise safety.

Sensitivity to operations, the third characteristic of the construct, is achieved by managers who stay in touch with events and people at the front line, by rewarding those who speak up and by encouraging scepticism. Weick and Sutcliffe emphasise the desire to obtain the:

… rich, neglected remainder of information that less mindful actors leave unnoticed and untouched (Weick and Sutcliffe, 2007, p.436).

Their mindfulness construct elevates face to face contact and conversations to the level of the ‘richest source of discriminatory detail’ and observes that:

richness declines as people move from face to face interaction to interaction by telephone, …letters, memos and email. (Weick and Sutcliffe, 2007, p.422).

Weick and Sutcliffe argue that relationships and continuous conversation are the best way to manage risks that systems haven’t anticipated, and that mindless routines pose a threat to the sensitivity required for reliable operations. They illustrate this point by reference to the findings of Eisenhardt (1989), who observed that the better performing organisations in the microcomputer industry conducted frequent operations meetings and facilitated ‘nearly continuous face-to-face interaction’ (1989).

Weick and Sutcliffe (2007) also suggest that the way near misses are treated in an organisation provides insight into the organisation’s sensitivity to operations. For example, if ‘near misses’ or ‘close calls’ are viewed as evidence of safe operations then there is a lack of sensitivity to operations that has an opportunity cost in terms of how the organisation perceives risk. Weick’s mindfulness construct is more helpful because a near miss is embraced as an opportunity to have a conversation that captures some of the otherwise neglected information.

The three components of Weick’s construct considered so far, preoccupation with failure, reluctance to simplify and sensitivity to operations, are anticipatory. They require imagination and are helpful in facilitating alertness, avoiding complacency and capturing opportunities for learning. The remaining two components, commitment to resilience and deference to expertise, acknowledge that mistakes and errors are inevitable and are about containment, resilience and learning.

It is instructive to consider the link between anticipation and containment of unpredicted events in the context of expectations and planning. Mintzberg (1994) developed a concept he called ‘the fallacy of predetermination’. In short, this means that when we make plans, we reinforce our expectations. Weick’s construct accounts for the fact that planning and enactment inadvertently ‘reduce the number of things people notice’ (2007, p.199). Developing plans and procedures assumes that reliability of actions leads to reliable outcomes. The problem with this assumption is that it desensitises people to unexpected events. Weick’s construct recognises that ‘… plans preclude improvisation’, the improvised recombination of resources that Weick calls ‘bricolage’ (2007, p.200). This is helpful because it encourages an organisation to anticipate adverse events, or ‘sense the unexpected in a stable manner’ (2007, p. 201), so that containment can prevent escalation.

The fourth characteristic of Weick’s construct, commitment to resilience, is described as being ‘mindful about errors that have already occurred’ (2007, p.203). The helpfulness of commitment to resilience, which acknowledges the inevitability of fallibility, lies in its interdependence with a safety culture that gives an organisation permission to make mistakes, and in doing so creates a learning culture. Hudson (1999) describes an evolutionary model which, like Weick’s construct, refers to the need to respond to unusual situations. Hudson describes the desirable level of operations as generative and specifically aligns his model with Weick’s construct. Hudson’s generative safety culture is characterised by ‘being informed and trusting’ (1999), characteristics which underpin a commitment to resilience. That is, in order for people and organisations to learn from their mistakes, there must be enough trust to have a conversation about those mistakes. Hudson says of his generative stage,

The model of human behaviour has shifted from one in which workers have to be driven, and are not to be trusted, to a more mature understanding of what makes people tick. (Hudson, 1999)

James Reason also identifies reporting and learning as key characteristics of safety culture, noting that

A reporting culture is about protection of the people that report…to encourage the ‘free lessons’ from knowledge gained from rare incidents, mistakes, near misses…(Reason cited in Weick and Sutcliffe, 2007, p.354).

Weick notes that Schein calls this freedom ‘psychological safety’ (1999, pp.124-126). Its utility or helpfulness is derived from the freedom and permission to learn and the capturing and dissemination of lessons learned. Reason warns that:

… candid reporting of errors takes trust and trustworthiness. (2000, p. 88)

Weick and Sutcliffe’s (2007) example of a seaman on board the aircraft carrier, USS Carl Vinson, who reported the loss of a tool, illustrates how a reporting culture can be encouraged and a climate of trust created. A tool that finds its way onto the deck of a carrier can be sucked into an engine and lead to catastrophic failure. Reporting it resulted in all aircraft aloft being redirected to shore, no doubt causing great disruption and cost. Rather than being punished for being careless, the seaman was lauded for reporting his mistake and publicly commended at a formal ceremony.

The resilience component of Weick’s construct is helpful because it reinforces the inevitability of fallibility and reveals its relationship with trust. This connection in turn, provides for a deeper understanding of safety in the context of the link between trust and safety as expressed by Weick and reinforced by Hudson (1999), Reason (2000) and Schein (1999).

The final pillar of the Weick construct is the concept of deference to expertise. Weick’s definition of expertise is:

… an assemblage of knowledge, experience, learning and intuitions that is seldom embodied in a single individual. (2007, p. 229).

The key to understanding this characteristic of mindfulness is recognising the misleading assumption that authority equates to expertise. To illustrate how deference to expertise works in practice, Weick provides an example, drawn from Roberts, Stout and Halpern (1994), of aircraft carriers where ‘decisions are pushed down to the lowest levels in the carriers as a result of the need for quick decision making’ (2007, p. 220). Rochlin found

that emerging crises on aircraft carriers are often contained by informal networks. … Such networks allow for a rapid pooling of expertise to draw on. … This flexible strategy for crisis intervention enables a system to deal with inevitable uncertainty and imperfect knowledge. (1989, p. 17)

As well as speeding up decision making, the mindful practice of deference to expertise also acts to counter the ‘fallacy of centrality’, an erroneous assumption described by Westrum (1993) that a person in a central position in an organisation knows all that is happening. This assumption can result in people with less authority failing to report their observations because they assume the person in authority already knows, or may not be interested in their opinion. Weick and Sutcliffe cite the example of paediatricians during the early 1960s missing signs of child abuse because they felt that:

… if parents were abusing their children, [they’d] know about it (2007, p.430).

The helpfulness of the characteristic of deference to expertise stems from the understanding that learning results from sharing of information and ideas.

Weick’s mindfulness construct provides organisations with both a rationale and a tool to increase mindfulness. The main message to be derived from the construct is that:

… expectations can get you into trouble unless you create a mindful infrastructure that continually does all of the following: tracks small failures, resists oversimplification, remains sensitive to operations, maintains capabilities for resilience and takes advantage of shifting locations of expertise. (2007, pp. 32-33)

The construct encourages people and organisations to view assumptions about success and failure, planning and strategy, authority and expertise, and relationships and trust through a different prism. It encourages people to be wary of assumptions. The mindfulness construct facilitates a different way of thinking about risk and safety. Schein noted that:

… the most fundamental characteristic of culture is that it is a product of social learning (1997, p. 243).

Weick’s mindfulness construct is most helpful because it facilitates, and indeed depends upon, learning, both for individuals and organisations. Each of the five characteristics described in this essay serves the function of providing opportunities for learning, and it is the learning that best answers the question of why the construct is helpful. Risk, or interaction with uncertainty, is unavoidable. Without mindfulness, opportunities for people and organisations to learn from risk may pass by unnoticed. When learning is compromised, so too is safety.

Bibliography

Edmondson, A. (1999) Psychological Safety and Learning Behavior in Work Teams. Administrative Science Quarterly 44: 350-383.

Eisenhardt, K. (1989) Making Fast Strategic Decisions in High-Velocity Environments. Academy of Management Journal 32: 543-576.

Hansson, S. (2012) Risk. The Stanford Encyclopedia of Philosophy, Edward N. Zalta (ed.). Stanford University.

Hudson, P. (1999) Safety Culture – Theory and Practice. Centre for Safety Science, Universiteit Leiden, The Netherlands.

Landau, M. and Chisholm, D. (1995) The Arrogance of Optimism: Notes on Failure-Avoidance Management. Journal of Contingencies and Crisis Management 3: 67-80.

Mintzberg, H. (1994) The Rise and Fall of Strategic Planning. New York: Free Press.

Reason, J. (2000) Human Error: Models and Management. British Medical Journal 320: 768-770.

Rochlin, G. (1989) Informal Organizational Networking as a Crisis-Avoiding Strategy: U.S. Naval Flight Operations as a Case Study. Industrial Crisis Quarterly 3: 167.

Roberts, K., Stout, S. and Halpern, J. (1994) Decision Dynamics in Two High Reliability Military Organisations. Management Science 40: 622.

Schein, E. (1997) Organisational Culture and Leadership. San Francisco: Jossey-Bass.

Schein, E. (1999) The Corporate Culture Survival Guide. San Francisco: Jossey-Bass.

Weick, K. E., Sutcliffe, K. M. and Obstfeld, D. (1999) Organizing for High Reliability: Processes of Collective Mindfulness. In B. M. Staw and L. L. Cummings (eds), Research in Organizational Behavior, Vol. 21, pp. 81-123. Greenwich, CT: JAI Press.

Weick, K. and Sutcliffe, K. (2007) Managing the Unexpected: Resilient Performance in an Age of Uncertainty, 2nd edition. John Wiley & Sons.

Westrum, R. (1993) Cultures with Requisite Imagination. In Verification and Validation of Complex Systems: Human Factors Issues. London: Springer-Verlag.

Barry Spud