Decision Making for Safety: Part 2
Organizational Improvement Strategies
Introduction
In the first part of this series, we shared findings from our research on organizational decision making. We are learning more and more about the opportunities leaders have to prevent serious injuries and fatalities (SIFs). This article explores some of the most significant situations and types of decisions where these opportunities arise. It also presents a new perspective on incident causation with important implications for improvement strategies aimed at SIF prevention.
Let’s examine one of the publicly available cases from the study of over 600 decisions leading up to 60 life-altering and fatal events.
In 2010, a metal dust explosion inside a metal blender caused three fatalities and one serious injury.[i] One of the blender blades had struck the side of the blender, causing a spark that ignited the extremely explosive metal dust. The investigation found that the blender had been repaired multiple times. When a blade continued to function improperly, the decision was made to replace it with a used blade from another blender.
On its face, it might seem completely reasonable to point to the blade decision as a cause of the incident. However, the investigation revealed a chain of decisions, made years before the explosion, that set the stage for that final, fatal decision.
The Network Effect
When we think about the sheer number of decisions leaders make, the task of improving them can seem daunting. The task is simplified greatly by Safe Decision Network™ analysis. Because individual decisions are part of a network of decisions, improvement in one part of the network impacts others. We call this the “Network Effect.” Each day we discover new examples of the Network Effect leading to powerful improvement strategies. Following are some examples from the Metal Dust Explosion case mentioned above. Notice how risk accumulated across all these decisions:
Executive Level
- Decision to acquire the metal recycling facility, which created new risk along with new production capacity, leading to numerous decisions aimed at understanding that risk and designing mitigations, and to missed opportunities to verify the risk was controlled as intended.
- Decision to operate the facility 24/7, leading to decisions about how to keep equipment up and running.
Site Level
- Decision to manage the risk of metal dust accumulation through housekeeping, leading to numerous decisions about how and when (or when not) to verify that standards were being followed.
- Decision to manage equipment risks through maintenance, leading to numerous decisions about how and when to do that, and to missed opportunities to recognize that the risk was not controlled.
Front-Line Level
- Decision to request immediate maintenance in an effort to keep the facility in operation.
- Decision to install a used blade, leading to ignition of material in the blender, a small explosion, and a catastrophic explosion fueled by accumulated metal dust.
Figure 1: A simple network diagram illustrating connections between the above decisions.
These situations and errors combine in powerful ways. For example, these data suggest that when we conduct a safety audit, it matters greatly who we invite to participate. When we have spotted a significant risk and are trying to find a way to protect people, it matters greatly what information we use to frame the problem. When we plan hazardous work for the day, it matters with whom we communicate, how, and what we focus on in those conversations. If leaders can learn to recognize which of their decisions are safety-sensitive, and in which situations they seem to matter most, we may be able to start with a relatively small number of decisions that have the greatest impact on safety. As these decisions improve, so will others.
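To make the Network Effect concrete, here is a minimal sketch, in Python, of how a decision network like the one in Figure 1 could be represented and queried. The decision names, the edges between them, and the idea of ranking decisions by downstream reach are assumptions drawn loosely from the case summary above; this is an illustration, not the actual Safe Decision Network™ analysis.

```python
# Minimal sketch: decisions as nodes in a directed graph whose edges mean
# "this decision shaped that one." Names and edges are illustrative only,
# taken loosely from the metal dust explosion case summarized above.

from collections import deque

# decision -> decisions it directly influenced (assumed edges)
influences = {
    "acquire metal recycling facility": ["operate 24/7", "manage dust via housekeeping"],
    "operate 24/7": ["manage equipment via maintenance"],
    "manage dust via housekeeping": ["request immediate maintenance"],
    "manage equipment via maintenance": ["request immediate maintenance"],
    "request immediate maintenance": ["install used blade"],
    "install used blade": [],
}

def downstream(decision: str) -> set[str]:
    """Return every decision reachable from `decision` (its sphere of influence)."""
    seen, queue = set(), deque([decision])
    while queue:
        for nxt in influences.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Rank decisions by how many later decisions they touch: a rough proxy for
# where an improvement would produce the largest Network Effect.
for d in sorted(influences, key=lambda d: len(downstream(d)), reverse=True):
    print(f"{d}: influences {len(downstream(d))} downstream decision(s)")
```

In this toy model, the executive-level acquisition decision reaches every later decision, which mirrors the point above: improving a small number of upstream decisions improves many others.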
What Is Cognitive Bias and What Role Did It Play?
Cognitive bias is a systematic deviation from rational thought.[ii] The study found evidence of bias in 85% of the decisions. Here are some examples:
- Overconfidence in one’s knowledge: “I know everything I need to know about this problem.”
- Overconfidence in the team: “We don’t need Paul’s expertise … Peter knows what he’s doing.”
- Preservation of the status quo: “We’ve managed this way for years, we can make it one more.”
Advances in psychology, biology, and technology have led to breakthroughs in our understanding of bias and where it comes from. For example, we know that our brains are structured in ways that trade accuracy for speed, especially in stressful situations. We have also identified specific areas of the brain where bias is generated. This means that all humans, including the greatest leaders, are biologically predisposed to biases that can derail their decisions.
Fortunately, bias-producing situations can be anticipated, and we can learn to identify and reduce their negative impact. A better understanding of cognitive bias, combined with practical tools, will help leaders make better, safer decisions.
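As one illustration of what such practical tools might look like, here is a minimal sketch of a pre-decision bias checklist. The prompts and format are assumptions for illustration only; they mirror the three bias examples above and are not a validated tool from the study.

```python
# Minimal sketch of a pre-decision bias checklist. The prompts below are
# illustrative and mirror the bias examples above (overconfidence in one's
# knowledge, overconfidence in the team, status-quo preservation).

BIAS_PROMPTS = {
    "overconfidence (knowledge)": "What do we NOT know about this problem, and who could tell us?",
    "overconfidence (team)": "Whose expertise are we assuming we do not need?",
    "status quo": "If we were starting fresh today, would we still choose the current approach?",
}

def run_checklist(decision: str) -> dict[str, str]:
    """Walk through each prompt before committing to a decision and record the answers."""
    answers = {}
    print(f"Decision under review: {decision}\n")
    for bias, prompt in BIAS_PROMPTS.items():
        answers[bias] = input(f"[{bias}] {prompt}\n> ")
    return answers

if __name__ == "__main__":
    run_checklist("Replace the damaged blender blade with a used blade")
```

Whether on paper or on a screen, the value is the same: forcing a pause on the specific questions a fast-thinking, biased brain tends to skip.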
The New View: Leadership Decision Making is a Critical Component of Safety Improvement.
All of this tells us that leadership decision making is a critical component of safety improvement. All kinds of leadership decisions affect the amount of risk and the ways it is managed. What’s more, most decisions influence the assumptions, values, and beliefs of the organization (the safety culture). If you are familiar with the first insight from our book 7 Insights into Safety Leadership, you know that improving decision-making capability in safety also facilitates business improvement in general.
Given what we know today about how to build safety leadership capability in an organization, combined with what we know about decision making leading to SIFs, we recommend:
- Start with individual leaders, as high in the organization as is practical.
- Strategically select a set of decisions and situations to focus on first, based on the decisions’ impacts on serious injury or fatality potential.
- Assess individual leaders’ decision-making capability and build skills as needed.
- Teach leaders more about the most common forms of cognitive bias and create checklists that can be used to help avoid them during decision making.
- Observe decisions as they are made, understanding both the decision process and the context. In especially critical strategic decision-making sessions, create a meeting role for an observer who watches for common bias pitfalls, and debrief after the decision is made to confirm that cognitive biases were avoided.
- Provide feedback to the leader on the decision process; provide feedback to the organization on the context.
- Engage your senior-most leadership team in the effort of understanding and improving the context for decision making.
References
[i] U.S. Chemical Safety and Hazard Investigation Board. (2014, July 16). AL Solutions Fatal Dust Explosion Report. Retrieved from CSB.gov: http://www.csb.gov/al-solutions-fatal-dust-explosion/
[ii] Kahneman, D. (2011). Thinking, fast and slow. New York, NY: Farrar, Straus and Giroux.
Originally Published in 2016; Updated May 23, 2023
Great article and on point. This point is especially important on multi-employer construction sites. For example, the GC decides to save money by installing fewer temporary power poles on site, forcing field employees to use longer cords. The contractor decides to have field employees use saw blades longer to save money. Then saw lacerations go up, and everyone wants to train employees on proper saw use, never looking at the fact that the voltage was dropping so badly that it caused the saw to bind, and the dull blade compounded the bad power, which caused the cut. No one was trained on what happens when electrical power is too low or too high, or how to check it. If low power and overused blades can cause well-maintained equipment to fail, what can they do to poorly maintained equipment? They increase the chance of injury. Everything has cause and effect.
Fantastic article; it captures a topic that needs to be communicated and better understood by leaders. Well done!
I have also had the opportunity to be involved in a few safety programs that focus on leadership behaviour and communications. The leaders who talked daily with their teams about safety, listened to feedback, and took action on safety concerns appeared to have fewer serious injuries on their teams.
Kristen,
Great insights! I like taking the “whys” past the first five. I don’t think cognitive biases have been included in very many root cause analyses. This fundamental systemic change has tremendous potential for a step change in the reduction of SIFs. Measuring leading indicators relative to bias would be an interesting next step.
I look forward to exploring this more.
Brava, well written and referenced. Kahneman’s insight into our problem solving is a great place to start the inquiry into why we do what we do when it comes to safety decision making. The obvious and the adjacent. “If a baseball bat and a ball cost a total of $1.10, and the bat costs $1 more than the ball, then how much does the ball cost?” Our reflexive, fast mode of thinking kicks in and the answer is $0.10 for the ball. It is the easy answer. After going through the math, it becomes clear that it is also the wrong answer. Whether on a ship, farm, or factory, just because you survived the last shortcut doesn’t mean you’ll be as lucky the next time around. ActaNonVerba
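For completeness, here is the arithmetic behind Kahneman’s puzzle, with x standing for the ball’s price in dollars:

x + (x + 1.00) = 1.10
2x = 0.10
x = 0.05

So the ball costs $0.05 and the bat $1.05, not the reflexive $0.10 and $1.00.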