September 26, 2016

Science of Decision Making in Safety: Stating the Obvious

You are the maintenance manager at a recycling plant.  You have 30 work orders outstanding and you're already over your budget for the month, but a request comes in to repair an important piece of equipment.  The worker assigned to evaluate the problem says it will cost $1,000 to repair the machine. He can keep it running for a while, but the blade needs to be replaced.  You realize that you are about to make an important decision that has safety, quality, and production impacts. What you may not realize is that it matters how you frame your options.

You could frame the issue as a simple yes/no decision: “Should I spend $1,000 to fix this issue now?” You could frame it as a choice: “Should I spend $1,000 to fix it now, or $5,000 to fix it later?” Or you could frame it as a choice in which you state the downsides: “Should I spend $1,000 to fix it now, $0 for the next year and no risk of starting a fire, or should I spend $0 now, pay $5,000 in a year, and accept some risk of starting a fire?”
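The third framing can also be made concrete with numbers. As a rough sketch (the $1,000 and $5,000 figures come from the scenario above; the fire probability and fire cost are invented here purely for illustration), an expected-cost comparison looks like this:

```python
# Hypothetical expected-cost comparison for the repair decision above.
# The repair figures come from the scenario; the fire probability and
# fire cost are ASSUMED values for illustration only.

REPAIR_NOW = 1_000      # cost of replacing the blade today (from the scenario)
REPAIR_LATER = 5_000    # cost of the deferred repair (from the scenario)
P_FIRE = 0.02           # assumed probability of a blade-related fire if deferred
FIRE_COST = 250_000     # assumed cost of a fire (damage, downtime, injury)

def expected_cost(repair_spend, p_fire, fire_cost):
    """Expected cost of an option: known spend plus risk-weighted fire loss."""
    return repair_spend + p_fire * fire_cost

fix_now = expected_cost(REPAIR_NOW, 0.0, FIRE_COST)     # no fire risk if fixed now
defer = expected_cost(REPAIR_LATER, P_FIRE, FIRE_COST)  # accept some fire risk

print(f"Fix now: ${fix_now:,.0f}")   # $1,000
print(f"Defer:   ${defer:,.0f}")     # $10,000 under these assumed numbers
```

Stating the downsides explicitly is, in effect, forcing yourself to fill in the risk terms that the simple yes/no framing leaves at an implicit zero.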

Research by Daniel Kahneman and others spanning four decades confirms that framing influences choices, even when logically equivalent options are proposed. We recently came across a 2008 study by Magen, Dweck, and Gross, who found that framing choices in a way that makes the ''obvious'' downsides of alternatives explicit rather than implicit may help decision makers choose in a more informed, more thoughtful, and less impulsive manner. Such rationality is good for safety-related decisions.

How does stating the downsides encourage safe decisions? When a choice is framed as a simple yes/no option, it assumes that the implications of that choice are known and understood, and it neglects what won't happen as a result. Magen, Dweck, and Gross called this the "hidden zero effect." We call it "stating the obvious."  When you take the time to spell out those things that won't happen, we suspect it engages what Kahneman referred to as System 2 thinking. In contrast to the intuitive, impulsive, fast, and automatic System 1, System 2 is more deliberate, rational, slower, and better able to delay gratification. Safety-related decisions benefit from System 2's rational judgment. Whatever the explanation, Magen and colleagues showed that framing decisions as a sequence of options with explicit statements about what will not happen is effective when you want to encourage rationality.

An Exercise to Try

Write down three decisions you made today.  First state them as a simple choice.  Then re-state them in a way that identifies the obvious downsides within each option.  Notice how changing the frame in this way affects your choice. Here are examples from a recent flight.

  1. Sitting in the airport lounge, Kristen faced a decision: leave for the gate now or later? Reframed with the 'obvious downsides': Leave for the gate now and miss out on 5 minutes in this airport lounge? Or leave for the gate 5 minutes from now and risk having to run to make the boarding call? (Whereas I was initially reluctant to move, reframing the options made the more conservative choice much more attractive.)
  2. Getting ready for take-off: A client taught us about reducing risk on airplanes. He suggested always having shoes on and ID and cell phone on one's person for take-off and landing, because that is when a problem requiring evacuation is most likely to arise.  It has stuck with us for over 10 years, and the decision presented itself once again: Do I remove my shoes now or later?  Reframed with the 'obvious downsides': Should I take off my shoes now and risk being shoe-less during an emergency? Or should I take my shoes off after we are safely in the air and suffer these hot, uncomfortable shoes for 15 more minutes?  (The risk of finding myself shoe-less during an emergency was too remote to outweigh the desire for immediate comfort, but I did choose a 3rd option: I removed my shoes and kept them close by.)
  3. Ordering dinner on the airplane: Should I have the full meal or the "express light meal"?  Reframed with the 'obvious downsides': Should I have the full meal and not be able to do anything else for an hour, or should I have the express light meal and risk getting hungry in the middle of the flight? (I chose the former, which I would not have done without reflecting on the decision.)

At Krause Bell Group, we partner with multinational corporations, researchers, and subject-matter experts to develop practical, data-driven solutions that improve safety in the workplace.  If you are interested in becoming a part of an exciting collaboration, contact us at +1-888-859-2661.


Kahneman, Daniel; Slovic, Paul; & Tversky, Amos (1982).  Judgment under uncertainty: Heuristics and biases. New York, NY: Cambridge University Press.

Kahneman, Daniel (2011). Thinking, fast and slow. New York, NY: Farrar, Straus and Giroux.

Magen, Eran; Dweck, Carol; & Gross, James (2008). The hidden-zero effect: Representing a single choice as an extended sequence reduces impulsive choice. Psychological Science, 19 (7), 648-649.
