April 29, 2016

Cognitive Bias in Terminal Two

I was on the way home from Houston, after several days of meetings with clients about decision making, cognitive bias and the prevention of serious and fatal injuries. I was in the terminal walking toward the gate.

First I saw the kid, wild-eyed, head jerking back and forth, looking down toward the gate and then back up the long airport corridor. Panicky-looking. Then I saw a white-haired man in his fifties running toward him. He moved like he hadn't done much running for a long time. Stiff joints, trying hard to move as fast as he could but moving slowly. Harried-looking, tense, in emergency mode. As the father moved closer, the kid said breathlessly, "Get on down there!" pointing to the gate behind him, "They're holding it," and then, "Where's mom?" The father didn't miss a step, shook his head and kept on going with his eyes focused straight ahead.

The boy crouched into a half-squat like a martial artist, looked back at the gate, then up the corridor, couldn't decide for a moment, then put his head down and took off running, away from the gate, like he was in competition to break a record. A straight shot down the corridor, fast; the kid had wheels. At that point I saw the mother, way down near the end of the corridor, moving toward the kid.

She had a bag of some kind tucked under one arm and her other arm was extended forward like a halfback running with a football. She was running, weaving, zigging and zagging, her loosely fitting muumuu flying to one side and the other, a bit immodestly. She was red-faced and sweating, and modesty was just not an issue. She was on a mission. She was not going to miss that flight.

When the boy got about ten feet from the mother he spun around and headed back, running interference for mom, who could now just put her head down and go all out behind her son. Dad was standing at the gate by now, waving at them both to keep on coming. This was a family united, purposeful, together. It was touching. I was rooting for them, especially the son, who I saw as the hero. For sure he was the one holding things together. He was gonna see that they caught that airplane, all of them.

I remembered a time many years ago when my family was visiting Paris. We couldn't find a cab, and our teenage son spotted one across a very busy street. Without saying anything he streaked across the busy Parisian boulevard and nabbed the cab. His sisters thought he was a hero; so did his mother and I. But he had put himself at a risk disproportionate to the gain.

What struck me was that in both cases the action taken was purposeful. It meant something to the families involved. They were united in that moment. It was a good thing. But they were also taking on more risk than the situation deserved, operating out of an odd kind of biased decision making. It was an automatic response, not a considered one, and it put them in harm's way. Most of the time the ending would be happy, but some of the time it would be tragic. Actually life-threatening, putting at stake the very thing that made the action heroic.

Had anyone thought about it, they would have realized that the worst possible outcome, if they did nothing, was waiting a short time. An inconvenience, but a minor one. Getting the calendar on your laptop to work properly could take longer. But we all acted like it was life and death. The real life-and-death issue was the risk of colliding with someone in the corridor, or being hit by a car, or falling, or having a heart attack. But somehow the situation looked and felt much more serious than it actually was.

In both of these cases the people involved were acting on decisions influenced by cognitive bias. Somehow the process of initiating action went into emergency mode when what was called for was a simple calculation and a small adjustment.

We see this kind of miscalculation in organizations frequently. In healthcare, the nurse dives across the room to stop a patient from falling. Maybe she stops the fall, and maybe there are now two falls. Or a fall and a strained back. Pausing to see what was needed to move the patient safely would have been better. A decision higher up to ensure that all the elements needed for safe patient handling were in place would have been better yet.

We are naturally wired for this mistake; it comes honestly to good people. But the overall effect is not good for safety. How do you address it?

First, design the work in such a way that there are few or no surprises. Look for the traps that may be lurking and design them out. Ensure that pre-job planning happens, so people get a preview of potential risks.

Second, be wary of any "emergency response" to a situation that may not be an emergency. Missing a turnoff, adding a step to the expected procedure, having to gather additional information: all are examples of situations where the inappropriate emergency response is a temptation, but not a requirement.

Lastly, teach your people how cognitive bias works. It will build awareness, a common vocabulary, and a new sensitivity to the issue.
