Making Good Decisions in a Sea of Statistics
In a previous life, I was a statistician. I used to be able to tell you how many ways you could win any given poker hand, hand-calculate complicated multivariate statistical equations using matrix algebra, and explain the difference between an autoregressive integrated moving average and an autoregressive fractionally integrated moving average. But in all those years of studying statistics, I stayed far, far away from epidemiological statistics. Why? Because they were too hard. Epidemiologists have to work with limited data, from biased samples, taken from flawed reporting systems, that shift and change over time: Impossible!
Now, suddenly, every scientist, writer, scholar, businessperson, and consumer who reads news on their iPhone has been thrown into a sea of epidemiological statistics. Every day we face a barrage of forecasts ranging from “Ten million people could die of Covid-19” and “60% of our population could be infected” to “Coronavirus will disappear by April.”
Making sense of wide-ranging predictions is critically important to organizational leaders who face decisions like, “Do we have sufficient safeguards in place to protect our employees from Covid-19 infection?” “Should we close down our plant?” “When is it safe to re-open our office buildings?” “How long will we need to keep the current precautions in place?”
These important decisions depend directly on the quality of information available to the leaders making them. Quality in this case means the amount of noise in the data, the validity of the assumptions made while analyzing it, the skill with which the resulting information is organized and presented, and the extent to which the conclusions are justified, given that whole process.
To all you organizational leaders out there drowning in the sea of coronavirus statistics, my intent in writing this article is to throw you a life ring: something (five things, actually) to grab onto to keep you afloat until you find yourself once again on solid ground.
Five Tips for Effective Decision Making
- Study the evidence behind the claims. It takes time. It isn’t easy. But you need to know what’s behind the claims. Credible authors cite their sources. Unfortunately, journalists don’t always report studies accurately, whether through lack of knowledge or an intent to add emphasis of their own. You (or someone in your organization) should get the study and read it yourself. After you’ve read a few, you will learn to recognize the assumptions and begin to decide for yourself whether those assumptions are valid. As the claims become more extreme, your standards should increase accordingly: Be prepared to dig into the evidence even further.
- Watch for selection bias. Whenever you see a Covid-19 statistic, ask yourself: Who is included in this number and who is not? What makes that group of people similar to and different from the people I am extrapolating to? What time period is represented, and how is it different from the time period I am concerned about? As you read Covid-19 articles, watch for blind assumptions about how China’s statistics will generalize to the US, for example. Watch for assumptions that estimates coming from early data will stay the same over time (remember, epidemiological data change dramatically over time). A credible study will discuss these issues, state the authors’ assumptions, and allow you to judge their validity for yourself.
- Look closely at denominators. Did you know there are multiple ways to calculate a death rate? The differences are huge! On February 29, 2020 in Washington State, the Covid-19 death rate was 33%. How could that be, when the death rates reported for China had varied from 0.5% to 5%? Three people were known to have Covid-19, and one had died. The words “known to have” in that last sentence are critical: at that point in Washington, testing had not begun and only three cases had been identified. Oftentimes, the death rate is expressed as the # of people who died per person tested. Other times, you’ll see it expressed per person infected (where the # of people infected is itself an estimate). I’ve even seen it expressed per population (which epidemiologists refer to as a crude death rate). Crude indeed. The lesson? Look at the denominators.
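The denominator effect is easy to see with a few lines of arithmetic. In this sketch, the death and confirmed-case counts echo the Washington State example above; the estimated-infection and population figures are hypothetical round numbers I made up purely for illustration:

```python
# Three different "death rates" from the same single death,
# depending on the denominator chosen.
deaths = 1
confirmed_cases = 3          # people known (confirmed) to have Covid-19
estimated_infections = 200   # hypothetical estimate of true infections
population = 7_600_000       # rough Washington State population (illustrative)

rate_per_confirmed = deaths / confirmed_cases        # deaths per confirmed case
rate_per_infected = deaths / estimated_infections    # deaths per estimated infection
crude_rate = deaths / population                     # crude death rate

print(f"per confirmed case:      {rate_per_confirmed:.1%}")   # about 33%
print(f"per estimated infection: {rate_per_infected:.1%}")    # 0.5%
print(f"crude (per population):  {crude_rate:.5%}")           # vanishingly small
```

Same numerator, three wildly different headlines. Whenever you see a death rate quoted, the first question to ask is which of these denominators produced it.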
- Treat uncertainty as information. A great practice for decision makers, when gathering any kind of information, is to ask, “How confident are you in your estimate?” In statistics, there’s a method for answering that question called a confidence interval: Think of it as the range of plausible outcomes.
We grab onto single estimates, like 3.5% Covid-19 deaths per confirmed case, and assume they are meaningful. But wouldn’t your view change if someone told you with 80% confidence that the range is between 0.5% and 5%? You’d know immediately that the information is poor, and you should not rely heavily on any single point estimate such as 3.5%. Or if someone said the range of values in the denominator (the # of confirmed cases, in the last example) varied by a factor of 20, you should be skeptical about the usefulness of any extrapolation, especially without explicit acknowledgement of the issue and a rationale for any given use.
Uncertainty gets amplified when two estimates are multiplied together. The wild forecasts I mentioned at the beginning (“10 million people could die …”) all rely on formulas that combine several other estimates, and each estimate carries its own degree of uncertainty. If someone is making extraordinary claims about the future while expressing any degree of certainty about the outcome, you’d be right to question it.
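To see how that amplification works, here is a small sketch that multiplies two uncertain estimates together. All of the ranges are hypothetical numbers chosen for demonstration, not real epidemiological estimates:

```python
# Hypothetical illustration: multiplying two uncertain estimates
# produces a far more uncertain result.
population = 330_000_000           # rough US population

# Each estimate is a (low, high) plausible range, not a single number.
infection_rate = (0.05, 0.60)      # fraction of population infected (made up)
fatality_rate = (0.005, 0.05)      # deaths per infection (made up)

# Multiplying the ranges endpoint-by-endpoint gives the combined range.
low = population * infection_rate[0] * fatality_rate[0]
high = population * infection_rate[1] * fatality_rate[1]

print(f"projected deaths: {low:,.0f} to {high:,.0f}")
print(f"the high end is roughly {high / low:.0f}x the low end")
```

With these made-up inputs, the projection runs from tens of thousands of deaths to nearly ten million: each input range spans roughly a factor of ten, and their product spans a factor of more than a hundred. A single headline number plucked from anywhere inside that range tells you almost nothing.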
- Keep your eye on the ball. And “the ball” is protecting life. I emphasize life because life is more than being coronavirus-free. Life is being out in the world, interacting, creating, falling in love, caring for others, making the world go around. The risk of one person getting Covid-19 is not the only consideration in these life decisions leaders face right now. There’s also the risk of passing on the virus to the community, business viability, continuity, and all the lives that depend on productive work. Keeping your eye on the ball means being able to optimize multiple outcomes.
It Comes Down to Rigor
Our efforts to answer questions like, “Do we have sufficient safeguards in place to protect our employees from Covid-19 infection?” “Should we close down our plant?” “When is it safe to re-open our office buildings?” and “How long will we need to keep the current precautions in place?” are not fundamentally different from other complex decisions organizational leaders make every day. Nevertheless, these feel different. The stakes have never been higher. The impact of the Covid-19 pandemic weighs on all of us.
Good information helps every decision process. The blessing of the internet is that we have extremely rapid access to valuable information. But the internet does not discriminate between good information and bad information. Failure to recognize that can turn the blessing into a curse. We think we know things we really don’t know, and as a result our decisions are compromised.
Effective decision makers are willing to apply some rigor.
The deeper our commitment to objective, honest, non-politicized information, the better our decisions will be, and the sooner we will find ourselves back on solid ground.
A note on terminology: I distinguish between data and information. When I say “data” I am referring to raw numbers. Data becomes “information” through a process of analysis, interpretation, presentation, and application.