Often Wrong, Never in Doubt - Six Ways Assumptions Mislead Us

By Chuck Dinerstein, MD, MBA — Dec 19, 2018
Models (not that kind) help us understand our world. But the assumptions we make in creating models can lead us astray. Here's a list of six ways the devil is in the details.

We often use models to understand the complexities of our world and to predict what happens next. But all models are wrong because they simplify reality. Our predictive models consist of three components: facts, assumptions, and the conclusions they lead us to.

Facts are far harder to obtain than assumptions; they may require long periods of observation or expensive, sensitive measurement devices. Assumptions can be made more easily, in the comfort of the office, frequently papering over or shaping missing data. One unintended result is that, given a limited set of facts, the strength of our conclusions rests on our certainty in the strength of our assumptions. Assumptions are just not as sexy as conclusions and are frequently overlooked in our haste to know or do – a variation of often wrong, never in doubt. Here are six ways assumptions can lead us astray.

  • Conventional certainties – There are some models we trust without knowing or understanding their assumptions. Congress, in an infrequent bipartisan moment, considers the predictions of the Congressional Budget Office to be valid and quite specific, putting numbers on costs or benefits. In reality, the CBO carefully explains its underlying assumptions and how variations in those assumptions yield ranges of costs or benefits, not specific numbers.
  • Dueling certainties – In certain instances, changing from one equally valid assumption to another "flips the script," drastically changing the conclusion. Studies of sin taxes are good examples. Taxes increase price, which in turn results in the substitution of a cheaper product for the now more expensive one. We assume that, faced with substitution, people will make the rational, healthy choice: water in place of soda. But that assumption is frequently wrong; people instead choose juice in place of soda. So our model's conclusion that raising the price of soda means we take in fewer calories turns out to be wrong, at least some of the time. Most scientific articles examine how altering assumptions may change conclusions through "sensitivity" analysis; a toy version follows this list. But again, it is frequently more convenient to remember the conclusion and not the assumptions.
  • Illogical certainties – unfounded conclusions based on errors of thought. This is a prevalent source of difficulty, with two intertwined forms. You can mistake statistical significance for practical significance, e.g., any of the current GWAS studies in which a set of statistically significant genes is associated with this or that condition. In the small print, you find the association holds only 20 or 30% of the time; how significant is that? And of course, I was careful not to use the word cause, because the other illogical source of certainty is believing that correlation reflects causation.
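
To make the sensitivity analysis concrete, here is a toy sketch of the soda-tax model. Every number in it (the calorie counts, the substitution rates) is invented for illustration; the point is only that holding the facts fixed and swapping a single assumption, namely what people drink instead of soda, reverses the conclusion.

```python
# Toy sensitivity analysis of a soda-tax model (all numbers hypothetical).
# The "fact": a tax raises soda's price, and some servings get substituted.
# The assumption we vary: what people substitute TO -- water or juice.

SODA_KCAL = 140    # illustrative calories per serving of soda
WATER_KCAL = 0     # calories per serving of water
JUICE_KCAL = 160   # illustrative calories per serving of juice

def calorie_change(substitution_rate, substitute_kcal):
    """Change in calories per former soda serving, given the share of
    servings switched and the calories of the replacement drink."""
    return substitution_rate * (substitute_kcal - SODA_KCAL)

for name, kcal in [("water", WATER_KCAL), ("juice", JUICE_KCAL)]:
    for rate in (0.1, 0.3, 0.5):
        delta = calorie_change(rate, kcal)
        print(f"substitute={name:5s} rate={rate:.0%}: {delta:+5.0f} kcal/serving")
```

Every water row shows calories falling, and every juice row shows them rising; the data never changed, only the assumption did.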

But here are my two favorites, the two most popular ways to give us false confidence.

  • Conflating science and advocacy – In describing models there is an arrow: facts and assumptions lead us to conclusions. It seems to be a one-way street, like the arrow of time going in only one direction. But physics tells us that time can run forward or backward. Similarly, advocates can begin with their conclusions and reverse engineer their assumptions, making everything fit together in a neat package. Now, I am not suggesting that this is necessarily intentional, but we can pick our assumptions, not our facts, and that makes for an insidious problem. When we say a paper is data mining, we are referring to this reverse engineering of assumptions. The partner in this error is "wishful extrapolation." Some data cannot be found, and we often extrapolate or estimate the missing components by drawing a trend line into the unknown. The entire discussion of radiation's harmful effects at very, very low doses is a question of extrapolation; a sketch follows this list. Some believe the trend is linear; others do not. But no one really knows, and each side can choose the method of extrapolation that serves its conclusion. Perhaps this entire problem can be summed up this way: "When all you have is a hammer, everything is a nail."
  • Media overreach – This is a great source of environmental pollution. "If it bleeds, it leads" expresses our interest in things that make us fearful; it is, after all, an underlying Darwinian force. But the media, in its search for attention, is only too willing to repackage science and policy into sound bites and optics that provoke a response at little cognitive cost. Our transient emotional response is just as real whether or not we fully understand, or are even told, the underlying assumptions.
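
And here is the promised extrapolation sketch. Assume, with entirely illustrative numbers, that the only hard fact is an excess risk of 0.5 measured at a dose of 500 units. The two models below (linear no-threshold, and a threshold model with a made-up cutoff of 100 units) both reproduce that fact exactly, yet they disagree about everything below it, which is precisely where the argument lives.

```python
# One measured "fact," two extrapolations of it (all numbers illustrative).
OBSERVED_DOSE, OBSERVED_RISK = 500, 0.5   # the only hard data point

def linear_no_threshold(dose):
    # Assumption 1: excess risk falls linearly all the way to zero dose.
    return OBSERVED_RISK * dose / OBSERVED_DOSE

def threshold(dose, cutoff=100):
    # Assumption 2: no excess risk at all below a threshold dose.
    if dose <= cutoff:
        return 0.0
    return OBSERVED_RISK * (dose - cutoff) / (OBSERVED_DOSE - cutoff)

for dose in (500, 250, 100, 10, 1):
    print(f"dose={dose:3d}: linear={linear_no_threshold(dose):.4f}  "
          f"threshold={threshold(dose):.4f}")
```

Neither extrapolation is contradicted by the lone observed data point, so each side can choose the one that serves its conclusion.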

 

Source: Communicating Uncertainty in Policy Analysis, Proceedings of the National Academy of Sciences. DOI: 10.1073/pnas.1722389115
