The Polarization of Society: Even Scientists Become Tribal

By Alex Berezow, PhD — Nov 29, 2018
One would think that a world in which facts can be easily verified shouldn't become so polarized. But a new paper in the European Journal for Philosophy of Science argues that polarization is the natural outcome when groups of people disagree. In fact, the authors document a major example of polarization within the scientific community itself.

It is increasingly difficult to have a conversation about any topic that is even remotely political. We appear to have entered a world in which there is no longer a common set of agreed-upon facts.

All one needs to do to verify this is to flip between cable news channels. Doing so reveals a strange universe of "red facts" and an equally strange, parallel universe of "blue facts." Occasionally the two worlds intersect, but more often than not, they are completely separate realities. Yet, inhabitants of each parallel universe claim to know THE TRUTH and believe that the inhabitants of the other parallel universe are delusional. Or worse.

Something has gone horribly wrong. A world in which facts can be easily verified should not become so polarized, right?

Well, maybe not, according to a new paper in the European Journal for Philosophy of Science. The authors, Cailin O'Connor and James Owen Weatherall, argue that polarization is the natural outcome when groups of people disagree. In fact, they document a major example of polarization within the scientific community itself.

The Lyme Wars

Lyme disease is caused by a bacterial infection transmitted by ticks. Untreated, it can cause arthritis, pain, fatigue, and other problems. Some patients who have these symptoms but no sign of an active infection are convinced that they suffer from "chronic Lyme disease." Many doctors are convinced the condition is real, so they provide their patients with long-term antibiotic therapy.

No matter which side a scientist takes in the Lyme Wars, there is a common set of facts upon which we should all agree. (For what it's worth, ACSH is in the anti-chronic Lyme disease camp.) First, nobody wants these patients to suffer. Second, we want to find a cure, if there is one. And third, there is an ultimate truth yet to be discovered. Either "chronic Lyme" is real or it's not. (If it's not, there are several alternative explanations for the symptoms, such as an autoimmune disorder, perhaps triggered by Lyme disease.)

Given this shared dedication to public health and goodwill, it is difficult to see how the biomedical community could ever become polarized. And yet it has. The man who discovered Lyme disease, Allen Steere, was skeptical of the chronic Lyme diagnosis as well as long-term antibiotic therapy. As a result, he started receiving death threats from patients who were convinced he was wrong.

That's right. The hero is now a villain. People who are working toward the exact same goal have turned against each other. That's the effect that polarization has. And here's the scary part: The authors of the aforementioned paper demonstrate, using a mathematical model, that there is nothing we can do to prevent it. Polarization is a natural consequence of disagreement.

Why Polarization Is a Natural Outcome of Disagreement

The authors employ a mathematical model to show that, even when scientists act in good faith and merely disagree over the correct interpretation of evidence, polarization is still a likely outcome. How so?

Suppose a scientist believes that Hypothesis X is more likely to be correct than Hypothesis Y. He may then come to believe that other scientists who also accept Hypothesis X are slightly more reliable than scientists who accept Hypothesis Y. Over time, this slight initial bias against data provided by scientists who believe Hypothesis Y can morph into outright distrust. Once that happens, a stable state of polarization develops, in which neither side can "win" the debate, even if the facts clearly support one hypothesis over the other.
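
To make that feedback loop concrete, here is a minimal simulation sketch, loosely inspired by the paper's idea but not a reproduction of its actual model: agents hold a credence that Hypothesis X is correct, those who currently favor X run noisy experiments, and every agent discounts a colleague's data in proportion to how far apart their beliefs are. All parameters (number of agents, success rates, the MISTRUST factor) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not the paper's values).
N_AGENTS = 20              # scientists in the community
N_TRIALS = 10              # experiments each tester runs per round
P_GOOD, P_BAD = 0.6, 0.5   # evidence success rate if X is right vs. wrong
MISTRUST = 2.0             # how quickly trust falls off with belief distance
N_ROUNDS = 2000

# Each agent's credence that Hypothesis X is correct, spread out at the start.
credence = rng.uniform(0.1, 0.9, N_AGENTS)

def bayes_update(prior, successes, n):
    """Posterior credence in X after observing `successes` out of `n` trials."""
    like_x = P_GOOD**successes * (1 - P_GOOD)**(n - successes)
    like_y = P_BAD**successes * (1 - P_BAD)**(n - successes)
    return prior * like_x / (prior * like_x + (1 - prior) * like_y)

for _ in range(N_ROUNDS):
    # Only agents who currently favor X bother to test it.
    testers = np.where(credence > 0.5)[0]
    results = {i: rng.binomial(N_TRIALS, P_GOOD) for i in testers}

    new_credence = credence.copy()
    for i in range(N_AGENTS):
        for j, successes in results.items():
            # Trust in a colleague's data shrinks with belief distance and
            # hits zero beyond 1/MISTRUST -- that data is simply ignored.
            trust = max(0.0, 1.0 - MISTRUST * abs(credence[i] - credence[j]))
            posterior = bayes_update(new_credence[i], successes, N_TRIALS)
            new_credence[i] = trust * posterior + (1 - trust) * new_credence[i]
    credence = new_credence

# A polarized outcome: one camp converges on X while the other, having
# stopped trusting the first camp's data, never budges.
print("Believe X:", int((credence > 0.5).sum()),
      "| Don't believe X:", int((credence <= 0.5).sum()))
```

In this sketch, setting MISTRUST to 0 makes every agent weigh all data equally and the whole community converges on X; with any appreciable mistrust, stable camps emerge even though every agent updates honestly on the evidence it accepts.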

The authors reach a rather disturbing conclusion:

"We do not need to suppose that anyone is a bad researcher (in our models all agents are identical), or that they are bought by industry, or even that they engage in something like confirmation bias or other forms of motivated reasoning to see communities with stable scientific polarization emerge. All it takes is some mistrust in the data of those who hold different beliefs to get scientific polarization."

In other words, even when everybody acts in good faith, the result can be a society in which we cannot agree on a common set of facts.

Yikes.


Source: Cailin O'Connor and James Owen Weatherall. "Scientific polarization." European Journal for Philosophy of Science 8 (3): 855-875. Published: October 2018

Alex Berezow, PhD

Former Vice President of Scientific Communications

Dr. Alex Berezow is a PhD microbiologist, science writer, and public speaker who specializes in the debunking of junk science for the American Council on Science and Health. He is also a member of the USA Today Board of Contributors and a featured speaker for The Insight Bureau. Formerly, he was the founding editor of RealClearScience.
