Media Should Have Far Less Confidence In Meta-Analysis Claims Than They Do

By Hank Campbell — Sep 20, 2018
There are two ways that the media get meta-analysis claims wrong. And here's how to spot them.

Do you think video games have led to more violent attacks by young people? You are not alone. Lots of people do. It was in every major newspaper because a meta-analysis once showed it was so. But then another meta-analysis showed that belief to be false.

Journalists gushed over both claims(1) even though one was suspect to anyone who understands the nature of selection bias in meta-analyses. So let's discuss what a meta-analysis is and what it can and cannot do. 

What meta-analysis is: It is just what it sounds like, an analysis of analyses, which is better than a literary criticism of literary criticisms, though in the wrong hands not by much. The first meta-analysis was published in 1976, and as Gene Glass described it then, the goal was to integrate the findings from collections of individual studies. In other words, he wanted a way to compare apples to apples across different studies, and his raw material was the systematic review.
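In practice, that integration usually means pooling each study's effect estimate, weighted by its precision. Here is a minimal sketch of that core step, using the simplest flavor, a fixed-effect inverse-variance pool (the effect sizes and standard errors below are invented for illustration, not from any real study):

```python
import math

# Hypothetical per-study effect estimates and their standard errors.
effects = [0.30, 0.10, -0.05]
ses = [0.15, 0.08, 0.20]

# Inverse-variance weights: more precise studies count for more.
weights = [1 / se**2 for se in ses]

# The pooled estimate is the weighted average; its variance is 1 / sum(weights).
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"pooled effect: {pooled:.3f} (+/- {1.96 * pooled_se:.3f} at 95%)")
```

Every real meta-analysis adds layers on top of this (random effects, heterogeneity tests, bias corrections), and each layer is a judgment call, which is where the trouble starts.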

What it can do: In the ideal scenario, a meta-analysis can make for a more informed debate than we get when people cherry-pick studies.(2) That was the starting point for Glass, a psychologist who believed psychoanalysis was given a bad rap while behavior therapy was adored,(3) which to him looked like proponents of behavioral techniques simply promoting papers they liked and ignoring others. He wanted better information for himself and the public, and he showed a meta-analysis can do that. But just as a study in rats can never show what causes cancer in humans, only help rule causes out, a meta-analysis will not show something to be true. It can only make the argument for validity stronger or weaker.

Despite that, we see junk science purveyors using rat studies and meta-analyses to make claims about causes all of the time. Journalists don't have time to learn every statistical technique, so if the journal is reputable they will accept the claim, the same way they do with climate change, medical findings, or food claims.

"When the evidence points clearly in one direction, there is little need for a meta-analysis. When it doesn't, a meta-analysis is unlikely to give the final answer.” - University of Cambridge philosopher of science Jacob Stegenga, as told to Jop de Vrieze in Science.

Who doesn't agree with Stegenga? One glaring group is the subset of epidemiologists who want to pool lots of underpowered studies in order to claim everything causes or cures cancer, so they can "suggest" XYZ Industry is "just like Big Tobacco" and land in a New York Times article by Eric Lipton or Danny Hakim. In other words, the kind of epidemiologist hoping to get called up to the U.S. National Institute of Environmental Health Sciences (NIEHS), land on an International Agency for Research on Cancer (IARC) Working Group, and then get rich as an expert witness for Prop 65 trial lawyers after a compound is declared a probable carcinogen - now known as The Chris Portier Method.(4)

Here are the two things evidence-based skeptics out there should keep in mind:

1. Meta-analyses are not objective. So we shouldn't get excited if we read about a meta-analysis paper that appeals to our hot-button issues. Using an unweighted random effects meta-analysis, I can show cows whinny and the price of steel causes autism; see the sketch below.

Long before Freedom of Information Act requests exposed Chuck Benbrook, Ph.D., the favorite economist of the organic movement, as an industry front who was only allowed on the Washington State University campus as long as those corporate checks cleared, the science community knew it. His methodologies were the "secret sauce" in favor at groups like IARC and NIEHS, and they are only now being thrown out of the U.S. Environmental Protection Agency.
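To see how much those methodological choices matter, consider a toy comparison with invented numbers: five small, noisy studies reporting a large effect, against one large, precise study reporting essentially none. An unweighted pool and an inverse-variance-weighted pool of the same six studies tell opposite stories:

```python
# Hypothetical inputs: five small noisy "positive" studies, one large precise null.
effects = [0.80, 0.70, 0.90, 0.60, 0.75, 0.02]
ses = [0.50, 0.60, 0.55, 0.50, 0.60, 0.05]

# Unweighted pooling treats every study as equally informative.
unweighted = sum(effects) / len(effects)

# Inverse-variance weighting lets the one precise study dominate.
weights = [1 / se**2 for se in ses]
weighted = sum(w * e for w, e in zip(weights, effects)) / sum(weights)

print(f"unweighted mean:       {unweighted:.2f}")  # ~0.63, looks like a big effect
print(f"inverse-variance mean: {weighted:.2f}")    # ~0.05, essentially nothing
```

Same studies, opposite headlines, and the difference is entirely an analytical choice the authors made.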

2. See if the authors published their systematic review protocols in advance and disclosed each analytical step and judgment.

This prevents data dredging and HARKing (hypothesizing after the results are known), both of which, unfortunately, are all too common in the academic activist community.

If your methodology is transparent, I can duplicate your meta-analysis and confirm it. Or I can duplicate it and prove you wrong. Either way, if the goal is informing scientists and the public, transparency is essential.

This is why the claim that video games caused school shootings or other violence was debunked. As Jop de Vrieze notes in Science, three of the papers by Brad Bushman have been retracted, but that occurred long after he had been appointed to President Barack Obama's committee on gun violence. The administration looked at those claims through the lens of confirmation bias; it had new corporations to blame.

The claim would never have become so popular if journalists knew to "ask the awkward questions" of papers whose claims they like as often as they do of studies that defy their ideological beliefs.

NOTES:

(1) And scholars are not dumb; they know that journalists love meta-analyses and their suitably cosmic claims, and that therefore journals do also. The only reason PNAS published nonsense like the claim that more people die during hurricanes with female names because of sexism was that they knew it would get attention. And journalists liked that PNAS gave something clearly wacky the whiff of legitimacy. Because the goal of media is to generate revenue, and journals have positioned meta-analyses as a higher power over actual studies, they have exploded in popularity, with 11,000 produced last year. As you might expect, most of them are awful. Meta-analysis has become what its first critic, psychologist Hans Eysenck, called "an exercise in mega-silliness," but they remain popular because they cost nothing to produce and can make bombastic claims sound science-y, which can lead to funding if they get enough attention from the public and therefore government.

(2) Which can easily result from the WebMD approach to health information. Everything can cause or cure a disease.

(3) Baseball statistician Nate Silver was similarly inspired during the 2008 Presidential election, when he believed polls were unfairly favoring Senator Hillary Clinton over Silver's choice, Senator Barack Obama. Using his own 'secret sauce' to normalize polling results, he showed Senator Obama was a real contender. This led to a nice job at the New York Times, and now he has his own organization to analyze information.

(4) Poor Martyn Smith. His Council for Education and Research on Toxics (CERT), which he founded specifically so the lawyer who wanted to pay him to be an expert witness could have an NGO to astroturf juries with, beat Portier to that strategy by a decade, but Chris gets all of the attention.
