The Role of Press Releases in Overhyping Health Research Findings

Today, things are going to get a little meta. I’m going to write about a scientific study about writing about scientific studies.

The basic question up for investigation is: why do the results of scientific studies on health-related topics so often get overstated when they’re covered in the media?

You might already have a sense that news articles about scientific findings often lean toward sensationalism. It’s why you can read an article titled “New study shows that having a glass of wine at dinner will add ten years to your life” one week and read another article titled “New study shows that having a glass of wine at dinner will kill you without a doubt” the next.

OK, I’m exaggerating a little, but not by much. The point is that an article titled “New study finds small statistical correlation between moderate alcohol consumption and longevity which might be due to other factors and probably won’t replicate in future studies anyway” just doesn’t pack the same punch.

In 2014, a team of researchers at Cardiff University published a study highlighting one possible source of the overblown claims about health studies that circulate in the media: press releases put out by the researchers themselves.

The idea is that there’s something of an unholy alliance between researchers and health journalists. A group of researchers runs a study and then, to get as much publicity as possible, puts out a press release that hypes up and exaggerates the results. Journalists then pick up that press release and repeat the overstated claims in media outlets.

To investigate this idea, the authors of the 2014 study collected 462 health science press releases put out by 20 top universities in the United Kingdom.

They found that a third of the press releases exaggerated claims about cause-and-effect. This is an important point because health studies often show correlation without necessarily providing any evidence about causation.

For example, if I do a study showing that people who wear running shorts more often live longer, that doesn’t prove that wearing running shorts causes people to live longer. People who wear running shorts might live longer simply because they exercise more; or, for that matter, they might exercise more simply because they’re in better health to begin with. The study doesn’t say anything about what the underlying cause-and-effect is.
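If you want to see how that kind of confounding plays out, here’s a minimal simulation, with entirely made-up numbers that come from no real study, in which a hidden “exercise” level drives both shorts-wearing and longevity, so the two end up correlated even though one has zero direct effect on the other:

```python
import random

random.seed(0)

# Hypothetical toy model: "exercise" is a hidden confounder that drives
# both shorts-wearing and lifespan. Shorts themselves do nothing.
n = 10_000
records = []
for _ in range(n):
    exercise = random.random()                      # hidden confounder, 0..1
    wears_shorts = random.random() < exercise       # exercisers wear shorts more often
    lifespan = 70 + 15 * exercise + random.gauss(0, 5)  # only exercise extends lifespan
    records.append((wears_shorts, lifespan))

shorts = [life for wears, life in records if wears]
no_shorts = [life for wears, life in records if not wears]

print(f"average lifespan, wears running shorts: {sum(shorts) / len(shorts):.1f}")
print(f"average lifespan, no running shorts:    {sum(no_shorts) / len(no_shorts):.1f}")
```

In this toy world, shorts-wearers outlive non-wearers by about five years on average, and a careless headline writer could spin that into “running shorts add years to your life,” even though, by construction, the shorts do nothing at all.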

In psychology, this kind of ambiguity crops up a lot because it’s extremely difficult to definitively show cause-and-effect with something as complex as human behavior. That’s why almost every post I write on here about a new psychology study says something along the lines of “it’s unclear from these results what the underlying cause-and-effect is.”

Analyzing the 462 press releases, the authors of the 2014 study also found that 36 percent exaggerated the conclusions that could be drawn from animal research. Typically, this kind of hype arose when researchers did a study on animals and then implied that their findings must hold for humans as well, even though what’s true for a rat isn’t always true for a person.

Once researchers put out a press release containing these kinds of exaggerations, it was basically a given that the overstated claims would be repeated across the media. The 2014 study found that when press releases contained exaggerated claims, the vast majority of news articles did too; and when press releases painted a more realistic picture of a study, so did news articles.

When the researchers did a follow-up study to replicate their findings, it turned up similar results. Overall, it appears that a lot of exaggerated claims about scientific studies may be coming from researchers themselves in the form of overhyped press releases.

Although the 2014 study focused on media coverage of health studies in general, it seems plausible that media coverage of psychology studies in particular follows similar patterns. Incidentally, I’ll note here that when I write posts for this blog, I never write them based purely on a press release, and I often don’t look at the press releases at all.

Since that study came out in 2014, it’s worth asking: has the situation improved in recent years? Maybe.

A newly published study compares press releases from 2014, before the earlier study sounded the alarm, with press releases from 2015, after it was published. It found that 28 percent of the press releases from 2014 overstated scientific findings, while only 13 percent of those from 2015 did.

Of course, as the authors of the study are quick to point out, those results can’t establish that the publication of the 2014 study is what caused the number of overhyped press releases to fall. After all, that would be drawing a causal conclusion from a correlation.