Reading Peer Reviewers’ Unprofessional Comments

When you read about a psychology study covered on this blog – or, for that matter, just about any published scientific study – you can rest easy knowing that it has been through a process of rigorous peer review.

Peer review is part of what gives published research its aura of credibility: papers that appear in peer-reviewed journals have been through a process of critique from scientists in the field before being given the green light for publication.

The advantage of this approach is obvious in theory. In practice, though, peer review can get notoriously messy, arbitrary and brutal.

Since peer reviewers are usually anonymous, they are science’s equivalent of internet commenters: without much accountability attached to their comments, they’re free to vent in whatever way they see fit. This phenomenon has given rise to sites like Shit My Reviewers Say, which collects absurd, mean-spirited and condescending comments researchers have received in peer reviews.

Now, an international team of researchers has attempted to study the phenomenon of peer reviewers run amok in a more scientific way. Their study analyzed 1,491 reviews in the fields of ecology and behavioral medicine, tallying different categories of comments that appeared in the reviews.

One category encompassed unprofessional comments, including those that made personal attacks or questioned researchers’ motives. Some of these comments were insulting, some accused the authors of lying, and some were just downright sexist:

this young lady is lucky to have been mentored by the leading men in the field

Another category comprised what the authors dubbed incomplete, inaccurate or unsubstantiated critiques: comments that tore down the research without providing evidence or that simply didn't make sense. These included unsubstantiated comments like "this is JUST wrong," comments that revealed reviewers' ignorance of common techniques, comments containing blatant factual errors, and superficial reviews that didn't say much at all.

It’s clear that reviews in either of these categories – unprofessional reviews or “IIUC reviews,” as the authors termed them – don’t add much to the cause of science. But how common are reviews that include pointless personal attacks and unfounded criticisms?

Extremely common, it turns out. Of the reviews considered, 12 percent contained unprofessional comments and 43 percent contained IIUCs.

In other words, as the authors pointed out, about one in eight reviews contained unprofessional comments. Here are some more examples of comments the authors encountered (original spelling mistakes included):

Utterly disapointed in this submission, it achieves nothing, and was a waste of funding

This mansucirpt was not worth my time so I did not read it and recommend rejection

as is common from research from China

I mean, how outdated is that

It was garbae when I read it first and its still garbae

The authors of the study highlight a recent Nature editorial on high levels of anxiety and depression among early-career researchers, which cited bullying and harassment as problems. They point out that, “given the frequency of egregious examples of bullying and harassment identified in our assessment,” such unprofessional peer reviews probably aren’t helping.

That’s not to say peer review doesn’t have an important role in maintaining high standards for published research. But there seems to be plenty of room to improve the process, with more careful, substantiated criticism and fewer personal attacks.

Image: Flickr/Nic McPhee