Here’s an example of peer review in action.
This is from a study on the effects of selenium contamination on two bird species that nest along creeks and rivers of the Elk Valley in southeastern British Columbia. I found this version first, presented at the 2004 BC Mine Reclamation Symposium. At this point the study was unpublished and hadn’t gone through peer review.
Below I’ve highlighted one conclusion from the Abstract.
But here’s a paragraph from the Results section. Does it seem strange that the authors report “no difference” in hatching success when the exposed site sits at 74% and the reference site at 91%?
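There is a quick sanity check a reader can run on numbers like these: a two-proportion z-test. Whether a 74%-vs-91% gap is statistically significant depends entirely on how many nests were monitored, which is exactly why the bare claim of “no difference” deserves scrutiny. A minimal sketch, noting that the nest counts below are hypothetical placeholders, not the study’s real sample sizes:

```python
# Two-proportion z-test sketch for the 74% vs 91% hatching-success gap.
# The per-site nest counts used here are HYPOTHETICAL -- the real sample
# sizes come from the study itself.
import math

def two_proportion_z(p1, n1, p2, n2):
    """Return (z statistic, two-sided p-value) for H0: p1 == p2."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Same 17-point gap, two different hypothetical sample sizes:
z_small, p_small = two_proportion_z(0.74, 30, 0.91, 30)
z_large, p_large = two_proportion_z(0.74, 100, 0.91, 100)
print(f"n=30 per site:  z={z_small:.2f}, p={p_small:.3f}")   # p > 0.05: not significant
print(f"n=100 per site: z={z_large:.2f}, p={p_large:.4f}")   # p < 0.01: significant
```

The point of the toy numbers: the same percentage gap can be statistically insignificant with few nests and highly significant with many, so a conclusion of “no difference” only means something once the test and sample sizes are reported.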
As readers, we have two options:
- Call the primary author to clarify. The obvious limitation is that the author may decline to talk, and even if our confusion gets cleared up, the published record is never corrected. The error stands.
- Look for a peer-reviewed version of the study.
I found a peer-reviewed version:
The first thing to notice is that after peer review the title changed. The study no longer concludes a “lack of effects” but instead concedes only a “lack of severe effects.”
Here’s the new Abstract:
In this version of the study, after addressing recommendations from peer review, the authors test the difference in spotted sandpiper hatching success between exposed and reference sites. And it turns out that the difference is statistically “highly significant.”
Why is this important? Because the first visible sign of selenium toxicity in birds is reduced hatching success. By the time birds start laying eggs that develop with deformed embryos, it’s too late. The populations could already be collapsing.
Now, this study doesn’t prove that selenium is the cause of lower hatching success, but it does suggest that more work is needed. For selenium toxicologists, this is an interesting study that adds to a body of research on selenium toxicity in birds.
Without peer review, this science would have remained an “all’s-well-nothing-to-see-here” flawed study that muddles our understanding of selenium in the ecosystem instead of adding to scientific knowledge.
Peer review makes science better. It catches mistakes, checks bias, and sets a bar for quality.
That’s why we need it.
Photographs from the Twitterverse, top – Women’s March, January 21, 2017. Bottom – February 3.