Most science journals put up multiple barriers, such as peer review, to ensure that faulty studies don’t get published. The assumption is that a panel of independent scientists reviewing another’s work is enough to keep bad science and shoddy workmanship out of a journal’s pages, or at least to make it rare. What if that assumption is wrong?
Recently, more and more studies of the reproducibility of modern scientific research — that is, whether results can be recreated by scientists who weren’t involved in the original work — have been making news. In one widely cited effort, scientists at biotech giant Amgen failed to reproduce 47 of 53 landmark cancer studies. That’s 88 per cent of them.
Retractions are also on the rise. Several cases involving large numbers of flawed studies have made the news recently after the papers had to be pulled from the journals that published them, whether because of faulty figures, fabricated data or simple accidental mix-ups.
Today, a company called Science Exchange announced a new program — called the Reproducibility Initiative — that pairs researchers with companies to reproduce their findings. The initiative is led by Elizabeth Iorns, a professor at the University of Miami.
“In the last year, problems in reproducing academic research have drawn a lot of public attention, particularly in the context of translating research into medical advances. Recent studies indicate that up to 70% of research from academic labs cannot be reproduced, representing an enormous waste of money and effort,” Iorns said in the announcement. “In my experience as a researcher, I found that the problem lay primarily in the lack of incentives and opportunities for validation — the Reproducibility Initiative directly tackles these missing pieces.”
As she mentioned, Iorns has had her own run-in with reproducibility. She tried to replicate a Nature paper linking a gene called SATB1 to cancer in cultured cells, but couldn’t recreate the results using the protocols described in the paper. She then had to shop her findings around to multiple journals before getting them published.
Carl Zimmer wrote an article for Slate about the new initiative:
Iorns and her colleagues are trying to reprogram the incentives in science. Right now, a lot of the incentives to take extra care rather than rushing to publish research are on the stick side of the carrot-and-stick equation — first and foremost, the fear that your paper gets retracted.
“If you are retracted, it’s career-breaking, and you’re a fraudulent scientist. It’s very negative,” Iorns says. “We said, ‘Why don’t we reward scientists who use high-quality data?’ Eventually the culture shifts from just funding originality, and instead we shift to rewarding things that are really true.”
Will Iorns’s initiative work? Who knows. But it’s a start.