Replicating scientific results is tough — but essential
Replicability — the ability to obtain the same result when an experiment is repeated — is foundational to science. But in many research fields it has proved difficult to achieve. An important and much-anticipated brace of research papers now shows just how complicated, time-consuming and difficult it can be to conduct and interpret replication studies in cancer biology1,2.
Nearly a decade ago, research teams organized by the non-profit Center for Open Science in Charlottesville, Virginia, and Science Exchange, a research-services company based in Palo Alto, California, set out to systematically test whether selected experiments in highly cited papers published in prestigious scientific journals could be replicated. The effort was part of the high-profile Reproducibility Project: Cancer Biology (RPCB) initiative. The researchers assessed experimental outcomes, or ‘effects’, by seven metrics, five of which could apply to numerical results. Overall, 46% of these replications were successful by three or more of these metrics, such as whether results fell within the confidence interval predicted by the original experiment or retained statistical significance.
The project was launched in the wake of reports from drug companies that they could not replicate findings in many cancer-biology papers. But those reports did not identify the papers, nor the criteria for replication. The RPCB was conceived to bring research rigour to such retrospective replication studies.
READ MORE: https://www.nature.com/articles/d41586-021-03736-4?WT.ec_id=NATURE-20211216&utm_source=nature_etoc&utm_medium=email&utm_campaign=20211216&sap-outbound-id=53195D6C25A5119BBC6325CE2343BC9AC13FF203