Charlottesville, VA — The Registered Reports publishing format fundamentally changes how papers are selected for publication and how researchers are rewarded for their work. Today, an article published in Nature Human Behaviour provides new evidence that Registered Reports are also associated with greater rigor and research quality compared with the standard publishing format. (See preprint.)
Registered Reports (RRs) involve journals conducting initial peer review before the outcomes of the research are known. Authors submit an introduction motivating their research question, preliminary evidence, and the proposed study or studies that they will conduct. Reviewers evaluate the importance of the research question and the quality of the methodology for investigating it. If the RR passes peer review, it receives in-principle acceptance, meaning that the journal commits to publishing the findings regardless of outcome, as long as the authors conduct the research according to the agreed design and quality standards. After the proposed research has been conducted and written up, the paper goes through a second stage of review, in which the reviewers assess only whether the authors completed the work as proposed and whether they interpreted the results responsibly.
Since its introduction in 2013, 295 journals have adopted the RR format, most in neuroscience and the social-behavioral sciences. Prior studies of the format suggest that RRs are effective at eliminating the bias favoring publication of positive over negative results, and that RRs are cited at similar rates to other articles published in the same journals around the same time. Today’s publication in Nature Human Behaviour finds preliminary evidence that RRs are also associated with higher research quality. Twenty-nine published RRs and 57 matched comparison articles (half from the same lead authors as the RRs and half on similar topics published in the same journal around the same time) were peer reviewed by 353 researchers on 19 outcome criteria.
After matching based on topical expertise, each reviewer evaluated an RR and a matched comparison paper in random order. For both papers, reviewers first read the introduction, preliminary evidence, and proposed methodology of the last study and evaluated the papers on eight criteria, including the rigor of the proposed methodology, the importance of the research regardless of what results would be observed, and the novelty of the research question. Then, they read the results and discussion of the last study and evaluated the papers on seven criteria, including the rigor of the analysis strategy, what was learned, and the importance of the findings. Finally, they read the title and abstract and provided four final ratings, including the overall quality of the paper and whether the findings would inspire new research.
Across all 19 outcome criteria, RRs were rated more favorably on average than the comparison articles. Some differences were substantial, such as RRs performing better on the rigor of the proposed methodology, the rigor of the analysis strategy, and the overall quality of the paper. Other differences were slight, such as the novelty of the research question and whether the findings would inspire new research, with estimates of uncertainty ranging from RRs modestly outperforming comparison articles to comparison articles slightly outperforming RRs.
“This is the first evidence that Registered Reports not only address publication bias, but may also improve research quality,” said Tim Errington, Director of Research at the Center for Open Science and one of the lead authors of the paper. He added, “However, this is an observational study of existing papers. We did not randomly assign research to be conducted as a Registered Report or not. As a consequence, the causal inference is more uncertain than in a randomized trial and hinges on the appropriateness of our selection of matched comparison articles.” Co-lead Courtney Soderberg, previously at the Center for Open Science and now at Facebook, added, “We believe that our systematic selection of articles from the same lead authors and from the same journals published around the same time is the strongest available strategy for an observational study short of conducting a randomized trial. We interpret this as preliminary evidence that Registered Reports can improve research quality and rigor, and we are interested to see other investigations test this possibility further.”
The possibility that Registered Reports could improve research quality is plausible and could have substantial implications for improving research rigor and reproducibility. “In the standard model, authors are primarily rewarded for novel, exciting results, and the quality of the methodology may get less scrutiny,” said Julia Bottesini, a graduate student at UC Davis and co-author of the study. “Just the fact that Registered Reports are reviewed without the research outcomes may result in higher-quality papers being selected for publication, because reviewers have to focus on the question and the quality of the methodology.”
Sarah Schiavone, also a graduate student at UC Davis and co-author, added, “Registered Reports also have an important feature for potentially improving research quality: the research has not been completed during peer review. If reviewers find errors or weaknesses in the design, they can point them out and even suggest ways to fix them. That is very different from the standard model, in which the research is already done. All the reviewers can do is point out what went wrong.”
“Changing the criteria for publication from the outcomes of the research to the research question and methodology fundamentally changes the incentives for researchers,” said Simine Vazire, co-author and Professor at the University of Melbourne. “With Registered Reports, researchers are rewarded for asking important questions and designing the most effective tests of those questions that they can. The outcomes are just what we all learn from the study having been conducted.”
Because RRs are a new publishing model that has so far mostly been applied in social-behavioral research and neuroscience, there is still a lot to learn about them. “We don’t yet know how well these findings will generalize to broader adoption of Registered Reports across researchers, disciplines, and different types of research methodologies,” said Kevin Esterling, co-author and Professor at UC Riverside. “We expect that there will be boundary conditions of types of research that benefit more and less from this model. Also, there may be qualities of research that we did not assess that will reveal benefits for the standard publication model over Registered Reports. There’s a lot more research to be done.”
Nevertheless, “this is promising initial evidence for the continued adoption of the Registered Reports publishing model,” said Felix Singleton Thorn, co-author, who recently completed graduate studies at the University of Melbourne. “If a publication model can help improve the quality and rigor of research, then it could have a profound impact on reproducibility and the pace of research advancing knowledge, solutions, and cures. But, like any other research, these findings must be subjected to additional scrutiny including, ideally, a randomized controlled trial to test the causal impact of the Registered Report model more directly.”
Access the recording of a community call from July 12, 2021, discussing the cumulative evidence about Registered Reports and the research priorities shaping what needs to be investigated next to clarify their benefits, costs, and impact on research practices.
About the Center for Open Science
COS is a non-profit technology and culture change organization founded in 2013 with a mission to increase the openness, integrity, and reproducibility of scientific research. COS pursues this mission by building communities around open science practices, supporting metascience research, and developing and maintaining free, open source software tools. The OSF is a web application that addresses the challenges facing researchers who want to pursue open science practices, including the ability to manage their work, collaborate with others, discover and be discovered, preregister their studies, and make their code, materials, and data openly accessible.
Inquiries: Claire Riss claire@cos.io
Web: cos.io
Twitter: @osframework