Replicability Project: Health Behavior (RPHB)

Advancing Scientific Integrity Through Replication in Health Research

Overview

The credibility of scientific findings is essential to public trust and evidence-based decision-making. In health research, published studies shape policy, guide clinical practice, and influence funding priorities, underscoring the importance of credible findings in this literature. Although replication studies can be resource-intensive, they play a critical role in strengthening the evidence base. By re-examining the same research questions using new data, replication efforts can contribute to theoretical understanding, reduce uncertainty, and increase confidence in reported conclusions.

The Center for Open Science (COS) is launching the Replicability Project: Health Behavior (RPHB), a large-scale, multi-team effort to replicate a diverse sample of published quantitative studies and help build a transparent and trustworthy foundation for health research. The project aims to assess the robustness of findings and promote best practices in research transparency, methodological rigor, and reproducibility.

Who should get involved?
We are seeking researchers with experience in quantitative methods, particularly in the social scientific study of health behavior and related fields, such as public health, epidemiology, health policy, and behavioral science. We welcome individual scholars, labs, and research teams across all career stages and institutions to join the effort.

What studies are being replicated?
Replications will focus on a sample of studies published between 2015 and 2024 in the following journals:

How to Participate

There are two ways you can get involved with this initiative, and you may participate in either or both activities:

  • Replication: Conduct a replication of an original study by investigating the same empirical claim selected from the original paper, either by collecting new data or by using secondary data sources independent of the data used in the original study.
  • Rapid peer review: Review preregistrations describing the planned protocols for these replication studies to ensure rigor and transparency; you can serve as a Reviewer and/or an Editor.

If you are interested in joining this initiative, please complete the general interest sign-up form. If you have any questions, feel free to reach out by filling out the contact form below.

Interest Form | Contact Form

Key Information for Replicators

  • Expertise: We are looking for quantitative researchers with the expertise to conduct good-faith, high-quality replications in the research domains outlined above.
  • Timeline: All replication studies must be completed and reported to COS by January 31, 2026.
  • Funding: We have limited funding that will enable us to support replication projects. Based on our available funds, we anticipate being able to support approximately 60 replication projects averaging ~$3,000 USD per replication (actual needs may vary). We will ask researchers to provide a breakdown of their anticipated costs based on what is essential for project participation.
  • Transparent research practices: Replicators will be asked to preregister their replications, upload all of their materials and outputs, and report their outcomes via the Open Science Framework (OSF). COS employees will be available to provide guidance on using the OSF and carrying out these open science practices.

Process & Timeline

We are accepting general interest sign-ups until September 1, 2025. Replication sign-ups and funding will be approved on a rolling basis, so we recommend signing up as soon as possible to secure a replication study you are interested in. All replications must be completed and reported to COS no later than January 31, 2026.


Sign-Up
If you are interested in joining this initiative, please begin by filling out our general interest sign-up form. We will be in touch within a few days with additional information and a list of available papers for you to look through and consider. We will also provide a replication sign-up form for you to indicate which paper(s) you are interested in replicating.

Approval & Onboarding
Once we confirm that the study you are interested in replicating is feasible given the projected timeline and budget, we will send you additional instructions and next steps to get started on your replication. One of the first steps will be to submit a budget proposal (should you require funds to support your replication attempt). Once the budget is approved, you will be asked to sign an agreement through your institution. Templates and other resources you will need to carry out your replication are provided on the RPHB OSF project.

Preregistration & Ethical Review
If your study requires ethical review, you must submit it for review through your local institution as soon as possible, as you will need to receive approval before beginning data collection.

To foster transparency and careful planning, you will complete a preregistration of your replication protocol to document the details of your methodological and analytical approach. We will provide you with a template to use, along with guidance on how to fill it out. After your preregistration draft is complete, it will go through a review process in which peers will have one week to review and comment on your protocol. Once you have addressed the reviewers' feedback, an editor will approve your protocol to move forward (pending ethics approval, if required).

Following preregistration and ethical approval, you will register your study on the OSF (instructions will be provided).

Data Collection
You may begin data collection as soon as your study has been registered on the OSF. Please keep COS informed of the progress of your data collection, and let us know if you encounter any setbacks or difficulties in achieving your target sample size.

Analysis & Outcome Reporting
You should conduct your pre-planned analysis as soon as you have collected enough data to meet your target sample size. You will then complete a reporting template to document your outcomes and upload all sharable outputs (including deidentified data) to your OSF project. A team member from COS will check your report for completeness and verify that you have shared all study materials and outputs on the OSF. All replication studies must be completed and reported to COS by January 31, 2026.


Frequently Asked Questions

Will funding be provided for these replication studies?

The Center for Open Science recognizes that institutions may require funding to support personnel and non-personnel costs (e.g., equipment rental and participant compensation). We have limited funding that will enable us to support replication projects with anticipated costs averaging $3,000 USD, recognizing that actual needs may vary. Rather than setting a fixed limit, we ask researchers to provide a clear breakdown of their anticipated personnel and non-personnel costs based on what is essential for project participation. To ensure equitable distribution of funds, we ask that personnel and overhead costs only be requested if they are necessary to complete the work.

Additionally, while overhead costs are generally not allowed as part of the budget process, we have a limited pool of funding intended to support institutions that cannot defray those costs through other kinds of flexible, general funding (e.g., underrepresented, rural, and smaller institutions). To ensure equitable distribution of resources, we kindly ask that institutions only request indirect funding if the replicator’s participation would not be feasible without it. Budget proposals should justify all funding requests, and the Center for Open Science will review and approve costs based on available funding and program priorities.

Can I change the identified claim from the paper?

No, the claims were identified following an established process, and maintaining the same claim is important for the goals of this project.

Why health research?

Replication challenges have been well documented across disciplines, yet health research stands out for its direct implications for human health and well-being. By systematically revisiting published findings, this project contributes to broader efforts to strengthen research credibility and to better understand which characteristics of studies are associated with findings that do (or do not) remain consistent when independently replicated.

Can we publish our replication outcomes as a preprint? Or submit our study for publication in a journal?

Replication teams are encouraged to publish their replication outcomes publicly as a preprint; however, we ask that you wait to do so until May 2026, at which point all of the outcomes from the replication effort will be made public. Teams are also free to submit their replication study for publication in a journal; we just ask that you notify us before you do so and keep the OSF project, outcomes, and drafts of the publication private until we are ready to make all replication outcomes public.

Will there be other authorship opportunities (for this replication effort at large)?

COS plans to lead a collaborative paper/report to summarize all of the replication outcomes included in this effort. We will invite all researchers who participate in the replication effort to be co-authors on the paper (this includes all replicators, as well as preregistration reviewers and editors).

What is the OSF?

The Open Science Framework (OSF) is a free, open-source project management tool that supports researchers in sharing their work throughout the entire project lifecycle. Visit cos.io/products/osf to learn more.

What do we mean by replication?

We use the term replication, in the context of this project, to mean investigating the same empirical claim from an original study in a new sample (i.e., by collecting new data, or by using secondary data sources independent from the data used in the original study). More broadly, replication can be understood as a study for which any outcome would be considered diagnostic evidence about a claim from prior research. The purpose of replication is to advance theory by confronting existing understanding with new evidence. See this article for a more comprehensive discussion of replication.

How do I determine the sample size for my replication study?

The replication team is responsible for conducting an a priori power analysis to determine the target sample size. You should conduct a power analysis appropriate for your study design and analytical approach, using an alpha level of .05 to achieve 90% power to detect the claim’s focal effect or analytical result. We will provide resources (R scripts, templates) to assist you with calculating your sample size, and we are available to consult and assist with your power analysis if needed. Your target sample size, power analysis, and justification will be reviewed, along with the rest of your methodology and analysis plan, by peer researchers during the ‘Preregistration Review’ process.
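
As a rough illustration only, the sketch below shows what an a priori power analysis could look like for a simple two-group comparison, here using Python's statsmodels (the project itself will provide R scripts and templates). The effect size shown is a hypothetical placeholder, not a value prescribed by the project; in practice it should come from the focal claim in the original study.

    import math
    from statsmodels.stats.power import TTestIndPower

    # Minimal sketch of an a priori power analysis for a two-sample t-test,
    # using the project's alpha = .05 and 90% power targets.
    # The effect size (Cohen's d) below is a hypothetical placeholder; derive it
    # from the focal claim in the original study for a real replication.
    analysis = TTestIndPower()
    n_per_group = analysis.solve_power(
        effect_size=0.30,         # placeholder Cohen's d
        alpha=0.05,               # significance level specified by the project
        power=0.90,               # target power specified by the project
        alternative="two-sided",
    )
    print(f"Required sample size per group: {math.ceil(n_per_group)}")

Designs other than a simple two-group comparison (e.g., regression or repeated-measures analyses) will require a different calculation, which is one reason your power analysis is reviewed during the preregistration process.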