Preregistration: A Plan, Not a Prison

May 23rd, 2017, Alexander DeHaven



Preregistration is the process of specifying key study and analysis details and decisions before conducting the experiment. The main goal of preregistering one's research is to make it easier for readers (and yourself) to distinguish between what you set out to do (confirmation) and what was discovered along the way (exploration). Both are vital to science, but conflating these two types of work can lead to misinterpretation of the context of any claim. Preregistration prevents us from tricking ourselves and allows the argument to have meaning. We recently published an article (preprint) that provides much more detail.

Despite its benefits, preregistration can often seem daunting and binding. Researchers may have questions like, "What happens if I make a mistake in my preregistration? Am I forced to stick to this plan, or can I deviate without nullifying the preregistration?" Fortunately, a preregistration is a document of your pre-specified research plans before seeing the results; it is not a prison sentence. Depending on what stage of the research project you are in, you can either update the preregistration or rely on transparency to give context to any unanticipated decisions. As you'll see, this transparency gives others the information they need to evaluate any changes made to the planned research.


Example 1: Prior to Data Collection

This one is easy. If you want to make a change to your research plan before you’ve begun data collection, you can start a new preregistration in the same OSF project with the updated information. You can either withdraw the first one, or keep it to reference in the second, new preregistration.


Example 2: During Data Collection

This phase gets to some of the core questions about making a preregistration, the biases it is designed to address, and the role of transparency in achieving the overall goal. For the Preregistration Challenge, we permit researchers to create these research plans up to the point of data analysis, which includes calculating summary statistics in a dataset. After that point, there is too much room for bias to creep in. The data used to test the hypothesis would also have informed the design of that test, making confirmatory and exploratory work harder to distinguish.

Given the above rationale, you are free and encouraged to make a new preregistration up to the point of data analysis. We recommend that you preserve the original preregistration, include a link to it in the new preregistration, and specify the changes that you are making and why. Is there still room for bias to creep in? Of course there is. But by maintaining the trail of ideas, you preserve the ability to evaluate that very question of bias. Every preregistration must include a statement that describes the degree to which data exist, thereby allowing for transparency into this part of the research process.

One deviation that may occur during this stage is a smaller than expected sample size. If you don't have full control over your sample size, your preregistration must have a stopping rule that will dictate when to stop collecting data. The purpose of this rule is to avoid continued rounds of data collection in an attempt to find statistically significant results. This repeated measuring and testing dramatically inflates the likelihood of seeing a significant finding and makes the resulting p-values meaningless, especially if they are not all reported. However, being unable to recruit as many subjects as planned is often totally innocuous. This lower than expected sample size may come after exhaustive efforts to recruit and not after peeking at the incoming data, in which case transparently reporting your efforts will allow others to judge the validity of the reported tests.
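To see why a pre-specified stopping rule matters, here is a quick simulation sketch (not from the original post; all function names and parameters are illustrative). It runs studies where the null hypothesis is true by construction, comparing a researcher who tests after every batch of ten observations and stops at the first "significant" result against one who tests once at a fixed sample size:

```python
import random
import statistics
from math import sqrt

random.seed(42)

def peeking_trial(max_n=100, batch=10, crit_z=1.96):
    """One study under the null (true mean 0, known sigma 1), running a
    z-test after every batch and stopping at the first |z| > crit_z."""
    data = []
    while len(data) < max_n:
        data.extend(random.gauss(0, 1) for _ in range(batch))
        z = statistics.fmean(data) * sqrt(len(data))  # mean / (sigma/sqrt(n))
        if abs(z) > crit_z:
            return True  # "significant" -- a false positive by construction
    return False

def fixed_trial(n=100, crit_z=1.96):
    """One study tested a single time at the planned sample size."""
    data = [random.gauss(0, 1) for _ in range(n)]
    return abs(statistics.fmean(data) * sqrt(n)) > crit_z

trials = 4000
fp_peeking = sum(peeking_trial() for _ in range(trials)) / trials
fp_fixed = sum(fixed_trial() for _ in range(trials)) / trials
print(f"False-positive rate, peeking every 10 obs: {fp_peeking:.3f}")
print(f"False-positive rate, single fixed-n test:  {fp_fixed:.3f}")
```

With ten looks at the data, the false-positive rate climbs well above the nominal 5% that the single fixed-sample test maintains, which is exactly the bias a pre-specified stopping rule is meant to prevent.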


Example 3: After Data Analysis Has Begun

After data analysis has begun, it becomes nearly impossible to determine if decisions are justifiable and are outcome-independent, as the outcome is now known. The short answer for making changes or finding “better” analyses after data collection is to simply label any new analyses as what they are: data-dependent, exploratory analyses. Such analyses are an important feature of scientific research. Without a puzzling new finding, discoveries wouldn’t be made. If we don’t value the results of exploratory research, those who have never preregistered an analysis could rightly claim that preregistration will stymie future discoveries by limiting us to only reaffirming what is known. So go ahead, do whatever you want with the data as it comes in, but make sure you conduct and report the originally specified analyses, as those were your strongest predictions before running the experiment.

That being said, there are exceptions to every rule. From time to time, it really would be ineffective to stick to an outdated plan. If the results are uninterpretable because the data do not meet the assumptions of the specified tests, state that. State the plan, the reason for the change, provide a link to the outdated analyses so that the reader can evaluate the assertion, and then move on with the appropriate test. Such situations are difficult to address in a generic way, but the guiding principle for any such decision is transparency. Transparency allows the conversation to be meaningful.

If you are worried about harming the publishability of your work once you have deviated from the original plan, focus on journals that have made a commitment to rewarding transparency, those that have signed the Transparency and Openness Promotion Guidelines, those that issue Open Practice Badges, or those that accept Registered Reports. These practices signal that the journals have a core commitment toward open and reproducible research and so are best suited to evaluating work based on ideal scientific practices.

Another thing to consider when making a change to the preregistration is updating the OSF project that the preregistration is attached to. Though the project is not frozen and timestamped like the registration itself, it does create a log of the changes made. The change should still be addressed in the final manuscript, but it is helpful to have that extra evidence.

In all of the above recommendations, the central theme is to be as transparent as possible. There are likely many other solutions to the problems discussed above. The question one should ask when deciding to make a change is “how can I be as transparent as possible so that my audience will know exactly why I am doing what I am doing?” If you can satisfactorily answer this question, your reviewers and readership should have no problem accepting your changes.

