New Preprint Introduces Major Update to the TOP Guidelines

The Center for Open Science (COS) is pleased to introduce a major update to the Transparency and Openness Promotion (TOP) Guidelines, dubbed TOP 2025. The TOP Guidelines are an influential policy framework (Figure 1) designed to enhance the verifiability of empirical research claims. The preprint introducing TOP 2025 is now available on MetaArXiv. The revisions are the result of an extensive consultation process with the TOP Advisory Board, currently chaired by Sean Grant and co-chaired by Suzanne Stewart.
[Figure 1: TOP Guidelines graphic]

TOP 2025 makes three major classes of updates (see the table below). First, standards are reconfigured into three distinct types: (1) research practices, (2) verification practices, and (3) verification studies. Second, the levels for research practices are streamlined so that they follow the same logical structure for every practice, with Level 1 requiring disclosure, Level 2 sharing and citation, and Level 3 independent certification. Finally, existing research practices have been revised, including integrating the former data citation standard into Level 2, renaming “Design and Analysis Transparency” to “Reporting Transparency,” and separating “Study Protocol” and “Analysis Plan” out from “Study Registration” (formerly “Preregistration”). The full preprint contains details on each of these changes.

Together, the revisions streamline the framework and provide conceptual clarity about research practices and the levels at which those practices are implemented. They also address weaknesses in the framework identified over the last 10 years of implementation. The levels for implementing research practices now progress consistently across standards and specify a clearer action at each level. Incorporating citation into the second level makes clear that every research output is a contribution to the scholarly record and should be recognized as a citable item. It also treats newly created outputs the same as existing ones: regardless of where data or materials reside, readers should be able to tell where they are available and how to access them.

The third level, certification, now specifies that each practice should be conducted according to disciplinary best practices (e.g., datasets must include clear metadata, studies must be registered by a specified time and with sufficient content, code must be well documented) and that fulfillment of these best practices should be checked by a party independent of the authors (e.g., a journal or university).

Finally, TOP 2025 includes Verification Practices and Verification Studies that address specific needs. Computational Reproducibility and Comprehensive Reporting ensure that reported results are reproducible and include all relevant outcomes, while verification studies such as replications, Registered Reports, and multi-analyst or multi-lab studies are specifically promoted.

Research Practices

Study Registration
Level 1 (Disclosed): Authors stated whether or not a study was registered and, if so, where and when it was registered.
Level 2 (Shared and Cited): Authors registered the study (at an appropriate time per disciplinary standards) and cited the registration.
Level 3 (Certified): A party independent from the authors certified that the study was registered at an appropriate time and that the registration was complete per disciplinary best practice.

Study Protocol
Level 1 (Disclosed): Authors stated whether or not the study protocol is available and, if so, where and when it was shared.
Level 2 (Shared and Cited): Authors publicly shared the study protocol (at an appropriate time per disciplinary standards) and cited it.
Level 3 (Certified): A party independent from the authors certified that the study protocol was shared and complete per disciplinary best practice.

Analysis Plan
Level 1 (Disclosed): Authors stated whether or not the analysis plan is available and, if so, where and when it was shared.
Level 2 (Shared and Cited): Authors publicly shared the analysis plan (at an appropriate time per disciplinary standards) and cited it.
Level 3 (Certified): A party independent from the authors certified that the analysis plan was shared and complete per disciplinary best practice.

Materials Transparency
Level 1 (Disclosed): Authors stated whether or not materials are available and, if so, where.
Level 2 (Shared and Cited): Authors have cited materials deposited by themselves or others in a trusted repository.
Level 3 (Certified): A party independent from the authors certified that materials were deposited and documented per disciplinary best practice.

Data Transparency
Level 1 (Disclosed): Authors stated whether or not data are available and, if so, where.
Level 2 (Shared and Cited): Authors have cited data deposited by themselves or others in a trusted repository.
Level 3 (Certified): A party independent from the authors certified that data were deposited with metadata per disciplinary best practice.

Analytic Code Transparency
Level 1 (Disclosed): Authors stated whether or not analytic code is available and, if so, where.
Level 2 (Shared and Cited): Authors have cited analytic code deposited by themselves or others in a trusted repository.
Level 3 (Certified): A party independent from the authors certified that analytic code was deposited and documented per disciplinary best practice.

Reporting Transparency
Level 1 (Disclosed): Authors stated whether or not they used a reporting guideline and, if so, which guideline.
Level 2 (Shared and Cited): Authors publicly shared a completed reporting guideline checklist and cited the reporting guideline.
Level 3 (Certified): A party independent from the authors certified that the authors adhered to the cited reporting guideline.

Verification Practices

Comprehensive Reporting: A party independent from the authors verified that the study registration, protocol, and analysis plan match the final report, and that the final report acknowledges any deviations from these outputs.
Computational Reproducibility: A party independent from the authors verified that reported results reproduce using the shared data and code.

Verification Studies

Replication: A study that aims to provide diagnostic evidence about claims from a prior study by repeating the original study procedures in a new sample.
Registered Report: A registered study in which a study protocol and analysis plan are peer reviewed, and the study is pre-accepted by a publication outlet, before the research is undertaken.
Multiverse: A study that tests the research question of interest across multiple datasets arising from different, reasonable choices for processing and analyzing the same data.
Many-Analyst: A study in which independent analysis teams conduct plausible alternative analyses of a research question on the same dataset.


The authorship team invites feedback from journals, preprint servers, funders, academic institutions, research labs, research publishers, and other users of the TOP framework. The team anticipates submitting the paper to a journal for publication in the coming weeks, but would be pleased to receive feedback either before or after submission via the form below.

Submit Feedback
