Journals test the Materials Design Analysis Reporting (MDAR) checklist

October 22nd, 2019, Center for Open Science


This update on the work of the Minimum Standards Working Group is being shared through coordinated posts on member platforms. If you would like more information about the work and progress of the working group, please contact Veronique Kiermer and Sowmya Swaminathan.

We are pleased to share results from a pilot with 13 journals that tested the Materials Design Analysis Reporting (MDAR) checklist, a minimum standards reporting checklist for the life sciences.

The MDAR framework was designed to provide a harmonizing principle for the reporting requirements currently in use at various journals. It is meant to be flexible enough to adapt to different journal policies, and it provides two levels of reporting stringency: a minimum recommendation and a best-practice recommendation. The checklist was designed as an optional instrument to support adoption of this reporting framework. A statement of task can be found here.

We are very grateful to the 13 participating journals and platforms – BMC Microbiology, Ecology & Evolution, eLife, EMBO journals, Epigenetics, F1000R, Molecular Cancer Therapeutics, Microbiology Open, PeerJ, PLOS Biology, PNAS, Science, Scientific Reports – for testing the MDAR checklist.

The pilot had two main goals: first, to understand whether the checklist was accessible and useful for authors and editors in complying with journal policy; and second, to understand whether the elements within the checklist are conveyed clearly enough to help fulfill policy expectations. In total, 211 authors across participating journals tested the checklist and provided their feedback. Participating journal teams screened 289 manuscripts using the checklist, and 89 of these manuscripts were subject to dual assessment by independent reviewers, which allowed us to determine inter-assessor agreement, and thus the clarity of specific items on the checklist.
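The dual assessment described above yields, for each checklist item, paired judgments from two independent reviewers. As an illustration only (the working group does not specify its analysis method here), a common way to quantify inter-assessor agreement is Cohen's kappa, which corrects raw agreement for chance. The ratings below are hypothetical:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: agreement between two assessors, corrected for chance."""
    assert len(ratings_a) == len(ratings_b) and ratings_a
    n = len(ratings_a)
    # Observed proportion of items on which the two assessors agree
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected chance agreement from each assessor's marginal label frequencies
    counts_a, counts_b = Counter(ratings_a), Counter(ratings_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical item-level compliance judgments from two independent reviewers
reviewer_1 = ["yes", "yes", "no", "yes", "no", "yes"]
reviewer_2 = ["yes", "no", "no", "yes", "no", "yes"]
print(round(cohens_kappa(reviewer_1, reviewer_2), 3))  # → 0.667
```

A kappa near 1 indicates that an item's wording elicits consistent judgments; items with low kappa are candidates for clarification, which mirrors how the pilot used agreement to flag unclear checklist language.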

We are encouraged to find that about 80% of authors and editors found the checklist useful to varying degrees, and that the majority of participating editors reported only a small increase in manuscript processing time as a result of using it. While participating authors and editors did not identify major gaps in the requirements the checklist covers, their feedback and the inter-assessor agreement results have given us a better understanding of where the language in the checklist and elaboration document is unclear and needs to be improved.

We are making the draft MDAR Framework, MDAR Checklist, and MDAR Elaboration document, along with the pilot datasets, available here. This work was also recently presented at the NASEM workshop on Enhancing Scientific Reproducibility through Transparent Reporting (slides available here).

We are currently gathering feedback on the MDAR framework, checklist and elaboration document from a broad group of about 40-50 experts on transparency and reproducibility. Based on the feedback from the pilot and the expert input, we anticipate revising all three MDAR outputs by the end of 2019.

On behalf of the “minimal standards” working group:

Karen Chambers (Wiley)
Andy Collings (eLife)
Chris Graf (Wiley)
Veronique Kiermer (Public Library of Science)
David Mellor (Center for Open Science)
Malcolm Macleod (University of Edinburgh)
Sowmya Swaminathan (Nature Research/Springer Nature)
Deborah Sweet (Cell Press/Elsevier)
Valda Vinson (Science/AAAS)
