Seven Reasons to Work Reproducibly

August 6th, 2019, Dalson Figueiredo, Nicole Janz


The following is a guest post from Dalson Figueiredo and Nicole Janz, who co-authored Seven Reasons Why: A User's Guide to Transparency and Reproducibility, the first guide of its kind to outline reproducibility best practices for government institutions and the political science community in Brazil.

Scholarly interest in transparency, reproducibility, and replication has risen sharply in recent years, driven by compelling evidence that many published studies in political science, sociology, economics, medicine, and genetics cannot be replicated. In short, both the natural and the social sciences face a reproducibility crisis. Worse still, there is no agreement on what transparency, reproducibility, and replication mean exactly.

The authors, Nicole Janz of the University of Nottingham, UK and Dalson Figueiredo of Federal University of Pernambuco, Brazil.

With a grant from the British Academy, we set out to introduce transparency norms to the political science community in Brazil – to scholars and journal editors alike. After running several workshops, we realized there was no simple, short user's guide for scholars encountering transparency for the first time. So, together with our Brazilian colleagues Rodrigo Lins, Amanda Domingos, and Lucas Silva, and with our brilliant students, we wrote that guide. Here are seven reasons to work and publish reproducibly.

  1. Replication materials required by journals help to avoid disaster: Data posting deters scientific misconduct and encourages authors to compile and review their files carefully before article submission. Nobody wants to get it wrong, of course, but if your data and code are publicly available, the scholarly community can see for itself that you did your best to promote open science.

  2. Transparency makes it easier to write papers: Creating replication materials on-the-go will force researchers to plan every step of their work carefully, and document their decisions. This will help with writing the paper later on, especially with writing a good methods section.

  3. Replication materials can lead to better peer review: When authors give peer reviewers access to replication materials, they can judge the work much more easily, find potential issues, help improve the paper – or simply see for themselves that the analysis is sound.

  4. Replication materials enable the continuity of academic work: Have you ever tried to pick up an old project and complete it after a long break? Have you ever tried to write a follow-up article based on your (long-ago) PhD? Working transparently helps authors continue projects seamlessly. If you process your data in Excel or with point-and-click software, there is a high probability that you will struggle to repeat the same steps even one week later. Believe us, we have been there more times than we would like to admit.

  5. Replication materials help to build scientific reputation: Journals and authors who work toward transparency of research and publications have much more to gain than to lose: a good reputation. Even if an error occurs, they need not be afraid of accusations – all data and code were available, and errors are human. Withholding data would cause much more suspicion in the community than human errors.

  6. Replication is a powerful tool to learn data analysis: From our teaching experience, we have observed that students are far more motivated when working with real data and arriving at the same figures as a published study than when grinding through dull, repetitive homework assignments unrelated to what they are studying.

  7. Replication materials increase the impact of scholarly work: Most published articles are never cited – but papers that share data are cited significantly more often.
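The scripted workflow behind point 4 can be as simple as keeping the entire analysis in one runnable file, from raw data to reported numbers. Here is a minimal, hypothetical sketch in Python (the same idea applies to an R script or a Stata do-file); the file name, seed, and variables are illustrative only, not from the guide itself:

```python
# analysis.py -- a self-contained, re-runnable analysis script.
# Every step, from data to the final table, lives in one file,
# so the exact same results can be regenerated weeks or years later.
import csv
import random
import statistics

random.seed(42)  # fix the seed so simulated results reproduce exactly

# 1. Create (or load) the raw data. A real project would instead read
#    a versioned CSV committed alongside this script.
data = [random.gauss(100, 15) for _ in range(200)]

# 2. Compute the quantities reported in the paper.
results = {
    "n": len(data),
    "mean": round(statistics.mean(data), 2),
    "sd": round(statistics.stdev(data), 2),
}

# 3. Write results to disk so every number in the manuscript traces
#    back to a script, not to a one-off menu click.
with open("results.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(results.keys())
    writer.writerow(results.values())

print(results)
```

Re-running `python analysis.py` a week (or a decade) later produces the identical `results.csv`, which is exactly what click-and-play workflows cannot guarantee.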

If we’ve persuaded you to give transparent workflows a try (or you’re a journal editor wanting to support your authors), our full article – co-authored with Rodrigo Lins, Amanda Domingos, and Lucas Silva – introduces the most important tools and initiatives to use: the TOP Guidelines for journals, Project TIER's teaching tools, GitHub and R Markdown, the OSF data archive, pre-analysis plans, and more.

View the full Seven Reasons Why: A User's Guide to Transparency and Reproducibility article for in-depth reading and to access the tools and resources compiled by Figueiredo, Janz, and their colleagues.

