Massive Collaboration Testing Reproducibility of Psychology Studies Publishes Findings


Aug. 27, 2015


Today, 270 researchers investigating the reproducibility of psychological science published their findings in Science Magazine. Launched nearly four years ago and coordinated by the Center for Open Science, the Reproducibility Project: Psychology has produced the most comprehensive investigation to date of the rate and predictors of reproducibility in a field of science. The project conducted replications of 100 published findings from three prominent psychology journals. The team found that, regardless of the analytic method or criteria used, fewer than half of the replications produced the same findings as the original study. Mallory Kidwell, one of the project coordinators from the Center for Open Science, concluded, “The results provide suggestive evidence toward the challenges of reproducing research findings, including identifying predictors of reproducibility and practices to improve it.”

Science is distinguished from other ways of gaining knowledge by its reliance on reproducibility to build confidence in ideas and evidence. Reproducibility means that the results recur when the same data are analyzed again, or when new data are collected using the same methods. As noted by Angela Attwood, team member from the University of Bristol, “Scientific evidence does not rely on trusting the authority of the person that made the discovery. Rather, credibility accumulates through independent replication and elaboration of the ideas and evidence.”

The team emphasized that a failure to reproduce does not necessarily mean the original report was incorrect. Elizabeth Gilbert, team member from the University of Virginia, noted, “A replication team must have a complete understanding of the methodology used for the original research, and shifts in the context or conditions of the research could be unrecognized but important for observing the result.” Belen Fernandez-Castilla, team member from Universidad Complutense de Madrid, added, “Scientists investigate things that are not yet understood, and initial observations may not be robust.”

Yet a problem for psychology and other fields is that incentives for scientists are not consistently aligned with reproducibility. “What is good for science and what is good for scientists are not always the same thing. In the present culture, scientists’ key incentive is earning publications of their research, particularly in prestigious outlets,” said Ljiljana Lazarević, team member from the University of Belgrade. Research with new, surprising findings is more likely to be published than research examining when, why, or how existing findings can be reproduced. As a consequence, it is in many scientists’ career interests to pursue innovative research, even at the cost of reproducibility of the findings.

Fewer than half of the original findings were successfully replicated. This held true across multiple different criteria of success. The team noted three basic reasons this might occur: 1) Even though most replication teams worked with the original authors to use the same materials and methods, small differences in when, where, or how the replication was carried out might have influenced the results. 2) The replication might have failed to detect the original result by chance. 3) The original result might have been a false positive. Johanna Cohoon, another project coordinator from the Center for Open Science, concluded, “The findings demonstrate that reproducing original results may be more difficult than is presently assumed, and interventions may be needed to improve reproducibility.” In keeping with the goals of openness and reproducibility, every replication project posted its methods on a public website and later added its raw data and computer code for reproducing the analyses.

Many organizations, funders, journals, and publishers are already working on improving reproducibility. For example, in 2014, the journal Psychological Science, one of the journals included in this study, implemented practices such as badges acknowledging open sharing of materials and data. “Efforts include increasing transparency of original research materials, code, and data so that other teams can more accurately assess, replicate, and extend the original research, and pre-registration of research designs to increase the robustness of the inferences drawn from the statistical analyses applied to research results,” said Denny Borsboom, a team member from the University of Amsterdam who was also involved in the creation of the Transparency and Openness Promotion (TOP) Guidelines published in Science in June.

Since the Reproducibility Project began in 2011, similar projects have emerged in other fields, such as the Reproducibility Project: Cancer Biology. And a discipline of metascience is emerging: scientific research about scientific research. These and the widespread efforts to improve research transparency and reproducibility are indications that, as suggested by team member Susann Fiedler from the Max Planck Institute for Research on Collective Goods, “Science is actively self-examining and self-correcting to maximize the quality and efficiency of the research process in the service of building knowledge for the public good.”


---

The Reproducibility Project: Psychology is funded by the Laura and John Arnold Foundation.

The Center for Open Science is funded by grants and donations from the John Templeton Foundation, the Laura and John Arnold Foundation, and several other foundations and agencies.


More information

  1. Science publication of the results of the Reproducibility Project: Psychology: http://www.sciencemag.org/lookup/doi/10.1126/science.aac4716
  2. Reproducibility Project: Psychology website (https://osf.io/ezcuj/) at the Open Science Framework with all project details, materials, data, and authors
  3. TOP Guidelines website (http://cos.io/top) with link to the June Science publication introducing them and information about the guidelines, signatories (>500 journals already), and related efforts across science disciplines
  4. Center for Open Science (http://cos.io), sponsor of the Reproducibility Project: Psychology and Reproducibility Project: Cancer Biology (https://osf.io/e81xl)
  5. Open Science Framework (https://osf.io): Free, open-source infrastructure for researchers to facilitate sharing of research materials and data

About the Center for Open Science

The Center for Open Science (COS) is a non-profit technology startup founded in 2013 with a mission to increase openness, integrity, and reproducibility of scientific research. COS pursues this mission by building communities around open science practices, supporting metascience research, and developing and maintaining free, open-source software tools. The Open Science Framework (OSF), COS’s flagship product, is a web application that connects and supports the research workflow, enabling scientists to increase the efficiency and effectiveness of their research. Researchers can use the OSF to collaborate, document, archive, share, and register research projects, materials, and data. Learn more at cos.io and osf.io, or follow us on Twitter @OSFramework.


Contact

The Reproducibility Project: Psychology represents the collective efforts of 270 researchers. The following is a selection of contacts for the project to facilitate connection with local or regional researchers who can speak about the project, its findings, and its implications.

Project Organization
Center for Open Science, Executive Director
Brian Nosek, nosek@cos.io
Nosek's Scheduler: Denise Holman, denise@cos.io

Reproducibility Project: Psychology, Project Coordinators
Johanna Cohoon, johanna@cos.io
Mallory Kidwell, mallory@cos.io

Project analysis, R scripts, and data management
Marcel van Assen, M.A.L.M.vanAssen@uvt.nl
Sacha Epskamp, sacha.epskamp@gmail.com
Fred Hasselman, f.hasselman@pwo.ru.nl

Regional Contacts:

Australia
Patrick Goodbourn, patgoodbourn@gmail.com

Austria
Martin Voracek, martin.voracek@univie.ac.at

Belgium
Wolf Vanpaemel, wolf.vanpaemel@ppw.kuleuven.be

Canada
Michael Barnett-Cowan, mbc@uwaterloo.ca
Stanka Fitneva, fitneva@queensu.ca
Sean Mackinnon, mackinnon.sean@gmail.com

Czech Republic
Štěpán Bahník, bahniks@seznam.cz

England
Angela Attwood, Angela.Attwood@bristol.ac.uk
Gavin B. Sullivan, gavin.sullivan@coventry.ac.uk

Germany
Susann Fiedler, susann.fiedler@gmail.com

Hong Kong
Cathy O.-Y. Hung, oycathy@connect.hku.hk

Israel
Samuel Shaki, Samuel_shaki@hotmail.com

Italy
Michelangelo Vianello, michelangelo.vianello@unipd.it

Netherlands
Kai Jonas, K.J.Jonas@uva.nl
Fred Hasselman, f.hasselman@pwo.ru.nl

Scotland
Alissa Melinger, a.melinger@dundee.ac.uk

Serbia
Ljiljana Lazarević, ljiljana.lazarevic@f.bg.ac.rs

Spain
Belen Fernandez-Castilla, bfcastilla@gmail.com
David Santos, david.santos@inv.uam.es

Sweden
Gustav Nilsonne, gustav.nilsonne@ki.se

Uruguay
Alejandro Vasquez Echeverria, avasquez@psico.edu.uy

USA: East
Franziska Plessow, fplessow@bidmc.harvard.edu
Katherine Moore, moorek@arcadia.edu
Russ Clay, russ.clay@csi.cuny.edu

USA: South
Ann Calhoun-Sauls, anncalhounsauls@bac.edu
H. Colleen Sinclair, csinclair@psychology.msstate.edu

USA: Midwest
Chris Chartier, cchartie@ashland.edu
Jeff Galak, jgalak@andrew.cmu.edu

USA: West
Kim Kelso, kakelso@adams.edu
Michael Cohn, michael.cohn@ucsf.edu
Carmel Levitan, levitan@oxy.edu
Leslie Cramblet Alvarez, lcramblet@adams.edu



