The Social Sciences Replication Project (SSRP) attempted to replicate 21 high-profile experimental studies in the social sciences published in Nature and Science between 2010 and 2015. These are supporting materials for journalists covering the release of this paper in Nature Human Behaviour in August 2018.


Primary Materials


SSRP Press Releases


Press releases from all sources (available 8/27/18)

Read the Paper


Read the paper, titled Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015.

To contact the original authors of the replicated studies, see this page


Methods and Analyses


This document accompanies the main paper and provides details of the methodology and additional analyses.

SSRP Web Site



The project has a substantial web site that describes the project, the included studies, the people involved, and other useful information.

OSF Project for SSRP



This OSF repository accompanies the publication and contains all data and materials reported in the article as well as supplementary information.

Additional Commentary


The Reproducibility Opportunity - Malcolm R. Macleod

Read the correspondence and commentary regarding SSRP.

Some members of the research team held a press conference call about the project on August 22, 2018. The video recording is available here.


Replication Research Team


Colin F. Camerer
Professor, California Institute of Technology
camerer@hss.caltech.edu

Anna Dreber
Professor, Stockholm School of Economics
anna.dreber@hhs.se

Felix Holzmeister
Graduate Student, University of Innsbruck
felix.holzmeister@uibk.ac.at

Teck-Hua Ho
Professor, University of California, Berkeley
dprhoth@nus.edu.sg

Jürgen Huber
Professor, University of Innsbruck
juergen.huber@uibk.ac.at

Magnus Johannesson
Professor, Stockholm School of Economics
magnus.johannesson@hhs.se

Michael Kirchler
Professor, University of Innsbruck
michael.kirchler@uibk.ac.at

Gideon Nave
Assistant Professor, University of Pennsylvania, Wharton School
gnave@wharton.upenn.edu

Brian Nosek
Professor, University of Virginia, Center for Open Science
nosek@cos.io

Thomas Pfeiffer
Professor, Massey University
pfeiffer.massey@gmail.com

Adam Altmejd
Graduate Student, Stockholm School of Economics
adam@altmejd.se

Nick Buttrick
Graduate Student, University of Virginia
nicholas.buttrick@gmail.com

Taizan Chan
Senior Lecturer, Queensland University of Technology
dprtchan@nus.edu.sg

Yiling Chen
Professor, Harvard University
yiling@seas.harvard.edu

Eskil Forsell
Experimentation Scientist, Spotify
eskil.forsell@gmail.com

Anup Gampa
Graduate Student, University of Virginia
anup.gampa@gmail.com

Emma Heikensten
Graduate Student, Stockholm School of Economics
emma.heikensten@gmail.com

Lily Hummer
Research Coordinator, Center for Open Science

Taisuke Imai
Assistant Professor, LMU Munich
taisuke.imai@econ.lmu.de

Siri Isaksson
Graduate Student, Stockholm School of Economics
siri.dorothea.isaksson@googlemail.com

Dylan Manfredi
Research Assistant, University of Pennsylvania, Wharton School
mandylan@wharton.upenn.edu

Julia Rose
Graduate Student, University of Innsbruck
julia.rose@uibk.ac.at

Eric-Jan Wagenmakers
Professor, University of Amsterdam
ej.wagenmakers@gmail.com

Hang Wu
Research Fellow, National University of Singapore
hang.wu@hit.edu.cn


Additional Materials



Expert Commentary

Here is contact information for experts who can provide independent commentary about the paper:


John Ioannidis
Email  |  Web Site

Malcolm Macleod
Email  |  Web Site

Edward "Ted" Miguel
Email  |  Web Site

Nicole Janz
Email  |  Web Site

Richard Lucas
Email  |  Web Site

Simine Vazire
Email  |  Web Site

Betsy Levy Paluck
Email  |  Web Site

Susan Fiske
Email  |  Web Site

Lisa Feldman Barrett
Email  |  Web Site

Neil Lewis
Email  |  Web Site

Sanjay Srivastava
Email  |  Web Site

Rolf Zwaan
Email  |  Web Site

Bob Reed
Email  |  Web Site

Donald Green
Email  |  Web Site

Macartan Humphreys
Email  |  Web Site

John List
Email  |  Web Site

Fiona Fidler
Email  |  Web Site

Fritz Strack
Email  |  Web Site

Hal Pashler
Email  |  Web Site

Chris Chambers
Email  |  Web Site

Materials and best practices that are part of the ongoing reform of scientific practice


The Transparency and Openness Promotion (TOP) Guidelines provide journals, publishers, and funders with eight policies that they can implement to increase trust and rigor in science. Social-behavioral journals in particular have taken steps to raise expectations about publication practices. TOP includes tiered policies for transparency of data, materials, and analytic code; use of reporting guidelines to ensure that the design and analysis of a study are comprehensively reported; preregistration to increase the discoverability of conducted research, with analysis plans to increase the credibility of conducted tests; and replication studies to incentivize this cornerstone of the scientific process. Thousands of journals have become TOP Signatories, and over 850 have implemented policies compliant with TOP.

Registered Reports: A publishing model to improve rigor and reduce publication bias. More than 120 journals that accept Registered Reports peer review submitted study proposals and evaluate them for possible publication based on the soundness of the proposed methodology and the importance of the research question. High-quality submissions are provisionally accepted; once final results are submitted, a second round of peer review evaluates compliance with the registered protocol.

Badges to acknowledge open practices: Journals can acknowledge authors for making available the data underlying reported results or the materials used to conduct the study, or for preregistering their study protocols. Issuing badges is associated with an increased rate of data sharing and with better availability of purportedly shared data and research materials (Kidwell et al., 2016). Thirty-eight peer-reviewed journals currently issue Open Science Badges, including those from the American Psychological Association and the Association for Psychological Science.

Preregistration: Authors can increase rigor and credibility by specifying in advance how data will be collected and analyzed. A preregistration makes it easier to determine which analyses test a specific hypothesis, as opposed to analyses done post hoc in search of unexpected, preliminary findings. Our recent review article in the Proceedings of the National Academy of Sciences, "The Preregistration Revolution" (Nosek et al., 2018), lays out the value of this process and ways to implement it in most research contexts.

The OSF is an open-source web tool that researchers use to collaborate, share data, register their research plans, and post preprints of their results. Used by over 100,000 researchers, the OSF is built on solid technology and backed by a preservation fund to ensure longevity.
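For journalists or analysts who want to pull project metadata programmatically, the OSF also exposes a public REST API at https://api.osf.io/v2/. Below is a minimal sketch, in Python, of fetching a public project's title and description; the GUID "abcde" is a hypothetical placeholder, not the SSRP project's actual identifier.

```python
# Minimal sketch: fetch public project metadata from the OSF v2 API.
# The GUID "abcde" below is a placeholder; substitute a real OSF project id.
import json
import urllib.request

OSF_API = "https://api.osf.io/v2/nodes/{guid}/"

def fetch_project(guid: str) -> dict:
    """Return the attributes block for a public OSF node."""
    with urllib.request.urlopen(OSF_API.format(guid=guid)) as resp:
        payload = json.load(resp)
    return payload["data"]["attributes"]

if __name__ == "__main__":
    attrs = fetch_project("abcde")  # placeholder GUID
    print(attrs["title"])
    print(attrs.get("description") or "")
```

The API returns JSON:API-formatted responses, so project metadata lives under data.attributes in the payload.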

