Psychology's New Normal

July 21st, 2018, Stephen Lindsay




Psychology’s Renaissance is the title of a 2018 Annual Review of Psychology chapter by Leif Nelson, Joseph Simmons, and Uri Simonsohn.  In 2011, that same trio published a game-changing article titled False Positive Psychology in the journal Psychological Science.  That article converged with several other sources of influence (e.g., Geoff Cumming’s “dance of the p values,” Daryl Bem’s 2011 reports of magic, Diederik Stapel’s unmasking as a data fabricator) to inspire a groundswell of support for changes to how psychological science is pursued.  Examples of efforts toward reform include the founding of the Center for Open Science, the Psychonomic Society’s 2012 Statistical Guidelines, the Transparency and Openness Promotion Guidelines published in Science in 2015, and Eric Eich’s 2014 editorial in Psychological Science titled “Business not as usual.”

Warranting the term “renaissance” will require major advances in theory and measurement to make psychology a genuinely cumulative and useful science.  Those are hard problems.  But the good news is that excellent progress is being made on solving easier problems that have often undermined the replicability of scientific findings, smoothing the way to what Simine Vazire (2018) dubbed “the credibility revolution.”  Specific practices include (a) specifying plans for conducting research and analyzing findings before data collection so as to differentiate between planned and post hoc tests of hypotheses (preregistration), (b) increasing sample sizes and measurement reliability to improve power and precision, (c) using double-blind procedures whenever experimenter effects may be an issue, and (d) sharing data, analysis scripts, and research materials with other researchers to enhance transparency and reproducibility.

As one means of encouraging these transparent science practices, the Center for Open Science developed the idea of awarding badges to articles that met certain criteria.  The data badge is awarded if the data needed to reproduce the analyses reported in the article can be directly accessed from a permanent, third-party site by other researchers.  The materials badge is awarded if the materials needed to reproduce the procedure can be directly accessed.  And the preregistration badge is awarded if the researchers show that they had a detailed plan for how they would conduct and analyze the study before they looked at the data.

Eric Eich, my predecessor as Editor in Chief of Psychological Science, made the journal the launch vehicle for such badges, beginning in 2014.  Uptake was gradual but steady, and may have been assisted by a report by Kidwell et al. (2016) presenting evidence that the badges were making a real difference to the likelihood that other researchers could access the data associated with a paper.


In the last year or so, Psychological Science switched from having APS staff adjudicate badges to having the action editor do so.  My hope is that this will raise the standards demanded before a badge is awarded.  We also added, to each article, an Open Practices Statement, in which authors must speak to preregistration and to the availability of data and materials.

This month, July of 2018, the Table of Contents for Psychological Science is like a billboard announcing the new normal.  As you can see, 13 of the 15 “regular” articles received the data badge (nine also received the materials badge, and we had three of the once-rare triple-badgers).  We’re not done yet.  The quality of the preregistrations that earn badges is still very mixed, and we cannot guarantee that other researchers will be able to reproduce the analyses of articles that earn the data badge.  But we are moving in the right direction.  And it is very exciting to hear that other societies and other journals are also taking up badges as a way of making transparency normative.


D. Stephen (Steve) Lindsay (@dstephenlindsay) is Professor of Psychology at the University of Victoria, British Columbia, Canada. He earned his BA in Psychology at Reed College in 1981 and his PhD in Psychology under the supervision of Marcia Johnson at Princeton University in 1987. Steve taught at Williams College from 1987 until 1990 and then spent a year in Larry Jacoby’s lab at McMaster University. He joined the Department of Psychology at the University of Victoria as an Assistant Professor in 1991 and was promoted to Associate Professor in 1994 and to Professor in 1997.

Steve is a cognitive psychologist and his research explores the relationship between memory, current performance, and conscious experience. Steve’s largest contribution to psychological theory is the source-monitoring (SM) framework, which he codeveloped with Marcia Johnson. The SM framework is an account of how the mind/brain infers the sources or origins of its own contents (e.g., attributing some ideas to memory and others to new thinking).

Steve served as Editor in Chief of the Journal of Experimental Psychology: General from 2001 to 2006. He served on the Governing Board, the Publications Committee, and the Ethics Committee of the Psychonomic Society, leading the effort to develop new statistical guidelines for that Society in 2012. In 2015 Steve became Editor in Chief of Psychological Science, the flagship journal of the Association for Psychological Science. In that capacity Steve has introduced a number of requirements and recommendations aimed at enhancing the transparency, rigor, and replicability of research in psychology.

