Charlottesville, VA — Today, the Center for Open Science launches TOP Factor, an alternative to the journal impact factor (JIF) for evaluating the quality of journals. TOP Factor assesses the degree to which journal policies promote core scholarly norms of transparency and reproducibility. It provides a first step toward evaluating journals based on their quality of process and implementation of scholarly values. This alternative to JIF may reduce the dysfunctional incentive for journals to publish exciting results whatever their credibility.
“Too often, journals are compared using metrics that have nothing to do with their quality,” says Evan Mayo-Wilson, Associate Professor in the Department of Epidemiology and Biostatistics at Indiana University School of Public Health-Bloomington. “The TOP Factor measures something that matters. It compares journals based on whether they require transparency and methods that help reveal the credibility of research findings.”
TOP Factor is based primarily on the Transparency and Openness Promotion (TOP) Guidelines, a framework of eight standards summarizing behaviors that can improve the transparency and reproducibility of research, such as transparency of data, materials, code, and research design; preregistration; and replication. For each of the eight standards, journals can adopt policies of increasing stringency. For the data transparency standard, for example: a score of 0 indicates that the journal policy fails to meet the standard; 1 indicates that the policy requires authors to disclose whether data are publicly accessible; 2 indicates that the policy requires authors to make data publicly accessible unless the data qualify for an exception (e.g., sensitive health data, proprietary data); and 3 indicates that the policy includes both a requirement and a process for verifying that the data correspond to the findings reported in the paper. TOP Factor also includes indicators of whether journals offer Registered Reports, a publishing model that reduces the publication bias of ignoring negative and null results, and badges that acknowledge open research practices and make open behaviors more visible.
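The 0–3 rubric for the data transparency standard can be sketched as a small lookup; this is an illustrative paraphrase of the levels described above, not an official TOP Factor API, and the function and constant names are hypothetical:

```python
# Hypothetical sketch of the TOP data-transparency rubric described above.
# Level descriptions paraphrase the press release; names are illustrative.
DATA_TRANSPARENCY_LEVELS = {
    0: "Policy does not meet the standard",
    1: "Authors must disclose whether data are publicly accessible",
    2: "Authors must share data publicly unless an exception applies",
    3: "Data sharing is required and verified against the reported findings",
}

def describe_data_policy(score: int) -> str:
    """Return the meaning of a data-transparency score (0-3)."""
    if score not in DATA_TRANSPARENCY_LEVELS:
        raise ValueError("TOP standard scores range from 0 to 3")
    return DATA_TRANSPARENCY_LEVELS[score]
```

Each of the other seven standards follows the same 0–3 stringency scale.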
“TOP Factor is not a single number to rank journals,” said David Mellor, Director of Policy at the Center for Open Science. “TOP Factor is a modular set of indicators of journal policies to facilitate the visibility of good research practices.” At the TOP Factor website, users can filter TOP Factor scores by discipline, by publisher, or by subsets of the standards to see how journal policies compare. This customization is important because some practices are emerging, or are more relevant, in some research domains than in others. Users can, for example, see how TOP Factor scores change within a discipline when preregistration standards are included or excluded. “Disciplines are evolving in different ways toward improving rigor and transparency,” noted Brian Nosek, Executive Director of the Center for Open Science. “TOP Factor makes that diversity visible and comparable across research communities. For example, economics journals are at the leading edge of requiring transparency of data and code, whereas psychology journals are among the most assertive in promoting preregistration.”
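The subset comparison described above amounts to summing a journal's scores over a chosen selection of standards. A minimal sketch, assuming made-up journal records and standard names (the real dataset and field names may differ):

```python
# Illustrative sketch of filtering TOP Factor scores by a subset of
# standards. The journal records here are invented for the example.
journals = [
    {"name": "Journal A", "discipline": "psychology",
     "scores": {"data": 2, "code": 1, "preregistration": 3}},
    {"name": "Journal B", "discipline": "economics",
     "scores": {"data": 3, "code": 3, "preregistration": 0}},
]

def top_factor(journal, standards=None):
    """Sum a journal's scores over the selected standards (all by default)."""
    scores = journal["scores"]
    selected = standards or scores.keys()
    return sum(scores[s] for s in selected)

# Compare the same journals with preregistration excluded:
for j in journals:
    print(j["name"], top_factor(j, standards=["data", "code"]))
```

Excluding or including a standard can reorder journals within a discipline, which is exactly the comparison the website's filters expose.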
“TOP Factor is a rating, not a ranking,” said Nosek. “In principle, it is possible for every journal to receive the highest possible rating. Indeed, the primary purpose of releasing the TOP Factor is to provide journals with information about their peers so they can improve their own policies.” Since the release of the TOP Guidelines in 2015, more than 1,100 journals have adopted transparency policies, and all major scholarly publishers have become signatories to the Guidelines. “We have been tracking publishers’ efforts to improve research rigor and transparency at the Center for Open Science, and have been very impressed by how progressive journals and editors have pushed the envelope with their policies. TOP Factor provides a way to give those journals’ efforts credit and visibility,” said Mellor.
TOP Factor complements other efforts to improve research culture and practice. The Declaration on Research Assessment (DORA) states that research should be evaluated on its own merits, not by the journal in which it is published. TOP Factor helps realize the DORA vision. Authors can use TOP Factor to identify journals that have policies aligned with their values and that credit their efforts to be more rigorous and transparent. Funders can use TOP Factor to assess which journals are most likely to support their policy mandates for grantees. And publishers can use TOP Factor to identify journals with progressive policies for inspiration, and to monitor trends in policies by discipline.
Unlike some metrics, the scoring process and data behind TOP Factor are freely available and verifiable. This maximizes the transparency of the basis for TOP Factor and the reusability of the data by any organization that wishes to promote awareness of journal policies. It is already being put to use by FAIRsharing, a registry of data policies that curates and advocates for better data standards. “FAIRsharing is delighted to work with COS on TOP Factor and to display these on our FAIRsharing data policy pages (see e.g. PLOS Publisher data policy),” says Peter McQuilton, FAIRsharing Coordinator. “TOP Factor is a transparent, defensible way to assess the openness and FAIRness of a journal’s policy for sharing data, code and other materials.”
JIF is maligned, in part, because it is common to misinterpret “citation impact” as meaning “research quality,” and because it has been inappropriately used to evaluate individual research articles published in the journal. There are similar risks for TOP Factor. “A journal’s TOP Factor score does not measure the quality of the research published in that journal,” noted Mellor. “TOP Factor measures the extent to which journal policies facilitate the ability to evaluate the quality of the research.” Nosek added, “It is important to remember that research can be completely transparent and terrible at the same time. Policies promoting transparency and reproducibility make it easier to evaluate research quality.” TOP Factor assesses the journal’s policies, not the features of any individual paper.
“If we don't know how the research was conducted, we will have a hard time evaluating how rigorous it is or identifying and correcting flaws in the research,” said Simine Vazire, Professor of Psychology at UC Davis and Editor-in-Chief at the journal Collabra. “Of course, transparency is not the same thing as quality, but it is a necessary precondition for evaluating quality. If a journal does not emphasize transparency, we should not take the research it publishes very seriously, because we are in a terrible position to evaluate its quality.”
Another risk for TOP Factor is known as Goodhart’s Law — as soon as a metric becomes a target, it ceases to be a good metric. TOP Factor is defined in line with scholarly values. The most direct way to achieve a good score is to implement policies for promoting transparency and reproducibility. This is the goal of providing the metric. Nevertheless, bad actors could achieve a good score by having strong policies and not enforcing them. TOP Factor’s requirement for posting policies publicly provides some accountability against this gaming strategy. But to reduce this risk further, future iterations of TOP Factor could incorporate auditing of journals’ adherence to their own policies as part of the scores.
So far, over 250 journal policies have been evaluated and are presented on the TOP Factor website. The initial set of journals is drawn heavily from psychology, economics, education, and general science outlets, reflecting an initial emphasis on fields and journals that have been particularly progressive in adopting policies for transparency and rigor.
“Some general science outlets like Science, Nature, and Cell have impressive transparency policies, particularly considering that they publish research from so many fields with diverse norms. It illustrates how seriously these journals are taking improvement of research quality,” said Mellor. Daniel Simons, Editor-in-Chief of Advances in Methods and Practices in Psychological Science, one of the most progressive journals on transparency policies, said, “Our journal is devoted to communicating advances in research practices to a wide readership. The articles we publish should exemplify best practices for transparency and rigor, and by documenting our implementation of the TOP Guidelines, we can convey those values to our readers and potential authors.”
Journals will be added continuously over time. Editors and community members can complete a journal evaluation form on the TOP Factor website to accelerate the process. Center for Open Science staff review those submissions and check them against the journal’s publicly posted policies before posting scores to the TOP Factor website. Journals that already have a TOP Factor score can update their policies and submit for revised scoring at any time.
The initial release of TOP Factor focuses on policies relating to the transparency of research process and outputs. As TOP Factor improves over time, it could incorporate more features for evaluating the quality of journal processes, such as the transparency of journal operations, operating costs, and editorial and peer review services. Long-term, TOP Factor should increase the visibility and adoption of best practices for how journals can facilitate and accelerate research progress.
About the Center for Open Science
The Center for Open Science (COS) is a non-profit technology and culture change organization founded in 2013 with a mission to increase the openness, integrity, and reproducibility of scientific research. COS pursues this mission by building communities around open science practices, supporting metascience research, and developing and maintaining free, open source software tools. The OSF is a web application that addresses the challenges facing researchers who want to pursue open science practices, including the ability to manage their work, collaborate with others, discover and be discovered, preregister their studies, and make their code, materials, and data openly accessible. Learn more at cos.io and osf.io.
Inquiries: David Mellor email@example.com