A massive, years-long project known as SCORE, involving hundreds of researchers across numerous countries, has uncovered a concerning trend within the social sciences: approximately half of published research findings cannot be independently replicated. The effort scrutinized close to 3,900 articles published from 2009 to 2018 across social and behavioral disciplines—including economics, political science, psychology, and sociology—and found no straightforward way to predict which studies would falter.

The factor most strongly correlated with successful replication was data availability. Only a third of the papers examined had their data and computational code readily accessible, yet those papers showed a markedly better rate of successful reproduction. This points to data accessibility as a crucial, though currently underutilized, element of research reliability.

The Scope of the Problem
The SCORE initiative, a sprawling effort spanning the United States and other countries, examined a broad spectrum of social science fields. Researchers systematically retested previously published results to assess their reproducibility, robustness, and replicability. The findings suggest that the reliability of the scientific literature in these areas is far from assured, mirroring earlier investigations in psychology and biomedical science and raising further questions about the foundations of knowledge in these disciplines.

Data Availability: A Glimmer of Hope?
While the overall replication rate remains low, the SCORE study identified data availability as a key differentiator. Papers that made their underlying data and analytical code publicly accessible demonstrated a markedly higher success rate in replication attempts. This underscores the potential of 'open science' practices to bolster the credibility of research, although the study notes that only one-third of papers in the SCORE sample met this standard.

Broader Implications
The implications of these findings extend to how new research is perceived. Economists involved in replication efforts, like Abel Brodeur, founder of the Institute for Replication at the University of Ottawa, report maintaining a degree of skepticism towards newly published papers. The challenge lies not only in replicating existing findings but also in understanding why replication fails, a complex issue potentially involving variations in research design, participant recruitment, and analytical methods across different studies.
A Historical Context
This broad investigation builds on earlier projects that highlighted problems with reproducibility. In 2018, for instance, a study in Nature Human Behaviour found that only 13 of 21 high-profile social science experiments from top journals could be reproduced. These recurring results signal a persistent struggle to establish a consistently reliable body of evidence within the social sciences.
The Nuances of Replication
It is important to distinguish between 'reproducibility'—achieving the same result using the same data and analytical methods—and 'replicability,' which involves testing the same research question with new data. While SCORE focused on replication, other aspects like robustness checks, which involve re-analyzing existing data with different methods, are also part of the broader effort to assess research integrity. The failure to replicate does not automatically invalidate original findings, as novel analytical approaches can legitimately yield different results. However, it does raise questions about the stability and generalizability of those findings.
Behind the Scenes
The SCORE project, described in publications in Nature and Science, represents a significant, seven-year undertaking. Data and code from the individual replication projects are made available through repositories such as the OSF (Open Science Framework), facilitating transparency, and the study's methodology and results are detailed in the supplementary information accompanying the primary publications.