Replications in the social sciences: New study confirms ongoing challenges

Posted: February 26th, 2018 | Filed under: Data Policy, Data Sharing

Much has been said on the importance of replications. Recently, Nature published another comment dealing with this question. Paul Gertler, Sebastian Galiani and Mauricio Romero (GGR) conducted a survey focusing on the fields of economics, political science, sociology and psychology. They conclude that ‘the current system makes original authors and replicators antagonists.’

They found that in the top-tier economics journals only a few articles are replications – and all of those refute the original results. That said, GGR also asked 35 editors and co-editors of these economics journals about their attitudes towards publishing replications. While all editors who responded would publish a study that refutes the original findings, only a quarter could imagine publishing a study that confirms the original results.

GGR also reported on their experiences as members of the International Initiative for Impact Evaluation (3ie), a non-governmental organization that funds software-code replication. 3ie’s programme funded 27 studies. Of those completed, a little more than a third (35%) found conflicting results – so the majority of the studies confirmed previous findings. Yet the only replication that made it into a peer-reviewed journal was one that refuted the results of the original paper. Apparently, replications refuting the original findings are of more interest than replications that confirm previous findings. This also results in a bias in the scholarly publishing system and could influence researchers’ attitudes towards replications: those who want their replications published in a peer-reviewed journal might feel an incentive to refute published findings, which, as GGR put it, ‘could lead to overstatement of the magnitude of criticism’.

Journals’ data policies and replications

As we all know, access to data and program code eases replication attempts. Put the other way round, the availability of data and code is a prerequisite for conducting a replication. That’s why GGR also evaluated journal websites for the availability of policies that ask or require authors to make their replication files available (an issue that was also examined by the EDaWaX project two years ago). They found that the mid-tier economics journals and those in sociology and psychology rarely have such policies. By contrast, almost all of the top-tier journals in economics and most of those in political science have policies that require software code and data to be made available to editors before publication (-> chart).

The three authors also tried to conduct replications for 415 articles published in 9 economics journals in May 2016. Of these 415 articles, 212 employed proprietary or confidential data and were therefore excluded from the analyses. In the end, GGR were only able to reproduce a small minority of the remaining 203 papers. Apparently, authors failed to honour the data policies of the journals, and journals did not enforce their policies: only 32% of the 203 papers investigated provided the raw data used in the studies, while the final dataset was available for 60% of the articles. 42% of the papers also provided the code that transforms the raw data into the final dataset, and 72% provided the code used to compute the final tables and figures. Here another problem arose: only 40% of the code ran. In total, GGR were able to reproduce 37% of the final tables and figures, and only 14% of the results of the 203 studies were reproducible starting from the raw data (-> chart).
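To make these percentages more concrete, the following sketch converts the shares reported in the text into approximate paper counts. The percentages are taken from the post; the rounded counts are my own illustrative arithmetic, not figures from GGR.

```python
# Illustrative arithmetic for the reproducibility figures reported above.
# Percentages come from the text; counts are rounded approximations.
total_articles = 415
proprietary_or_confidential = 212
analysable = total_articles - proprietary_or_confidential  # 203 papers

rates = {
    "raw data provided": 0.32,
    "final dataset provided": 0.60,
    "raw-to-final code provided": 0.42,
    "analysis code provided": 0.72,
    "tables/figures reproduced": 0.37,
    "reproducible from raw data": 0.14,
}

for label, share in rates.items():
    count = round(share * analysable)
    print(f"{label}: {share:.0%} (~{count} of {analysable} papers)")
```

The last line is the striking one: a 14% share means only about 28 of 203 eligible papers could be reproduced end-to-end from the raw data.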

Again, this sheds light on the ‘irreproducibility challenge’ in the social sciences. The findings of GGR are in line with many other studies that report similar results (for a list of those studies, please have a look at the related literature page on this website).


GGR also give some recommendations for journals. In their opinion, more journals should take on a professional responsibility to implement data policies. In addition, appointing data editors to help enforce these policies would be useful.

Journals could verify, after conditional acceptance of a manuscript but before publication, that the data and code are included with the submission and that the code is executable and reproduces the reported results.
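A check of this kind could be partly automated. The sketch below is a hypothetical illustration of what a data editor’s pre-publication check might look like; the expected file names and the assumption that scripts are plain Python are mine, not part of GGR’s recommendations or any journal’s actual workflow.

```python
# Hypothetical sketch of a pre-publication replication-package check.
# File names and the run command are assumptions for illustration only.
import pathlib
import subprocess

REQUIRED = ["raw_data.csv", "final_data.csv", "build_data.py", "analysis.py"]

def check_package(package_dir: str) -> list:
    """Return a list of problems found in a replication package."""
    root = pathlib.Path(package_dir)
    # Step 1: are all required data and code files included?
    problems = [f"missing file: {name}" for name in REQUIRED
                if not (root / name).exists()]
    if problems:
        return problems
    # Step 2: does the code actually run?
    for script in ("build_data.py", "analysis.py"):
        result = subprocess.run(["python", script], cwd=root,
                                capture_output=True)
        if result.returncode != 0:
            problems.append(f"{script} failed to run")
    return problems
```

A real workflow would go further – for instance, comparing the regenerated tables against those in the accepted manuscript – but even a minimal completeness-and-execution check would have caught most of the failures GGR describe.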

GGR concede that their recommendations might slow down the time from acceptance to publication for some papers. But even in these cases they see benefits: they argue that authors will eventually internalize the requirements and submit accurate, error-free materials. In the end, this will help to restore the credibility of science and research.


