Against the replication crisis: New international journal encourages replication studies

Posted: August 1st, 2017 | Filed under: Data Sharing, journals, Projects | Comments Off
Replications are pivotal for the credibility of empirical economics. Only findings that are robust and replicable can be generalized and serve as evidence-based advice for economic policy. But as early as 1983, Edward Leamer stated (p. 37):

Read the rest of this entry »


American Economic Review publishes AEA’s Session Papers on Replication

Posted: May 16th, 2017 | Filed under: Conference | Comments Off

Two weeks ago, the American Economic Review published the ‘Papers and Proceedings’ of the 129th annual meeting of the American Economic Association (AEA), held in January 2017.

At this year’s meeting, one session was dedicated to the topic of ‘Replication in Microeconomics’, while another focused on ‘Replication and Ethics in Economics: Thirty Years after Dewald, Thursby, and Anderson’.

Both sessions featured very interesting and excellent papers.

Below, I list all presentations of these sessions and the corresponding links to the papers (if available): Read the rest of this entry »


German Research Foundation (DFG) publishes Statement on Replicability

Posted: April 26th, 2017 | Filed under: found on the net, German, Opinion | Comments Off

The German Research Foundation (DFG) has recently released a statement on the replicability of research results.

Interestingly (at least to me), the five-page statement starts with a broader outline of what replicable research is NOT.

Of course, replication is a very important method for testing empirical knowledge claims based on experimental and quantitative research in medicine, the natural, life, engineering, social and behavioural sciences, as well as the humanities.

But, according to DFG, there are limitations:

  • Replicability is not a universal criterion for scientific knowledge.
  • Ascertaining the replicability or non-replicability of a scientific result is itself a scientific result. As such, it is not final but subject to methodological scepticism and further investigation.
  • Non-replicability is not a universal proof by falsification.
  • Non-replicability is not a universal indicator of poor science.

‘Well, an unorthodox starting point for a paper on reproducible research’ – those, at least, were my thoughts when I read the first page of the statement. Wouldn’t it be more common to first present the important aspects of reproducible research and to suggest measures to support it, instead of backpedalling at the very beginning of such a statement? Read the rest of this entry »


New Working Paper: “Perceptions and Practices of Replication by Social and Behavioral Scientists”

Posted: April 21st, 2016 | Filed under: Data Sharing, Report | Comments Off

One of our project partners has just released a publication that deals with the replication crisis in economics and the social sciences.

In the abstract, the three authors state:

We live in a time of increasing publication rates and specialization of scientific disciplines. More and more, the research community is facing the challenge of assuring the quality of research and maintaining trust in the scientific enterprise. Replication studies are necessary to detect erroneous research. Thus, the replicability of research is considered a hallmark of good scientific practice and it has lately become a key concern for research communities and science policy makers alike.

In their discussion paper Fecher, Fräßdorf and Wagner analyze perceptions and practices regarding replication studies in the social and behavioral sciences. Their analyses are based on a survey of almost 300 researchers.

Read the rest of this entry »


Contemporary, useful and subject-based: The Replication Network

Posted: October 23rd, 2015 | Filed under: found on the net | Comments Off

Graphic: 3D Social Network by StockMonkeys.com, www.flickr.com (CC BY 3.0)

Today I would like to introduce The Replication Network (TRN) to our readers, a network whose purpose is “to encourage economists and their journals to publish replications.” This is very much in line with the purpose of our own project.

The website of The Replication Network serves as a channel of communication, both to update scholars about the state of replication in economics and to establish a network for sharing information and ideas among economists. It offers information on opportunities to publish replication studies in economics journals and provides lists of publications dealing with the topic of replication in economic research. A list of published replication studies is also available. Read the rest of this entry »


New working paper: “Is Economics Research Replicable?”

Posted: October 7th, 2015 | Filed under: Data Policy, found on the net | 2 Comments

Andrew Chang and Phillip Li, two researchers working at the Board of Governors of the Federal Reserve System and at the Office of the Comptroller of the Currency/U.S. Department of the Treasury, attempt to replicate 67 papers published in 13 well-regarded economics journals (American Economic Journal: Economic Policy, American Economic Journal: Macroeconomics, American Economic Review, American Economic Review: Papers and Proceedings (P&P), Canadian Journal of Economics, Econometrica, Economic Journal, Journal of Applied Econometrics, Journal of Political Economy, Review of Economic Dynamics, Review of Economic Studies, Review of Economics and Statistics, and Quarterly Journal of Economics), using author-provided replication files that include both data and code.

Some journals in the sample of Chang and Li require data and code replication files, while others do not. Aside from 6 papers that use confidential data, the authors obtain data and code replication files for at least 29 of the 35 papers (83%) that are required to provide such files as a condition of publication, compared with 11 of the 26 papers (42%) that are not required to do so. Read the rest of this entry »
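Purely as an illustration of the shares quoted above – the figures themselves come from Chang and Li’s paper, this is not a reimplementation of their study – the arithmetic behind the two percentages can be sketched in a few lines of Python:

```python
# Illustrative arithmetic only: restates the availability rates reported by Chang and Li.
mandatory = (29, 35)    # files obtained / papers subject to a mandatory data-and-code policy
voluntary = (11, 26)    # files obtained / papers without such a requirement

for label, (obtained, total) in [("mandatory policy", mandatory),
                                 ("no mandatory policy", voluntary)]:
    print(f"{label}: {obtained}/{total} = {obtained / total:.0%}")

# Output:
# mandatory policy: 29/35 = 83%
# no mandatory policy: 11/26 = 42%
```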


Biomedical Sciences: Journals unite to foster reproducible research

Posted: November 6th, 2014 | Filed under: found on the net | Comments Off

When it comes to the topic of replication, it is always a good idea to consult the webpages of the journal Nature. Yesterday, for instance, the journal reported that a group of editors representing more than 30 major journals, together with representatives from funding agencies and scientific leaders, discussed principles and guidelines for preclinical biomedical research in June 2014.

The gathering was convened by the US National Institutes of Health, Nature and Science.

The attendees agreed on a common set of principles and guidelines for reporting preclinical research that lists proposed journal policies and author reporting requirements in order to promote transparency and reproducibility. Read the rest of this entry »


EDaWaX: First funding period terminates with evaluation workshop

Posted: November 27th, 2013 | Filed under: EDaWaX, Workshop | 1 Comment

A week ago, our project held its final evaluation workshop. We presented the main results of some of our work packages and also introduced a beta version of our pilot application for the management of publication-related research data in journals.

In preparation for the workshop we invited more than 30 editors of scholarly journals, and almost a dozen scientists from 15 journals accepted our invitation. Read the rest of this entry »


New Nature Special: Challenges in Irreproducible Research

Posted: April 30th, 2013 | Filed under: found on the net | Comments Off

 

Replication and reproducibility are rare.

Nature has published a new special issue on challenges in irreproducible research. The journal addresses the challenges and barriers of reproducible research:

No research paper can ever be considered to be the final word, and the replication and corroboration of research results is key to the scientific process. In studying complex entities, especially animals and human beings, the complexity of the system and of the techniques can all too easily lead to results that seem robust in the lab, and valid to editors and referees of journals, but which do not stand the test of further studies. Nature has published a series of articles about the worrying extent to which research results have been found wanting in this respect. The editors of Nature and the Nature life sciences research journals have also taken substantive steps to put our own houses in order, in improving the transparency and robustness of what we publish. Journals, research laboratories and institutions and funders all have an interest in tackling issues of irreproducibility. We hope that the articles contained in this collection will help.

All articles within this issue are available free of charge. The table of contents is available here.

Graphic: pasukaru76, www.flickr.com


RunMyCode.org – Make research easier to use and replicate

Posted: September 28th, 2012 | Filed under: Projects | 1 Comment

Last week, Patrick, one of our project partners, made me aware of a very interesting website and service for researchers called runmycode.org. The concept of RunMyCode can be viewed as a novel attempt to provide an executable paper solution.

Therefore, I am very happy that Prof. Pérignon, one of the co-founders, has written a short introduction for our blog. If you would like more information about RunMyCode, just visit the website or contact the team. Read the rest of this entry »


Requirements for Data Availability Policies to enable Replications

Posted: June 21st, 2012 | Filed under: Data Policy, EDaWaX | 2 Comments

In our analyses for work package 2 we collected some criteria to evaluate the quality of the data policies we found in our sample.

It was important to identify some core requirements that aim to ensure the replicability of economic research. This was not an easy task, because we had to find some criteria that are suitable for many fields of research in economics.

Therefore, we consulted several research papers and used the recommendations we found in them as a basis for analysing and assessing the suitability of the data availability policies of economics journals in our study.

We’d like to discuss these criteria with our readers. Feel free to submit comments or send me an e-mail.

Read the rest of this entry »


OSF Reproducibility Project tries to replicate the results published in three psychology journals

Posted: April 23rd, 2012 | Filed under: journals, Opinion | 1 Comment

“If you’re a psychologist, the news has to make you a little nervous…” With this statement, Tom Bartlett introduced his article “Is Psychology About to Come Undone?” in the Chronicle of Higher Education.

The source of his fears is the Reproducibility Project – a group of researchers aiming to replicate every study published in 2008 in three journals: Psychological Science, the Journal of Personality and Social Psychology, and the Journal of Experimental Psychology: Learning, Memory, and Cognition.

The project is part of the Open Science Framework (OSF), a group that is interested in increasing the alignment between scientific values and scientific practices. Besides developing tools and infrastructure projects, its stated mission is to “estimate the reproducibility of published psychological science.”

Read the rest of this entry »


Data Availability Policy: American Economic Review

Posted: February 9th, 2012 | Filed under: Data Policy, EDaWaX | 1 Comment

As announced in my previous blog post, I’m starting the presentation of data availability and replication policies with the American Economic Review (AER). The AER is a flagship journal of the economics profession and one of the top-ranked journals in the discipline.
The AER was first published in 1911. Only 7–10 percent of submissions are accepted and subsequently published.

The AER adopted a so-called replication policy in 1986 – despite the fact that studies (for example by Dewald, Thursby and Anderson) had already claimed that a replication policy alone is not enough to promote replicable results.
Under this policy, the Review obliged authors to provide, on request, their datasets and the code for processing the data to other scientists interested in replicating the results.

Replication policies have often failed, even when the corresponding author is willing to support other researchers – and I imagine that this scenario is not very common. After publishing an article, authors usually have little incentive to prepare their data and code for other researchers: it costs time, and the rewards the scientific system offers for sharing data are often marginal.

Read the rest of this entry »


Nature Magazine Special Issue: Data Replication & Reproducibility

Posted: December 22nd, 2011 | Filed under: Data Sharing | Comments Off

Nature magazine has just published a special issue about data replication and reproducibility.

In their introduction, the authors claim that replication is considered the scientific gold standard. To give a broader view of replication, the journal explores some of the issues associated with the replication of results in different scientific disciplines, for example primate cognition and behaviour research, computer science, biology and climate change studies.

Worth reading is the editorial by J. Crocker and L. Cooper, which deals with the fraud of Diederik Stapel and raises the question “what could be done to protect science and the public from fraud in the future?” The authors, both psychologists, answer:

“Greater transparency with data, including depositing data in repositories where they can be accessed by other scientists […], might have sped up detection of this fraud, and it would certainly make researchers more careful about the analyses that they publish.”

Read the rest of this entry »